IBM announced last week that it has moved its cognitive computing system into the cloud as the Watson Discovery Advisor, giving researchers, academics, and anyone else trying to leverage big data the ability to test programs and hypotheses at speeds never before seen.
Since Watson is built to understand the nuance of natural language, this new service allows researchers to process millions of data points that would normally be impossible for humans to handle. This can reduce project timelines from years to weeks or days.
The ability to understand natural language queries is a big deal. You can ask, for example: “I’m going to be in Boston. I like basketball. What do you suggest, Watson?” You might get several answers: Celtics tickets, Boston College tickets, Harvard tickets. Or in the offseason, Watson may suggest you drive to the Basketball Hall of Fame in Springfield (MA). Companies are already using Watson this way. Fluid, Inc.’s Watson-based retail solutions deliver granular results to queries such as “I am taking my wife and three children camping in upstate New York in October and I need a tent.”

Consider this: Watson has been taught to pass the medical boards. Would you trust it to diagnose you and prescribe medication? What if you claim to be in pain (e.g., back pain, migraines, depression) and Watson doesn’t believe your subjective input? Here’s more food for thought: What if Watson could learn to code? Why not? It’s hardly heretical to suggest that as Watson works with developers, it will one day be able to generate solutions based on a natural language query. That’s equally exciting and worrisome.

Now if you want to poke a little fun at Watson, read this Steve Lohr piece in The New York Times (2013) about Watson in the kitchen. Just skim it — the kicker is at the end.
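Since we’re already imagining Watson working with developers, here is a minimal sketch of what a natural language query to a Watson-style service might look like from the developer’s side. The endpoint URL, payload fields, and response shape below are purely illustrative assumptions, not IBM’s published Watson Discovery Advisor API.

```python
import requests

def ask(question, max_answers=3):
    """Send a natural language question to a hypothetical Watson-style endpoint
    and return its ranked candidate answers with confidence scores."""
    resp = requests.post(
        "https://example.com/watson/query",            # hypothetical endpoint
        json={"question": question, "max_answers": max_answers},
        timeout=10,
    )
    resp.raise_for_status()
    # Assumed response shape: {"answers": [{"text": ..., "confidence": ...}, ...]}
    return [(a["text"], a["confidence"]) for a in resp.json()["answers"]]

for text, confidence in ask("I'm going to be in Boston. I like basketball. What do you suggest?"):
    print(f"{confidence:.2f}  {text}")
```

The point isn’t the plumbing: the input is a plain-English sentence and the output is a ranked list of suggestions, exactly the Celtics-or-Hall-of-Fame scenario above.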
Alongside tents and drinking water, RAF planes dropped more than 1,000 solar-powered lanterns, each fitted with chargers for all types of mobile handsets, to the stranded members of the Yazidi religious community below.
It is the first time the lanterns have been airdropped in such a relief effort, but humanitarian workers say it is part of growing efforts to develop technology designed to make a difference in disaster zones.
Imagine a solar-powered lantern of the kind you might take camping, acting as a power source with an umbilical cord that connects to myriad types of phones. The inability to communicate during crisis situations is debilitating, and becomes more so within days (see below).
In a separate project, Dr Paul Gardner-Stephen of Australia invented a “mesh network” that lets people in emergencies communicate via mobile even if they have no Internet connection. Users can send text messages, make calls and send files to other users nearby, creating a mobile network through a web of users. Why is this so important during times of crisis such as war zones or earthquakes? Gardner-Stephen states:
You typically have about three days to restore communications before the bad people realize the good people aren’t in control any more.
He adds succinctly, throwing down a gauntlet:
There’s plenty of technology for rich white men. It’s the rest of the world that we need to help.
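The relaying idea behind Gardner-Stephen’s mesh is simple enough to sketch. What follows is my own toy illustration of store-and-forward flooding between nearby handsets, not the Serval Project’s actual code: each node passes along any message it hasn’t seen before until the message reaches its recipient.

```python
class Node:
    """A handset in an ad hoc mesh: no tower, no Internet, just nearby peers."""

    def __init__(self, name):
        self.name = name
        self.neighbors = []   # handsets currently within radio range
        self.seen = set()     # message ids already handled, so floods terminate
        self.inbox = []

    def receive(self, msg_id, sender, recipient, text):
        if msg_id in self.seen:
            return                       # already relayed once; drop the duplicate
        self.seen.add(msg_id)
        if self.name == recipient:
            self.inbox.append((sender, text))
        else:
            for peer in self.neighbors:  # not for us: relay to everyone in range
                peer.receive(msg_id, sender, recipient, text)

# A and C are out of range of each other, but both can reach B.
a, b, c = Node("A"), Node("B"), Node("C")
a.neighbors, b.neighbors, c.neighbors = [b], [a, c], [b]

a.receive("msg-1", "A", "C", "Meet at the water drop at dawn")
print(c.inbox)   # [('A', 'Meet at the water drop at dawn')] -- relayed through B
```

Every handset doubles as a relay, which is why such a network tends to become more useful, not less, as more people join it.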
As he introduces us to the Sunlite solar-powered lantern, Lane provides a welcome reminder not only of the wonders of technology being used in developing countries, but also of the need for even more innovation and wider distribution of technology and knowledge worldwide.
Death by distance. Roy Smythe, a Forbes contributor, weighs the promise and the perils of healthcare delivered from a distance.
Smythe jumps right into the question posed above. He begins by citing Hannah Arendt and referencing Stanley Milgram in support of his proposition that we can become desensitized to death. That’s not new, and Smythe makes clear that he’s not interested in that problem here. What’s interesting is Smythe’s corollary argument that the distance between healthcare providers and patients has become so great that healthcare delivery is at a “decisive turning point in history that separate[s] whole eras from each other,” to quote Arendt.
Myriad technologies create distance between patient and caregiver, all of them meant to make healing the sick more efficient. Smythe reminds us of telemedicine platforms and other forms of “virtual visits” or self-care tools. Such care will be the norm much more quickly than most would like. He cites Dr. Rushika Fernandopulle, the co-founder and CEO of Iora Health, for the position that medical care is still fundamentally human. Fernandopulle writes:
The thing that heals people is relationships – the problem is that technology has the ability to actually facilitate relationships, but it can also get in the way of them.
Above all, Smythe doesn’t want distance medicine to leave doctors desensitized to death. He draws an interesting parallel: the use of drones in war. Without boots on the ground or vivid, live battlefield images, death can become abstract and sanitized. Navigating a drone to a drop site is relatively easy, and we should all emphasize relatively, in terms of seeing and feeling the results of war. By contrast, tossing a grenade over a wall, driving over an IED, engaging in close-quarters combat, and other critical military missions could hardly bring one closer to one’s enemy and to the realities of death.
Climbing out of this analogy and back into the world of medicine can be difficult. When we do, however, we find that “distance medicine” at first seems innocuous by comparison, and then reveals itself to be every bit as dangerous.
Rick Delgado at Smart Data Collective contributed insights about potential hurdles for the Internet of Things.
Two ideas crossed my mind while reading this piece. First, Delgado makes the obvious-but-equally-important point that taking advantage of the wealth of the Internet of Things requires something we take for granted: access to the Internet. I’m not going to belabor a rural electrification analogy, but many people still lack Internet connectivity, including in the developed world and the United States. It gets worse: ignorance abounds. Delgado writes:
While businesses may talk excitedly about the Internet of Things, consumers are largely unaware of it. In a recent survey of 2,000 people, 87% of consumers said they had never even heard of the IoT. While hearing about the Internet of Things doesn’t necessarily signify a consumer would not use an item connected to the IoT, the survey results show a lack of awareness and understanding about what can be gained from it. If this lack of knowledge about the IoT leads to lack of interest, a major driving force for widespread adoption will be missing.
In one of the worst tech predictions of all time, IBM President Thomas Watson stated in 1943: “I think there is a world market for maybe five computers.” Talk about punching the possibility of disruptive innovation at IBM squarely in the mouth. Watson was misguided and incorrect, but hardly dumb. Whether or not we wish to believe it, Mr. Watson, I suggest, knew far more about his industry at the time than today’s experts know about the Internet of Things, which is in its infancy but growing fast. According to Gartner, there will be some 25 billion sensors in the world by 2020.

It’s not surprising that a whopping 87% of consumers are unaware of the billions of sensors around the world. What would (I would hope) be surprising is if we don’t follow in Google’s footsteps and widen Internet connectivity worldwide. That would be a Tragedy of the Commons with a mean twist: we’re not depleting a resource. On the contrary, it grows daily because we feed it. Simply failing to share it precludes a global race to the top of technology, which I’ll restrict here, for the sake of argument, to non-military uses. Now that’s a race we should all want to enter.
Tracey Wallace over at the Umbel blog (Truth in Data) writes about data-driven cities and the Internet of Things.
Wallace describes how cities are turning themselves into data treasure troves by putting new technologies to work. Let’s look at a few:
- Turning old phone booths into Wi-Fi hot spots (NYC);
- Sucking all household waste directly from individual kitchens through a vast underground network of tunnels to waste-processing centers, where it is automatically sorted, deodorized, and treated (Songdo, South Korea);
- Running city services such as water meters, leak sensors, and parking meters on the same secure government network of Wi-Fi hot spots (Dallas); and
- Eliminating light switches and water taps entirely; movement sensors control lighting and water, cutting electricity and water consumption by 51% and 55%, respectively (Masdar, UAE).
These initiatives are amazing. Think about what Masdar is doing. It’s like an automatic, energy-saving Clapper (“clap on, clap off”). Consider the savings, and what it would mean for energy consumption if such a program were implemented, to the extent possible, around the world. Wow. Someone is surely building an enterprise around this as we speak. So . . . which of you will be the first to sit on a bench at the edge of a park and use the old phone booth across the street as your hot spot? That’s pretty cool.
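To make the Masdar example concrete, here is a toy sketch of sensor-driven lighting control. It is my own illustration under assumed parameters (the five-minute idle timeout is invented), not Masdar’s actual building system.

```python
from datetime import datetime, timedelta

IDLE_TIMEOUT = timedelta(minutes=5)   # assumed cutoff, not Masdar's real figure

class RoomLighting:
    """Lights stay on only while motion has been detected recently."""

    def __init__(self):
        self.last_motion = None
        self.lights_on = False

    def motion_detected(self, now):
        self.last_motion = now
        self.lights_on = True

    def tick(self, now):
        # Called periodically by the building controller.
        if self.lights_on and now - self.last_motion > IDLE_TIMEOUT:
            self.lights_on = False    # nobody around: stop drawing power

room = RoomLighting()
t0 = datetime(2014, 9, 1, 9, 0)
room.motion_detected(t0)                # someone walks in; lights come on
room.tick(t0 + timedelta(minutes=6))    # six quiet minutes later...
print(room.lights_on)                   # False -- the room shut itself off
```

Multiply that loop across every room, corridor, and tap in a city and the 51% and 55% figures above start to look less like magic and more like arithmetic.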
Richard Boire at the Smart Data Collective poses the following question: “The Demise of the Data Scientist: Heresy or Fact?” The CEO of Williams-Sonoma certainly has an opinion.
Boire comments on an article by an “IT leader of a well-respected U.S. organization” whom he doesn’t name. Boire writes of this apparition:
[The author] hypothesized that data scientists will in the future become like switchboard operators: obsolete. The primary reason for this declining demand according to the author was that increased automation and operationalization of business processes will not require the technical skills of the data scientist.
Boire takes the contrary position:
With Big Data and big data analytics, the need for analytics and more customized type solutions is experiencing exponential growth. Methods and approaches in employing analytics need to be quicker and more flexible which require IT support for more operationalization and automation. This does not replace the data scientist.
We can leave the automation debate primarily to the Quants. But I do think they ignore the fact that data science is also an inherently human endeavor. Thomas Davenport, for example, argues that both creativity and instinct are essential to interpreting data. That is especially true when an executive’s intuition betrays a lack of data-science understanding. He writes in Keeping Up with the Quants: “The goal, then, is to make analytical decisions while preserving the role of the executive’s gut.” That battle-tested gut can be critical to evaluating even a data-driven initiative.

There’s more great, related content: the September issue of Harvard Business Review has an article by Laura Alber, CEO of Williams-Sonoma for the past four years. (The article is gated.) She describes the creativity found in Williams-Sonoma’s headquarters in San Francisco, as well as the “data analysts crunching numbers, building models, and analyzing reports.” She continues:
If Williams-Sonoma has a “secret sauce,” it is these teams working together in remarkable alignment to develop and execute our strategy and tactical priorities. In my 19 years at the company and four as CEO, I’ve found that the very best solutions arise from a willingness to blend art with science, ideas with data, and instinct with analysis.