The Internet of Me: It’s All About My Screens
Cardiff University, 23 February 2016
On Tuesday evening, Cardiff University welcomed Robert Schukai, the Head of Applied Innovation for Thomson Reuters, to give the annual Institution of Engineering and Technology Turing Lecture. Schukai, a charismatic and entertaining speaker, spoke about how, through Cognitive Computing, Artificial Intelligence and Natural Language Processing, the future of data is ‘all about you’.
Schukai began his talk with three incredible statistics: firstly, by the end of the decade there will be 9 billion mobile phones in use across the globe; secondly, 40–50% of network traffic is currently taken up by people watching video; and thirdly, by 2020 that share will have increased to 70% of all traffic, or, in data terms, 50 exabytes per month. According to Schukai, the success of the Apple iPhone, in the USA at least, is not because the iPhone was the first smartphone on the market (it was not), but because when it was launched AT&T charged $30 per month for an unlimited monthly data plan. This meant that iPhone users were no longer worried about how much data they were using and could happily download and watch as much content as they liked. For Schukai, data is ‘mobile broadband fuel’, and we always want more of it. In short, then, there is rather a lot of data being created and consumed, and it is all contributing to what has become known as ‘big data’.
And here is one more remarkable statistic: by the end of the decade there will be 28 billion devices, globally, with an internet protocol address. These devices, alongside the more ‘traditional’ computers, tablets and smartphones that we use today, make up what has become known as the ‘internet of things’: a fridge, for example, may be fitted with an internet connection, or the lights in our homes may become part of the network, pulling from the cloud information about how we like our living room to be lit when we arrive home from work. Yet even the data produced by the ‘internet of things’, big as it is, pales in comparison to the amount of data that will be generated when we, as human beings, begin to have our entire genomes mapped.
How can we even begin to make sense of such vastness? Well, the answer lies in ‘Cognitive Computing’: you train a machine to learn. According to Jon Iwata, a Senior Vice President at IBM, computers are baffled by natural language, popular culture, allegories and puns. So Iwata and his team at IBM created Watson, a system capable of answering questions posed in natural language, with the ultimate goal of competing on the game show Jeopardy! against former winners. Famously, in 2011, Watson did just that, won, and received $1 million in prize money. Watson is not programmed but is a system that learns. As Iwata observes, an iPad can only do what a software engineer has programmed it to do. Watson, however, improves itself through learning, by being trained by humans. And it is systems like Watson that are going to help us understand all the big data being produced.
There are four related ideas that are needed for Cognitive Computing to be successful, Schukai says: 1) machines have to learn; 2) machines have to think; 3) machines have to interact with humans; and 4) machines have to interact with other machines. How do you train a machine? Through taxonomy: the specific language used in certain fields and discourses. However, the learning is only as good as the training the humans provide. ‘Garbage in, garbage out’, Schukai affirms, and, significantly, learning does not equal understanding. Google, for example, does not know why Bob is searching for Arsenal FC; the information is completely contextless. Bob could be a fan, of course, but he could also be a Tottenham fan digging around for information about his club’s rivals. Artificial Intelligence then has to step in, and Google is already, once again, ahead of the game here. Google’s Deep Learning Artificial Intelligence engine is built on neural networks and tries to replicate, in binary code, how the brain works. Recently the system beat a human professional at the game of Go, a decade earlier than experts thought possible.
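To make Schukai’s point about training a little more concrete, here is a minimal sketch of taxonomy-style supervised learning. It is my own illustration rather than anything shown in the talk, and the queries, labels and use of scikit-learn are assumptions: a classifier only learns from the labelled examples a human gives it, so poor labels mean poor predictions, and even a correct label says nothing about why the query was typed.

```python
# A minimal sketch (not from the talk) of taxonomy-style supervised training:
# the model only "knows" the labels and examples a human provides, so
# "garbage in, garbage out" applies, and learning is not understanding.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline

# Hypothetical training data: short queries labelled with a tiny taxonomy.
queries = [
    "arsenal fc fixtures tonight",
    "premier league table arsenal",
    "mortgage interest rates uk",
    "best savings account rates",
]
labels = ["football", "football", "finance", "finance"]

model = make_pipeline(CountVectorizer(), MultinomialNB())
model.fit(queries, labels)

# The model labels a new query correctly, but it has no idea whether the
# person searching is an Arsenal fan or a rival fan doing some research.
print(model.predict(["arsenal transfer news"]))  # -> ['football']
```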
We are already contributing to machine learning when we use the personal assistants Siri or Cortana on our smartphones. The problem with both systems, however, is that you have to be very prescriptive in your language; asking Siri something is, oftentimes, not very natural at all. Schukai then begins to use the Amazon Echo, a remarkable gadget yet to be released in the United Kingdom, with which you can ‘mix it up a bit’: the Echo, or Alexa, as Schukai calls ‘her’, understands slang, niceties, colloquialisms and pleasantries. Schukai asks Alexa, as he walks casually around the room, to play 6 Music. Within seconds, Alexa has ‘woken up’ and found the station, and music begins to play. What marks out the Echo from other devices that do similar things is the naturalness of the language you can use to get it to find information for you. It appears to be a highly intuitive and impressive gadget; according to Schukai, it is ‘quite the toy’.
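As a rough illustration of the difference Schukai is describing, the sketch below contrasts a rigid, prescriptive command grammar with a looser keyword-based matcher that tolerates pleasantries and slang. It is a toy of my own, not how Siri or the Echo actually work, and every phrase and intent name in it is invented for the example.

```python
# Toy contrast between a prescriptive command grammar and a looser matcher
# that copes with filler words, politeness and slang (hypothetical phrases).
import re

RIGID_COMMANDS = {"play bbc radio 6 music": "PLAY_STATION:6music"}

def rigid_parse(utterance: str):
    # Only an exact, prescribed phrase is understood.
    return RIGID_COMMANDS.get(utterance.lower().strip())

def loose_parse(utterance: str):
    # Normalise the text and look for the intent and the entity anywhere in it.
    text = re.sub(r"[^a-z0-9 ]", " ", utterance.lower())
    if any(w in text for w in ("play", "put on", "stick on")) and "6 music" in text:
        return "PLAY_STATION:6music"
    return None

request = "Alexa, could you stick on 6 Music for me, please?"
print(rigid_parse(request))  # None: the rigid grammar does not recognise it
print(loose_parse(request))  # PLAY_STATION:6music
```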
Cognitive Computing is, Schukai says, the most important technology out there. It is going to fundamentally change the way we interact with Big Data. Machine Learning, Artificial Intelligence and Natural Language Processing will sift through the cloud and make things happen for us. It is going to become less about search: these new devices will intuitively recognise our current needs and proactively push data to us without our asking for it. It will become more conversational and more natural. It will become ‘The Oracle of You’, a smart agent that knows you, assists you and learns from you. The Internet of Me, Schukai says at the end of his talk, is the future of data, and that future of data is all about you.
And, if you would like to contribute to the share of internet traffic devoted to video, you can, appropriately enough, watch Schukai’s talk right here.
— Michael Goodman
GoodmanMJ@cardiff.ac.uk