Blockchain and AI

Blockchain is a new technology that is garnering a lot of press of late. In fact, it is being reported that while 2016 was the proof-of-concept year for blockchain, 2017 will be the year we see blockchain technology implemented in production. So, what is blockchain, why all the attention, and why now?

Blockchain is the technology famously underlying bitcoin, the digital cryptocurrency that most people have heard of. It was invented in 2008 by a person, or a team of people, using the name Satoshi Nakamoto – to this day no one really knows who he, or they, are. But blockchain can do a whole lot more than manage bitcoin. In fact, as a distributed digital ledger, it can record almost any type of transaction imaginable, and therein lies its power.

Its design goal is to speed up and simplify how transactions are recorded. Any type of asset can be transacted on the blockchain in an entirely decentralized system. On the blockchain, trust is established not by powerful intermediaries like banks, governments and technology companies, but through mass collaboration and clever code. Clearing and settlement times are reduced to seconds. The distributed ledger is replicated on thousands of computers around the world and is kept secure by strong cryptographic algorithms. The technology is so disruptive that it may in time change how companies, and even governments, work.
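
To make the idea of a tamper-evident ledger concrete, here is a minimal sketch in Python of how blocks can be chained together with cryptographic hashes. It is purely illustrative and all function and field names are our own; real blockchains add consensus, digital signatures and peer-to-peer replication on top of this basic structure.

    import hashlib
    import json
    import time

    def make_block(transactions, previous_hash):
        """Create a block whose identity depends on its contents and its predecessor."""
        block = {
            "timestamp": time.time(),
            "transactions": transactions,
            "previous_hash": previous_hash,
        }
        payload = json.dumps(block, sort_keys=True).encode()
        block["hash"] = hashlib.sha256(payload).hexdigest()
        return block

    # Build a tiny chain: a genesis block followed by one transfer.
    genesis = make_block([], previous_hash="0" * 64)
    block1 = make_block([{"from": "alice", "to": "bob", "amount": 5}], genesis["hash"])

    # Tampering with an earlier block breaks the chain, because the previous_hash
    # stored in the next block no longer matches the recomputed hash.
    genesis["transactions"].append({"from": "mallory", "to": "mallory", "amount": 100})
    payload = json.dumps({k: genesis[k] for k in ("timestamp", "transactions", "previous_hash")},
                         sort_keys=True).encode()
    print(block1["previous_hash"] == hashlib.sha256(payload).hexdigest())  # False

Because every participant holds a copy of the chain and can perform this check, altering history requires rewriting every subsequent block on most of the network at once, which is what makes the ledger trustworthy without a central intermediary.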

Blockchain provides the opportunity to introduce new products and services, reduce the cost of existing services and significantly reduce transaction times, from days to seconds.

Some are going so far as to say that it may be the technology that launches the next Google or Facebook!

The risks and challenges include security, environmental impact, the cost of equipment, and interoperability. Both the Bitcoin and Ethereum ecosystems have suffered high-profile hacks, although new safeguards were put in place each time to close the vulnerabilities.

There is an impact on the environment due to the energy required to run these large networks. Expensive equipment is required to run blockchain networks - think in terms of data centers. And because there are likely to be many blockchains, interoperability becomes a central issue.

Examples of blockchain applications include, but are not limited to:

Legal agreements (contracts)
Finance
 - Cryptocurrency - peer-to-peer version of electronic cash
 - Other asset classes (bonds, commodities, etc.)
 - Credit Suisse has conducted 10 proofs of concept with blockchain startups to achieve cost reductions
Autonomous vehicles
Transportation infrastructure
Accommodation
 - Hotels, apartments, smart locks
Energy grid

Deep Learning and Neuromorphic Chips

There are three main ingredients to creating artificial intelligence: hardware (compute and memory), software (algorithms), and data. We’ve heard a lot of late about deep learning algorithms that are achieving superhuman level performance in various tasks, but what if we changed the hardware?

Firstly, we can optimize CPUs, which are based on the von Neumann architecture we have been using since the invention of the computer in the 1940s. Optimizations include memory improvements, more processing cores per chip (a GPU of the type found in a cell phone might have almost 200 cores), FPGAs and ASICs.

Such is the case with research being done at MIT and Stanford. At the International Solid-State Circuits Conference in San Francisco earlier this month, MIT researchers presented Eyeriss, a new chip designed specifically to implement neural networks. It is 10 times as efficient as a mobile GPU, so it could enable mobile devices to run AI algorithms locally rather than uploading data to the cloud for processing. Whereas many of the cores in a GPU share a single, large memory bank, each of the Eyeriss cores has its own memory. The Stanford EIE (Efficient Inference Engine) project is another effort to optimize conventional processors for deep learning.

The second method relies not just on performance tweaks to CPU architectures, but on an entirely new architecture, one that is biologically inspired by the brain. This is known as neuromorphic computing, and research labs around the world are currently working on developing this exciting new technology. As opposed to conventional CPUs and GPUs, neuromorphic computing involves neuromorphic processing units (NPUs), spiking neural networks (SNNs), analogue circuits and spike trains, similar to what is found in the biological neural circuitry of the brain.
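
As a rough illustration of what "spiking" means here, the sketch below simulates a single leaky integrate-and-fire neuron, one of the simplest spiking neuron models. The parameter values are arbitrary examples of our own choosing, and real neuromorphic hardware implements this kind of dynamics directly in silicon (often in analogue circuits) rather than in software.

    import numpy as np

    # Leaky integrate-and-fire neuron: the membrane potential leaks toward rest,
    # integrates input current, and emits a spike when it crosses a threshold.
    dt, t_max = 1e-3, 0.5                     # time step and duration (seconds)
    tau, v_rest, v_thresh, v_reset = 0.02, 0.0, 1.0, 0.0
    steps = int(t_max / dt)

    rng = np.random.default_rng(0)
    input_current = 1.2 + 0.5 * rng.standard_normal(steps)   # noisy drive

    v = v_rest
    spike_times = []
    for step in range(steps):
        dv = (-(v - v_rest) + input_current[step]) * (dt / tau)
        v += dv
        if v >= v_thresh:                     # threshold crossed: emit a spike
            spike_times.append(step * dt)
            v = v_reset                       # reset after the spike

    print(f"{len(spike_times)} spikes; first few at {spike_times[:5]}")

Information in such a system is carried by the timing of spikes rather than by continuous activation values, which is one reason spiking networks map so naturally onto low-power, event-driven hardware.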

Neuromorphic chips attempt to model in silicon the massively parallel way the brain processes information, as billions of neurons and trillions of synapses respond to sensory inputs such as visual and auditory stimuli. Those neurons also change how they connect with each other in response to changing images, sounds, and the like. This is the process we call learning, and memories are believed to be held in the trillions of synaptic connections. Companies developing neuromorphic chips include IBM, Qualcomm, Knowm and Numenta. Government-funded research projects include the Human Brain Project (EU), IARPA (US) and Darwin (China). Let's look at each of these now in a little more detail.

IBM Research has been working on developing the TrueNorth chip for a number of years now and is certainly making steady progress. Qualcomm has also been working on the Zeroth NPU for the past several years; it is capable of recognizing gestures, expressions and faces, and of intelligently sensing its own surroundings. Numenta, headed up by Jeff Hawkins, started in 2005 in Silicon Valley and has been making good progress, both theoretical and applied, in emulating the cortical columns found in the brain's neocortex. It has released products based on the NuPIC (Numenta Platform for Intelligent Computing) architecture, which is used to analyze streaming data. These systems learn the time-based patterns in data, predict future values, and detect anomalies. Lastly, founded in 2002, Knowm has an interesting offering based around its patented memristor technology.
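
To show the shape of the streaming anomaly detection problem these systems address, here is a deliberately simple Python sketch that flags points far from a rolling mean. It is emphatically not Numenta's HTM algorithm, which learns temporal patterns rather than simple statistics; the function name and thresholds are our own illustrative choices.

    from collections import deque
    import math

    def detect_anomalies(stream, window=50, threshold=4.0):
        """Yield (index, value) for points more than `threshold` standard
        deviations away from the rolling mean of the recent window."""
        recent = deque(maxlen=window)
        for i, x in enumerate(stream):
            if len(recent) == window:
                mean = sum(recent) / window
                var = sum((v - mean) ** 2 for v in recent) / window
                std = math.sqrt(var) or 1e-9
                if abs(x - mean) / std > threshold:
                    yield i, x
            recent.append(x)

    # Example: a sine wave with one injected spike.
    data = [math.sin(i / 10.0) for i in range(500)]
    data[300] = 8.0
    print(list(detect_anomalies(data)))   # expect the spike at index 300 to be flagged

A production system like NuPIC goes much further, modelling the temporal structure of the stream so that it can flag values that are individually unremarkable but wrong for their context.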

The Human Brain Project, a European-led multibillion-dollar project to simulate a human brain, has incorporated the neuromorphic chip design of Steve Furber's group at the University of Manchester into its research efforts. Their SpiNNaker machine is designed to simulate a billion spiking neurons in hardware, an impressive feat in itself. Once such a system scales up to around 80 billion neurons we will have, in effect, the first artificial human brain, a momentous and historic event. This is predicted to occur around 2025, in line with Ray Kurzweil's prediction in his book "How to Create a Mind".

Darwin is an effort originating out of two universities in China. The successful development of Darwin demonstrates the feasibility of real-time execution of spiking neural networks in resource-constrained embedded systems. Finally, IARPA, a research arm of the US intelligence community, has several ongoing projects involving biologically inspired AI and reverse engineering the brain. One such project is MICrONS, or Machine Intelligence from Cortical Networks, which "seeks to revolutionize machine learning by reverse-engineering the algorithms of the brain." The program is expressly designed as a dialogue between data science and neuroscience, with the goal of advancing theories of neural computation.

Overall, this is a very active area of research, and one we can expect to keep growing in terms of the resources allocated to it, whether that is money spent or the number of scientists and engineers involved in the research and development needed to produce a machine as general-purpose as the brain. A true artificially engineered brain on a chip would bring more intelligence to the enterprise as well as to all aspects of our daily lives.


Peter Morgan - Oct 2016

Further examples of blockchain applications, continuing the list above, include:

Internet of Things
 - The Internet of Everything needs a Ledger of Everything
Supply chain management
Things that haven’t been thought of yet
Blockchain as a Service

Further evidence that blockchain is here to stay is the Blockchain as a Service (BaaS) offerings from both IBM and Microsoft, along with the creation of open source standards organizations such as Hyperledger, R3 and the Enterprise Ethereum Alliance (EEA), each of which has dozens of corporate members signed up.

What happens when we start to merge artificial intelligence and the blockchain into a single, powerful platform?

We have blockchain technology's promise of near-frictionless value exchange and artificial intelligence's ability to accelerate the analysis of massive amounts of data. The joining of the two could mark the beginning of an entirely new paradigm. By employing artificially intelligent agents to govern the chain, we can maximize security while keeping the ledger immutable. With more companies and institutions adopting blockchain-based solutions, and more complex, potentially critical data stored in distributed ledgers, there is a growing need for sophisticated analysis methods, which AI technology can provide.

State Street is doing just this by issuing blockchain-based indices. Data is stored and secured using a blockchain, and AI is used to analyze the data while it remains secure. State Street reports that 64% of wealth and asset managers polled expect their firms to adopt blockchain in the next five years, and 49% of firms say they expect to employ artificial intelligence.

IBM Watson is also merging blockchain with AI via the Watson IoT group. Here, an artificially intelligent blockchain lets multiple parties collectively agree on the state of a device and make decisions about what to do based on rules coded into a smart contract. Using blockchain technology, artificially intelligent software solutions can be deployed autonomously. Risk management and self-diagnosis are other use cases being explored.
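
A toy sketch of that pattern: several parties each report the state of a device, the group agrees on the state by majority vote, and a rule coded in advance (standing in for a smart contract) decides the action. This is purely illustrative, it is not IBM's Watson IoT API, and in a real deployment the reports and the decision would be recorded on a blockchain.

    from collections import Counter

    # Reports from independent parties about the same device.
    reports = {
        "manufacturer": {"device_id": "pump-7", "state": "overheating"},
        "operator":     {"device_id": "pump-7", "state": "overheating"},
        "insurer":      {"device_id": "pump-7", "state": "nominal"},
    }

    # The "smart contract": an agreed-upon rule, coded in advance.
    ACTIONS = {"overheating": "schedule_maintenance", "nominal": "no_action"}

    def agreed_state(reports):
        """Majority vote over the reported states."""
        votes = Counter(r["state"] for r in reports.values())
        state, count = votes.most_common(1)[0]
        if count * 2 <= len(reports):      # no strict majority: no agreement
            return None
        return state

    state = agreed_state(reports)
    print(state, "->", ACTIONS.get(state, "escalate_to_humans"))
    # overheating -> schedule_maintenance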

In conclusion, AI is being coupled with blockchain technologies to analyze data securely and to make predictions. Both startups and corporations are running POCs at the moment. Blockchain with AI is already being offered as a service by major cloud providers, including IBM and Microsoft. In the future, almost every transaction could be running on blockchain technology.

As PC Mag put it in February 2017: "A host of economic, legal, regulatory, and technological hurdles must be scaled before we see widespread adoption of blockchain technology, but first movers are making incredible strides. Within the next handful of years, large swaths of your digital life may begin to run atop a blockchain foundation - and you may not even realize it."


References:
State of Blockchain and Artificial Intelligence in Fintech
State Street is Betting AI Can Help Monetize Blockchain
IBM’s New Watson Centre Merges Blockchain with AI
Blockchain: The Invisible Technology That's Changing the World
Microsoft and IBM Set Sights on the Next Cloud Frontier: Blockchain-as-a-Service 
The Impact of the Blockchain Goes Beyond Financial Services - HBR


Peter Morgan - March 2017

AI Developments


Check out a presentation given by our CEO Peter Morgan on recent developments in AI at the London Deep Learning Lab Meetup.

July 2017


Advancements in AI


The last few years have seen some truly dramatic developments in the field of artificial intelligence (AI). Hardly a week goes by these days without some announcement of a new record being broken by clever artificial intelligence algorithms. So what is AI, and why now? Intelligence is an agent's ability to adapt to and to achieve goals in its environment. Artificial simply means non-biological. However, the lines are starting to blur between biological and non-biological (machine) intelligence. In fact, it has become pretty clear that there is no fundamental difference between the two, and that they are, in fact, fast converging. The important concept seems to be computation.

So why now? There are several reasons: the amount of labelled data now available, a tremendous increase in compute power, and advances in algorithms, the software that processes all this new data. In November 2015, for example, Google stunned the machine learning community by open sourcing its prized framework TensorFlow, which it uses in over one hundred of its internal projects, making this software freely available to the world overnight. This was followed almost instantly by a flurry of companies open sourcing similar frameworks, including Microsoft and Samsung.

For hardware development, the advent of GPUs has been a major factor in enabling the rapid progress of AI. Once used exclusively by the gaming industry to render graphics in real time, GPUs are now used in research and industry for a multitude of applications including virtual reality, self-driving cars, drug discovery and countless other processor-intensive workloads. For example, the Nvidia Tesla K80 GPU has nearly 5,000 cores on a single board. Along with the hardware, data is also increasing exponentially due to the Internet of both people and things. This includes labelled and unlabelled data, and deep learning algorithms can analyse and process both types. The development of specialised hardware such as FPGAs and ASICs (for example, Google's TPU) is also propelling the advancement of AI.

Other events signalling that a deep impact on society is just around the corner include Google DeepMind's AlphaGo artificial intelligence system beating one of the world's top Go players, Lee Sedol. The important point here is not only that the machine won, but how short a time it took this system to go from project inception to beating a world champion: around two years. AlphaGo used general-purpose learning and search algorithms, with the profound implication that machines could quickly develop superhuman capability in many other types of tasks.

Finally, the UK and US governments have become concerned enough to launch their own public inquiries into robotics and artificial intelligence, in effect asking experts from science and industry to come forward with their views and insight into what is happening in the field of AI. The inquiries are asking how jobs, the workplace and wider society will be affected by the rise of robotics and AI, along with the social, ethical and legal issues that may arise alongside the technology. Given the exponentially accelerating nature of AI developments, these inquiries are timely indeed.


Peter Morgan - Oct 2016


The AI Revolution is Here

The market in AI is hot right now. So much so that in the last week alone, we have seen two acquisitions by large companies worth over $600 million in total. Last week, Apple bought Turi (formerly GraphLab), a company that creates and sells data science algorithms, or Data Science as a Service. It is headed by University of Washington professor Carlos Guestrin and was acquired for $200 million. Not only does Apple get the product, it gets the talent as well, at least for a set number of years as stipulated in the contract. Then came Intel's buyout of Nervana, a maker of deep learning chips: ASICs that give a 10x speedup over conventional hardware such as GPUs and CPUs. This is an interesting play by Intel, as it means we will start to see this technology in future Intel products, which will help speed the AI revolution.

Google did a similar thing when it recently announced the TPU, again an ASIC purpose-built to optimize the deep learning algorithms used in almost all of its products, including image classification, search, ads, mobile and spam filters. In fact, Google recently announced that it had been using TPUs to power AlphaGo, the system that beat the world champion Go player, Lee Sedol, four matches to one. Google has also been using DeepMind's algorithms to reduce the energy used for cooling its data centers by 40%, a huge saving given that data centers now consume an ever increasing portion of the world's energy. You can view a recent talk by DeepMind cofounder Demis Hassabis at the MIT-Harvard Center for Brains, Minds and Machines (CBMM) on their quest to uncover the algorithms of artificial general intelligence, or AGI.


Startup            Acquirer      Amount ($million)
Siri               Apple         200
Deepmind           Google        400
SwiftKey           Microsoft     250
Wit.ai             Facebook      -
VocalIQ            Apple         -
Prediction.io      Salesforce    -
Metamind           Salesforce    35
Evi                Amazon        30
Apical             ARM           350
Magic Pony         Twitter       150
Emotient           Apple         -
Turi               Apple         200
Nervana            Intel         400

So what does this tell us? Well, for one thing, the field is certainly heating up, with all these AI acquisitions. You can also read about some of the other AI acquisitions in recent articles by CB Insights here and here.

Based on all this activity, both research and commercial, many have speculated that we are at the beginning of an AI revolution. In fact, it has been labelled the fourth industrial revolution, or the intelligence revolution, by some, following steam, electricity, and the PC.

Check out this superb blog by Klaus Schwab, the Founder and Executive Chairman of the World Economic Forum, where he outlines the impact of these technologies.

With revolution comes inevitable upheaval — social, economic and employment.

Whereas in the past new jobs have always been created to replace those destroyed by new technology, this time will be different. If you are actually replacing human intelligence, there really isn't that much left for us to do in the way of work. This poses its own set of unique challenges. The good thing is that it frees us up to pursue whatever we'd like to do: the arts and sciences, outdoor pursuits, anything creative. But then how can we afford to cover the essentials: food, shelter, communications? Some kind of new economic system will surely be required, perhaps a guaranteed basic income. Such a system has been trialled in many places before and is currently being trialled in several more; in these trials it has generally been found to work well, with people free to create the lives they want. We will probably also see a transition to a resource-based economy, such as The Venus Project proposed by Jacque Fresco.

One thing is for sure: we are going to see a lot of changes over the next twenty years as the AI revolution takes hold, from self-driving cars, to personal assistants, to domestic carers, to AI doctors and lawyers. Both blue- and white-collar workers will be displaced; no one escapes as all jobs are eventually automated.

And of course the change will be exponential, so we will see it only accelerating. Take the purchase of Nervana by Intel, for example, which will bring deep learning technology to the masses, in both the business and consumer markets, over the next few years. The good news is better healthcare, better decision-making, and faster scientific discoveries. The risks are economic and social upheaval if we don't stay on top of these changes and manage them well.


Peter Morgan - Sept 2016

Deep Learning Framework Adoption

The major deep learning frameworks today are the following: 

TensorFlow        www.tensorflow.org
Keras                 https://keras.io
CNTK                 https://www.microsoft.com/en-us/cognitive-toolkit/
MXnet               http://mxnet.io
Caffe2               https://caffe2.ai
Torch                 http://torch.ch
PyTorch             http://pytorch.org
Deeplearning4j  https://deeplearning4j.org
Intel Nervana     http://neon.nervanasys.com

All can be deployed in production on distributed systems, with Deeplearning4j being the only one written natively in Java. Keras is a user-friendly frontend to backends such as TensorFlow and CNTK. TensorFlow is by far the most popular; one measure of adoption is the number of stars a framework has on GitHub. As of today, those numbers are:

TensorFlow     62,199
Keras              17,086
CNTK              11,593
MXnet             10,217
Caffe2             5,076
Torch               7,038
PyTorch           5,647
DL4j                 6,850
Intel Nervana   3,093

Table 1 – GitHub stars by framework

As we can see from the table and from Figure 1 below, TensorFlow is by far the most popular framework.
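
For readers who want to reproduce or refresh these numbers, the sketch below queries the public GitHub REST API for current star counts. The repository paths are our best guesses for each project and may move over time, and unauthenticated requests are rate-limited.

    import requests

    # Framework -> GitHub repository (paths are examples and may change).
    repos = {
        "TensorFlow": "tensorflow/tensorflow",
        "Keras": "keras-team/keras",
        "CNTK": "microsoft/CNTK",
        "MXNet": "apache/incubator-mxnet",
        "PyTorch": "pytorch/pytorch",
    }

    for name, repo in repos.items():
        resp = requests.get("https://api.github.com/repos/" + repo, timeout=10)
        resp.raise_for_status()
        print(f"{name:12s} {resp.json()['stargazers_count']:>8,d}")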

Figure 1 - The Dramatic Rise of TensorFlow


Putting aside the astounding rise of TensorFlow, all of these frameworks are being used extensively within the companies in which they are primarily developed, e.g.:

Amazon     MXnet
Microsoft    CNTK
Google       TensorFlow
Facebook   Torch/PyTorch
Intel            Neon

Keras is a frontend to several of these, and DL4J is used in a number of enterprises (its commercial arm is Skymind, https://skymind.ai), so all of these frameworks are gaining traction in the enterprise. Validation that they are production-ready comes from the fact that they are used within these hyperscale companies, applied at scale to some of the largest data sets in the world.
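
To give a sense of why Keras is considered user-friendly, here is a minimal sketch of defining and compiling a small classifier with the Keras Sequential API; the layer sizes and input shape are arbitrary examples, not taken from any particular project.

    from keras.models import Sequential
    from keras.layers import Dense

    # A small fully connected classifier: 784 inputs -> 10 classes.
    model = Sequential([
        Dense(128, activation="relu", input_shape=(784,)),
        Dense(10, activation="softmax"),
    ])
    model.compile(optimizer="adam",
                  loss="categorical_crossentropy",
                  metrics=["accuracy"])
    model.summary()

    # Training would then be a single call, e.g.:
    # model.fit(x_train, y_train, epochs=5, batch_size=32)

The same few lines run unchanged on whichever backend Keras is configured to use, which is exactly the appeal of a high-level frontend.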

References:

Benchmarking state-of-the-art deep learning software tools, Feb 17, 2017
The Top 10 deep learning frameworks, Packt, May 25, 2017  
Google’s latest platform play is artificial intelligence, and it’s already winning, The Verge May 18, 2017


​Peter Morgan - July 2017


Landscape of Deep Learning Frameworks

Deep learning refers to algorithms based on artificial neural networks (ANNs), which are in turn inspired by biological neural networks such as the human brain. Due to more labeled data, more compute power, better optimization algorithms, and better neural network models and architectures, deep learning has begun to surpass humans when it comes to image recognition and classification.

Work is being done to obtain similar levels of performance in natural language processing and understanding. Deep learning applies to supervised, unsupervised and reinforcement learning. According to Jeff Dean in a recent interview, Google has implemented it in over one hundred of its products and services, including Search and Photos.

In this article we briefly describe some of the more familiar deep learning frameworks, including TensorFlow, Torch and Theano, provide an overview of common benchmarks, and include references so that the interested reader can compare the similarities and differences between them. We will use the terms deep learning and neural networks interchangeably.

TensorFlow is the newly open sourced deep learning library from Google. It is their second-generation system for the implementation and deployment of large-scale machine learning models. Written in C++ with a Python interface, it was born out of research on, and deployment of, machine learning projects across a wide range of Google products and services. Google and the open source community are constantly adding improvements, including the release of a version that runs on distributed clusters.
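
A minimal sketch in the graph-and-session style TensorFlow used at the time of writing: the computation is first described as a graph of tensors and then executed in a session. The toy model and shapes below are our own example, not drawn from any Google project.

    import tensorflow as tf

    # Build the graph: y = x * W + b for a batch of 2-dimensional inputs.
    x = tf.placeholder(tf.float32, shape=[None, 2], name="x")
    W = tf.Variable(tf.random_normal([2, 1]), name="W")
    b = tf.Variable(tf.zeros([1]), name="b")
    y = tf.matmul(x, W) + b

    # Execute the graph in a session.
    with tf.Session() as sess:
        sess.run(tf.global_variables_initializer())
        print(sess.run(y, feed_dict={x: [[1.0, 2.0], [3.0, 4.0]]}))

Separating graph construction from execution is what lets the same model description be placed on CPUs, GPUs or distributed clusters without changing the modelling code.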

Figure 1 - Since its release in November 2015, TensorFlow has become the clear leader among open source deep learning frameworks.

Torch is a neural network library written in Lua with a C/CUDA interface originally developed by a team from the Swiss institute EPFL. At the heart of Torch are popular neural network and optimization libraries which are simple to use, while being flexible in implementing different complex neural network topologies. Finally, Theano is a deep learning library written in python and popular for its ease of use. Using Theano, it is possible to attain speeds rivaling hand-crafted C implementations for problems involving large amounts of data.
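
A short sketch of Theano's symbolic style: the computation (and, here, its gradient) is described symbolically, and Theano compiles it to fast native code. The function itself is a trivial example of our own.

    import theano
    import theano.tensor as T

    # Symbolic graph: y = sum(x**2), plus its gradient with respect to x.
    x = T.dvector("x")
    y = T.sum(x ** 2)
    grad = T.grad(y, x)

    f = theano.function([x], [y, grad])   # compiled to optimized native code
    value, gradient = f([1.0, 2.0, 3.0])
    print(value, gradient)                # 14.0 [2. 4. 6.]

Automatic symbolic differentiation of this kind is what spares users from hand-deriving the gradients needed to train neural networks.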

So what are the various metrics we can use to compare open source software libraries in general, and these deep learning libraries in particular? The most common ones are speed of execution, ease of use, languages used (core and front-end), resources (CPU and memory capacity) needed to run the various algorithms, GPU support, the size of the active community of users, contributors and committers, platforms supported (e.g., OS, single devices and/or distributed systems), algorithmic support, and the number of packages in their libraries. Various benchmarks and comparisons are available here, here and here.

Of course, these libraries are not static, but rather dynamic, living repositories, constantly evolving as the user base adds to and modifies them. As Google report in their white paper, published to support the release of TensorFlow, “We will continue to use TensorFlow to develop new and interesting machine learning models for artificial intelligence, and in the course of doing this, we may discover ways in which we will need to extend the basic TensorFlow system. The open source community may also come up with new and interesting directions for the TensorFlow implementation.”

Over the course of 2015/2016, Google, IBM, Samsung, Microsoft, Nervana, Baidu and others all open sourced their machine learning frameworks. Suffice it to say, there are many open source deep learning libraries out there for people to use. Frameworks familiar to researchers and developers in this space include Caffe, CuDNN, Deeplearning4J, CNTK and MXnet. In fact, there is such a plethora of machine learning libraries that many are beginning to ask how we should decide which ones to use, and whether we should start thinking about combining them to reduce confusion and add efficiency. This is an interesting space to be working in right now, with refinements being added seemingly every day, providing a constantly evolving landscape.

It is certainly an interesting journey we are on, as these developments help bring us towards the holy grail of artificial general intelligence: intelligence that can truly multitask, just as biological intelligence can. I sometimes can't help but step back and watch in awe and wonder as the field of artificial intelligence unfolds, and contemplate the ramifications that go along with this progress.


Peter Morgan - August 2016


Update July 2017: New frameworks that have appeared in the interim include PyTorch, Caffe2 and Keras. TensorFlow is also being used extensively within Google as the following graph shows.

Figure 2 - TensorFlow use within Google