It’s something we use dozens of times throughout our day, and yet we may not even recognize it. Machine learning is a branch of artificial intelligence that is beginning to bring major changes to how we live and do business today — and we’ve only just begun to tap its true potential.
In 1959, computer scientist Arthur Samuel brought it into the modern era when he coined the phrase “machine learning.” But only recently have we seen anything approaching the prospect of computers behaving the way humans do. While that goal of general AI remains out of reach, machine learning is the area where computer scientists and businesses are investing most of their resources, and where they are seeing the biggest advances. With GPUs making parallel processing faster, cheaper and more powerful, along with the cloud computing revolution, virtually infinite storage and ever-increasing data production, artificial intelligence is steadily working its way into the mainstream as humans start to make sense of it all.
Machine Learning Explained
The term refers to the practice of feeding an algorithm huge amounts of data and letting it identify patterns in that information without being explicitly programmed to do so, then using those patterns to make predictions about the world around it. A number of methods can be used to enable these algorithms to learn and, through predictive modeling, improve over time.
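The core idea of learning a pattern from data, rather than having a human write the rule, can be illustrated with a minimal sketch. In this toy example (illustrative only; the data and the hidden rule y = 2x + 1 are invented for the demo, and real systems would use a library such as scikit-learn), the program is never told the rule; it recovers it from examples by gradient descent:

```python
# Minimal sketch of "learning from data": the program infers the
# pattern behind the examples instead of being programmed with it.

def fit_line(examples, steps=5000, lr=0.01):
    """Learn a slope and intercept from (x, y) pairs by gradient descent."""
    w, b = 0.0, 0.0  # start with no knowledge of the pattern
    n = len(examples)
    for _ in range(steps):
        # Measure how wrong the current guess is on each example...
        grad_w = sum(2 * (w * x + b - y) * x for x, y in examples) / n
        grad_b = sum(2 * (w * x + b - y) for x, y in examples) / n
        # ...and nudge the parameters to reduce that error.
        w -= lr * grad_w
        b -= lr * grad_b
    return w, b

data = [(x, 2 * x + 1) for x in range(10)]  # hidden rule: y = 2x + 1
w, b = fit_line(data)
print(round(w, 2), round(b, 2))  # parameters recovered from data alone
```

The same loop, scaled up to millions of parameters and examples, is the essence of the predictive modeling described above: improvement comes from repeated exposure to data, not from new hand-written rules.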
Stanford University defines this brand of artificial intelligence as “the science of getting computers to act without being explicitly programmed.” Nidhi Chappell, head of AI at Intel, recently explained in WIRED: “The way I think of it is: AI is the science and machine learning is the algorithms that make the machines smarter. So the enabler for AI is machine learning.”
What is it Used for Today?
The goal of AI is to mimic human behavior. In its latest TechRadar report on AI, Forrester says AI differs from traditional technologies “in its ability to sense, think and act while constantly learning.”
One application is self-driving cars. By using cameras, cars can “sense” the world around them. Powerful processors allow them to “think” about what they should do, and automated control systems let driverless cars “act” — all without human intervention. Smartphone digital assistants are another example. They hear our voice (sense); parse what we mean (think); and carry out the task requested (act) without the user having to touch the screen.
Financial services providers are using machine learning to spot trends in data as well as automatically detect fraud. The medical world is using it to identify potential health risks and even suggest treatments. Governments are applying it to the vast troves of data they typically collect to find cost savings and efficiencies.
Finally, retailers are taking advantage of the falling cost of AI technology to gain new insights into customer behavior.
The next advance will come from deep learning, which makes use of neural network algorithms. These systems are particularly well suited to image, video and audio recognition, and as they become more accessible, businesses will increasingly be able to leverage their power.
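The building block of those neural networks is a single artificial neuron. The sketch below trains one such neuron (a perceptron) on a hypothetical task invented for the demo: separating points that lie above the line y = x from those below it. A deep learning system stacks many layers of units like this and is built with a framework such as TensorFlow or PyTorch, but the learning rule is easiest to see in miniature:

```python
# A single artificial neuron (perceptron), the unit that deep neural
# networks stack into many layers. Toy task: classify points as above
# (1) or below (0) the line y = x.

def train_perceptron(points, labels, epochs=20):
    """Learn weights for one neuron with the classic perceptron rule."""
    w = [0.0, 0.0]
    b = 0.0
    for _ in range(epochs):
        for (x1, x2), label in zip(points, labels):
            pred = 1 if w[0] * x1 + w[1] * x2 + b > 0 else 0
            err = label - pred      # 0 when the neuron is already right
            w[0] += err * x1        # nudge weights toward the answer
            w[1] += err * x2
            b += err
    return w, b

points = [(0, 1), (1, 3), (2, 5), (1, 0), (3, 1), (5, 2)]
labels = [1, 1, 1, 0, 0, 0]         # 1 = above y = x, 0 = below
w, b = train_perceptron(points, labels)
preds = [1 if w[0] * x1 + w[1] * x2 + b > 0 else 0 for x1, x2 in points]
print(preds)  # → [1, 1, 1, 0, 0, 0], matching the labels
```

A single neuron can only draw a straight dividing line; recognizing images, video or audio requires many layers of these units, which is what puts the “deep” in deep learning.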
Take, for example, a manufacturing line. By employing a deep neural net and training it to maximize output, this technology can have a huge impact on the bottom line of a business. Organizations can use information generated from the neural net to identify inefficiencies within legacy machines and implement preventative maintenance.
Beyond the manufacturing floor, a U.S. beer company is using basic deep learning systems to create more drinkable beer, and the system can be applied to all types of food and drink. The NBA’s Golden State Warriors and the Cleveland Cavaliers both use a deep learning system to analyze player performance.
Businesses will also be able to combine other artificial intelligence technologies with machine learning to give them unparalleled insight and offer new services. For example, technologies such as iris scanning in combination with predictive modeling can provide automated and secure access to hotel rooms or conferences.
We’re still writing the first chapter of the artificial intelligence revolution, and because new technologies are still evolving, they’re in need of constant human oversight. However, these technologies will come to dominate the 21st century, and will become so pervasive that all companies — no matter their size or industry — need to embrace them or risk being left behind.
Trends that are contributing to the rise of the digital workforce include augmented reality and the Internet of Things.