Prince Joseph Erneszer Javier, Data Scientist, Mynt
Artificial intelligence (AI) is the ability of computers or machines to mimic aspects of human intelligence.
Recent advances in AI, such as machine learning and deep learning, together with enabling technologies such as cloud computing, edge computing, and AI chips, have been transforming businesses and government organizations.
Machine learning enables computers to make predictions by learning patterns in data. Instead of explicitly coding hard rules or if-else statements, an algorithm is trained by being shown many examples. Deep learning, a powerful kind of machine learning inspired by the human brain, is gaining traction thanks to growing volumes of data and faster computers. Deep learning powers self-driving cars, OpenAI's Dota-playing bots, IBM's Jeopardy-playing Watson, and YouTube's auto-captions.
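To make "learning from examples" concrete, here is a minimal sketch (not from any project described here) of a single perceptron in plain Python. The features and labels are invented for illustration; the point is that the decision rule emerges from weight updates rather than from hand-written if-else statements.

```python
# A minimal sketch of "learning from examples" instead of hard-coding rules.
# Hypothetical scenario: classify whether hourly demand is "high" (1) or
# "low" (0) from two made-up features, using a single perceptron.

# Training examples: (feature1, feature2) -> label. Illustrative numbers only.
examples = [
    ((0.9, 0.8), 1),
    ((0.8, 0.9), 1),
    ((0.2, 0.1), 0),
    ((0.1, 0.3), 0),
]

weights = [0.0, 0.0]
bias = 0.0
learning_rate = 0.1

def predict(features):
    """Return 1 if the weighted sum crosses zero, else 0."""
    total = bias + sum(w * x for w, x in zip(weights, features))
    return 1 if total > 0 else 0

# Instead of writing if-else rules, adjust the weights from mistakes.
for _ in range(20):
    for features, label in examples:
        error = label - predict(features)
        for i, x in enumerate(features):
            weights[i] += learning_rate * error * x
        bias += learning_rate * error

print([predict(f) for f, _ in examples])  # the rule is learned, not coded
```

After a few passes over the data, the perceptron classifies all four examples correctly even though no threshold was ever written by hand.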
One project my colleagues and I implemented for an energy company in the Philippines is a deep learning model that predicts future hourly power demand from variables such as historical demand, weather, and holidays. The model uses a long short-term memory (LSTM) neural network, an architecture inspired by the brain's memory system: it can remember important long-term information and forget unnecessary information. It improved prediction accuracy over the company's previous models, such as ARIMA, reducing error from 5% to 3%.
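As an illustration of the gating idea behind LSTMs, the toy sketch below steps a single scalar LSTM cell through a short, made-up sequence. The weights are arbitrary placeholders, not values from the actual demand model; a real forecasting model would use a trained, multi-unit LSTM from a framework such as TensorFlow or PyTorch.

```python
import math

# A toy sketch of one LSTM cell step, to illustrate the gating mechanism:
# gates decide what to forget and what to remember.
# Scalar weights are illustrative only, not a trained model.

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def lstm_step(x, h_prev, c_prev, w):
    """One step of a scalar LSTM cell.

    x: current input (e.g. this hour's scaled demand)
    h_prev, c_prev: previous hidden state and cell state (the "memory")
    w: dict of weights for the forget, input, candidate, and output gates
    """
    f = sigmoid(w["f"] * x + w["fh"] * h_prev)    # forget gate: keep old memory?
    i = sigmoid(w["i"] * x + w["ih"] * h_prev)    # input gate: store new info?
    g = math.tanh(w["g"] * x + w["gh"] * h_prev)  # candidate memory content
    o = sigmoid(w["o"] * x + w["oh"] * h_prev)    # output gate: expose memory?
    c = f * c_prev + i * g                        # updated long-term memory
    h = o * math.tanh(c)                          # updated hidden state
    return h, c

weights = {"f": 1.0, "fh": 0.5, "i": 1.0, "ih": 0.5,
           "g": 1.0, "gh": 0.5, "o": 1.0, "oh": 0.5}

# Feed a short, made-up sequence of scaled demand values through the cell.
h, c = 0.0, 0.0
for x in [0.2, 0.5, 0.9, 0.4]:
    h, c = lstm_step(x, h, c, weights)
print(round(h, 3), round(c, 3))
```

The forget gate multiplies the old cell state, so values near 1 preserve long-term information while values near 0 discard it; this is what lets an LSTM carry, say, last week's demand pattern across many hourly steps.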
More accurate predictions help power generators optimize their power plant operations and reduce operating costs.
Governments are realizing the value of AI as well. The Philippine Central Bank is developing an AI chatbot that can enhance customer experience by automatically addressing customer issues through SMS or social media chat. Meanwhile, the country's Department of Science and Technology uses AI to count mango trees automatically from satellite images, thus automating tree audits.
As models become more complex and more data are generated, AI training becomes more computation-intensive. With limited access to local hardware, organizations can rent computing capacity from cloud providers such as Amazon Web Services, Google Cloud Platform, Microsoft Azure, and IBM Cloud. Servers located around the globe can now be accessed through the internet. Like electricity or tap water, cloud computing is paid for on demand and scales with usage. A prototype can first be coded and trained on a 4-core server, then scaled up to a larger dataset on a 72-core server and beyond.
AI chips are also gaining traction. These are chips whose architectures are optimized to run AI models, especially deep learning. Because AI chips can run models quickly, they can be deployed on the "edge", near the sensors. Edge computing, running models on AI chips close to the sensors, reduces the delay between input and response. Quick responses are crucial in applications such as autonomous vehicles and worker safety. One company, Horizon Robotics in China, develops AI chips that have been used in real-time object recognition on CCTV cameras to tag construction workers who are not wearing proper protective equipment.
More data are being generated while computer hardware and AI algorithms continue to improve. Cloud computing makes computation accessible, while edge computing expands the use cases of AI. With these technological trends and increasing awareness of the value of AI, we can expect more organizations to adopt AI soon.