Artificial Intelligence

The AI Process. Source: Cook 2018.

Artificial Intelligence (AI) is intelligence demonstrated by machines that mimics human decision-making and problem-solving. The machine's intelligence is built up through an incremental learning process from data accumulated in the business environment.

Human intelligence consists of varied aspects – discriminating between objects, identifying people and places, understanding languages, recognizing sounds and music, logical analysis and mathematical computations. The list is endless. AI systems are being developed for specific goal-oriented applications covering some of these abilities.

Machine Learning, Big Data and Robotics are its key enablers. Neural networks, computing systems inspired by the workings of the human brain, help add human-like intelligence to computers. Examples include Google Street View, iPhone Face unlock and Walmart's automated supply chain. AI is still at a nascent stage, but is one of the most rapidly growing technologies in the world.


  • AI is not a new discovery. Why is there such a buzz around AI all of a sudden?

    A popular theme in science fiction and folklore, the field of AI research was founded as an academic discipline as early as 1956. There was thriving activity for two decades, with ambitious targets set for matching human intelligence.

    However, by the 1970s little of substance had been achieved: systems were heavily rule-based and computational power was poor. A bleak withdrawal period called the AI Winter followed, when research and funding slowed down.

    In the 21st century, several parallel developments renewed interest in AI. Exponential increases in processing speed, a major drop in memory costs, an upsurge of open source programming frameworks, and advances in voice and image processing techniques turned out to be game changers. The advent of big data and cloud computing made large-scale distributed deployments feasible at the company and national level.

    AI got its biggest boost when machine learning moved from rule-driven to data-driven algorithms. Historical data turned into a powerful enabler. Wireless networks, smart devices, IoT and developments in robotics have completed the circle. AI solutions have matured to the point of being cost-effective and practical to implement.

  • What are the major components of an AI solution?
    Components of AI solutions. Source: Costenaro 2018

    Listed below are the main components that act as building blocks in any modern day AI solution:

    • Machine Learning - Computers use statistical techniques to "learn" (progressively improve performance on a specific task) from data, without being explicitly programmed. ML is supported by large volumes of data called Big Data. The data can be historical or real-time, structured or unstructured, and is stored in distributed cloud computing and storage systems. ML also employs computing systems called artificial neural networks, inspired by how biological nervous systems and the brain process information. For example, image recognition: a system learns to identify a cat much as a child does, from examples.
    • Natural Language Processing - Ability to detect words and meanings from human voice in various spoken languages. Systems like voice assistants learn from past data and improve their language interpretation over time.
    • Robotics - The mechanical arm of AI that enables automated decisions to be implemented through robots, driverless cars, drones and other smart devices.
    • Speech and Vision - Enabling the computer with human-like senses of speech and vision using speech encoding and image recognition algorithms.
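The Machine Learning idea above — learning from labelled examples rather than from explicit rules — can be sketched in a few lines of Python. This is a toy 1-nearest-neighbour classifier; the fruit data (weight in grams, diameter in cm) is an illustrative assumption, not a real dataset:

```python
# Label a new point by copying the label of the closest training
# example. No classification rules are hand-coded: all "knowledge"
# comes from the labelled data.

def nearest_neighbour(train, new_point):
    """Return the label of the training example closest to new_point."""
    def dist(a, b):
        return sum((ai - bi) ** 2 for ai, bi in zip(a, b))
    features, label = min(train, key=lambda item: dist(item[0], new_point))
    return label

# Hypothetical training data: (weight in g, diameter in cm) -> label.
training_data = [
    ((150, 7), "apple"),
    ((160, 8), "apple"),
    ((120, 5), "lemon"),
    ((110, 5), "lemon"),
]

print(nearest_neighbour(training_data, (155, 7)))  # → apple
print(nearest_neighbour(training_data, (115, 5)))  # → lemon
```

Adding more labelled examples improves the classifier without changing a single line of code — the essence of "learning from data".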
  • What are the different approaches to AI?

    AI means empowering machines to act as humans would. This intelligence can be fed into machines in several ways. These approaches can be grouped into two broad categories:

    • Symbolic, rule-based approach – Expert Systems – Artificial General Intelligence: The goal is to produce general human-like intelligence, not intelligence specific to a given problem. Symbolic representations are used to cover responses to unplanned, uncertain situations, weighing social and emotional aspects as well as logic. This approach was prevalent in the early years (1960s-1990s) with the development of fuzzy logic and expert systems. IBM's Deep Blue chess program, which beat Garry Kasparov in 1997, is a good example of this approach.
    • Statistical, data-based approach – Machine Learning – Narrow/Weak AI: With the advent of Machine Learning, the approach to AI changed. ML employs statistical modelling algorithms on historical data and derives "intelligence" from experience. This method is effective for specific business problems, hence the name Narrow AI. The goal is to build AI by consolidating intelligence from different scenarios. Disease detection and drug testing on patients in the medical field are prominent applications of this approach.
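A minimal sketch can contrast the two approaches on the same toy task, deciding whether an email is spam. The thresholds and data below are illustrative assumptions, not a real spam filter:

```python
# Symbolic, rule-based approach: a human expert encodes the knowledge
# directly as rules.
def spam_by_rules(num_links, has_greeting):
    return num_links > 3 and not has_greeting

# Statistical, data-driven approach: the decision threshold is *derived*
# from labelled historical data instead of being written by hand.
def learn_threshold(examples):
    """Pick the link-count threshold that best separates the labels."""
    best_t, best_correct = 0, -1
    for t in range(0, 11):
        correct = sum((links > t) == is_spam for links, is_spam in examples)
        if correct > best_correct:
            best_t, best_correct = t, correct
    return best_t

# Hypothetical history: (number of links, was it spam?)
history = [(0, False), (1, False), (2, False), (5, True), (7, True), (9, True)]
t = learn_threshold(history)
print(t)      # → 2 (learned from data)
print(5 > t)  # → True: a new email with 5 links is classified as spam
```

The rule-based version encodes the expert's judgement once; the data-driven version improves automatically as more labelled history accumulates.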
  • Which are the common open source frameworks to build AI solutions?

    One of the main reasons for the explosive growth in AI research and development in recent times is the availability of reliable, efficient open source programming frameworks for building AI solutions. Not just major IT corporations but also university projects and small startups are joining the AI bandwagon because of this. Some widely used programming languages and frameworks are listed below:

    • Python and R Programming Languages - Basis for almost all ML algorithm implementations powering AI solutions. They have a strong developer community and excellent package library support for most applications. Python is the base over which several ML/DL and NLP frameworks are built.
    • TensorFlow Deep Learning Framework - Python-based computational framework from Google for building Deep Learning models. Widely used for building hierarchical ML applications and deep belief nets. It's used for translation in Google Translate, OCR in WPS Office, and fraud detection at PayPal.
    • PyTorch - Deep Learning framework from Facebook. Major vendors like Microsoft and Amazon are expected to provide complete support for the framework across their cloud products.

    Keras, Theano, and MS CNTK are some other frameworks.

  • What are the applications of AI that are currently in the market?

    AI applications have pervaded all areas of work and life. From the mobile phone in hand to futuristic space research, AI is everywhere. Applications by domain are listed below:

    • Smartphone Apps and Utilities - Google Street View, iPhone Face unlock, voice assistants.
    • Retail - Automated supply chains, in-store product image scanning, automated billing, customized shopping experiences.
    • Healthcare - Early disease detection, preventive diagnostics, automated drug trials.
    • Financial Services - Bots on the stock market, fraud detection, estimating customer credit worthiness.
    • Auto Industry - Vehicle preventive maintenance, driverless cars, delivery drones.
    • Fashion - Customized clothing for buyers, smart inventory management.
    • Human Resources - Resume screening and talent acquisition process automation, personalised training and on-boarding for employees.
  • What are the technical challenges associated with AI research and development?
    AI-Technology Challenges. Source: Deloitte. 2017.
    • Computing Power Limitations - Processing speeds, distributed computing abilities and large-scale memory storage are growing exponentially. But demands are even greater, especially with deep learning applications. Only large corporations can afford such resources, which limits the scope for startups.
    • Skilled Manpower - AI applications require niche computer science skills such as ML, NLP, image processing, robotics, and data science. Since most of these fields of study are still evolving, finding high-quality manpower is a challenge. Reskilling engineers from traditional programming or design skills to new-age AI skills is the need of the hour.
    • Data Security - Data is the biggest wealth of an organization today. With more and more private business data coming online, it becomes vulnerable to external misuse and attacks. Companies need to focus on data security and network frameworks to protect their data.
  • What are the social and economic risks associated with AI technology?

    However promising and revolutionary AI may sound, it comes saddled with several risks impacting society and livelihoods. Key risks are:

    • Fear of widespread job loss - Since AI brings a new set of technologies to the fore, several IT and business skills such as manual testing, customer support, traditional programming, and managerial tasks are becoming obsolete. In the wider world, the impact is even more pronounced. Autonomous cars replace human drivers. Drone deliveries replace home delivery staff. AI and robotics replace several manufacturing jobs. The impact on healthcare, financial services and retail is expected to be high.
    • Bias due to bad data - AI systems are only as good or bad as the data fed into them. If companies and governments feed biased data with malicious intent, results can be disastrous. Self-regulation is the only method in practice. Stricter legislation and tight international data policing rules are needed.
    • Over-reliance on digital devices - Smart devices running AI are invading organizations, homes, and government institutions. Lifestyle deficiencies such as exposure to wireless radiation, excessive time spent with gadgets, and lack of basic computational and social skills are worrying.



Arthur Samuel writes the first computer learning program. This is a game of checkers that runs on an IBM machine. It improves its game every time it plays. Samuel is also credited for coining the term Machine Learning (ML).


Frank Rosenblatt designs the first neural network for computers, called the perceptron, which simulates the thought processes of the human brain.
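The perceptron idea can be sketched in plain Python: a single artificial "neuron" adjusts its weights from labelled examples using the classic perceptron update rule. The AND-gate data, epoch count and learning rate below are illustrative assumptions:

```python
# A minimal perceptron: learn weights and a bias so that a weighted sum
# of the inputs separates the two classes.

def train_perceptron(samples, labels, epochs=20, lr=1):
    """Learn weights and a bias with the perceptron update rule."""
    w = [0] * len(samples[0])
    b = 0
    for _ in range(epochs):
        for x, y in zip(samples, labels):             # y is 0 or 1
            activation = sum(wi * xi for wi, xi in zip(w, x)) + b
            error = y - (1 if activation > 0 else 0)  # -1, 0 or +1
            w = [wi + lr * error * xi for wi, xi in zip(w, x)]
            b += lr * error
    return w, b

def predict(w, b, x):
    return 1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else 0

# Learn the logical AND function, which is linearly separable.
X = [(0, 0), (0, 1), (1, 0), (1, 1)]
y = [0, 0, 0, 1]
w, b = train_perceptron(X, y)
print([predict(w, b, x) for x in X])  # → [0, 0, 0, 1]
```

A single perceptron can only learn linearly separable functions; stacking many such neurons in layers is what later gave rise to modern deep neural networks.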


Students at Stanford University invent the robotic Stanford Cart. This can navigate obstacles in a room on its own.


Through the 1970s and into the 1980s, AI sees a subdued period known as the AI Winter. Interest and development dwindle, people become pessimistic about its chances of success, and funding drops. Research continues but AI steps out of the limelight.


In the 1990s, work on machine learning shifts from a knowledge-driven approach to a data-driven approach.


In a contest of man vs machine, IBM's Deep Blue beats the world champion at chess. This showcases computing capability to tackle complex calculations for drug discovery, large database searches, broad financial modelling, and more. In February 2011, another IBM machine named Watson beats humans at another game, called Jeopardy! Computers can now process and reason about natural languages, heralding rich human-computer interactions for the future.


Geoffrey Hinton coins the term Deep Learning (DL) for artificial neural networks using many layers of neurons. With innovations happening in the Computer Vision space, DL algorithms enable computers to "see" and distinguish objects and text in images and videos.


Apple includes Siri as a built-in digital assistant in its iPhone 4S. Siri triggers a new age of automation, personalization and AI-driven assistants.


Many major automotive manufacturers, including General Motors, Ford, Mercedes Benz, Volkswagen, Audi, Nissan, Toyota, BMW, and Volvo, start testing driverless car systems. Tesla Motors announces its first version of AutoPilot. Model S cars equipped with this system are capable of lane control with autonomous steering, braking, automated parking and speed limit adjustment.


Facebook releases DeepFace, a software algorithm that is able to recognize or verify individuals in photos to the same level of accuracy as humans.


Over 3,000 AI and Robotics researchers, endorsed by Stephen Hawking, Elon Musk and Steve Wozniak (among many others), sign an open letter warning of the danger of autonomous weapons that can select and engage targets without human intervention.


Apple CEO Tim Cook says, "We are focussing on autonomous systems". He describes Apple's secretive Project Titan as "the mother of all A.I. projects".


  1. Anuradha C. 2018. "How is Big Data empowering Artificial Intelligence: 5 essentials you need to know." YourStory, February 27. Accessed 2019-05-26.
  2. AppleInsider. 2020. "Siri: Assistant timeline, history, and features." AppleInsider, July 17. Accessed 2020-07-24.
  3. Bachinskiy, Arthur. 2019. "The Growing Impact of AI in Financial Services: Six Examples." Towards Data Science, via Medium, February 21. Accessed 2019-05-26.
  4. Chowdhry, Amit. 2014. "Facebook's DeepFace Software Can Match Faces With 97.25% Accuracy." Forbes, March 18. Accessed 2020-07-24.
  5. Cook, Kimberly. 2018. "Top 10 Predictions For AI, Big Data, And Analytics in 2018-19." HouseOfBots, December 10. Accessed 2019-05-26.
  6. Costenaro, Dave. 2018. "Preparing for Artificial Intelligence." Becoming Human: Artificial Intelligence Magazine, via Medium, January 09. Accessed 2019-06-15.
  7. D'Souza, Rhett. 2018. "Symbolic AI v/s Non-Symbolic AI, and everything in between?" Data Driven Investor, via Medium, October 19. Accessed 2019-05-26.
  8. DataRobot. 2019. "What does Artificial Intelligence mean?" DataRobot Wiki. Accessed 2019-05-26.
  9. Deloitte. 2017. "Artificial Intelligence for the real world." HBR.Org. Accessed 2019-05-26.
  10. Dickson, Ben. 2017. "4 challenges Artificial Intelligence must address." The Next Web, February 27. Accessed 2019-05-26.
  11. Dickson, Ben. 2018. "What is the AI winter?" TechTalks, November 12. Accessed 2019-05-26.
  12. Evry. 2019. "Whitepaper - The New Wave of Artificial Intelligence." Accessed 2019-05-26.
  13. Fagella, Daniel. 2019. "Artificial Intelligence in Retail – 10 Present and Future Use Cases." Emerj, March 28. Accessed 2019-05-26.
  14. Genç, Özgür. 2019. "Notes on Artificial Intelligence, Machine Learning and Deep Learning for curious people." Towards Data Science, via Medium, January 26. Accessed 2019-05-26.
  15. Gibbs, Samuel. 2015. "Musk, Wozniak and Hawking urge ban on warfare AI and autonomous weapons." Guardian News & Media Limited, July 27. Accessed 2019-05-26.
  16. Gill, Jagreet Kaur. 2019. "Overview of Artificial Intelligence, Deep Learning and NLP in Big Data." XenonStack, January 21. Accessed 2019-05-26.
  17. Google AI. 2019. "Geoffrey E. Hinton." People, Google AI. Accessed 2019-05-26.
  18. Hawkins, Andrew. 2017. "How Tesla changed the auto industry forever." The Verge, July 28. Accessed 2019-05-26.
  19. Hill, Simon. 2019. "As Google races ahead, where's Apple's A.I. strategy?" Digital Trends, June 05. Updated 2019-06-09. Accessed 2019-05-26.
  20. IBM. 2012. "Deep Blue." Icons of Progress, IBM 100, March 7. Accessed 2020-07-24.
  21. Kriesel, David. 2005. "A Brief Introduction to Neural Networks." Accessed 2019-05-26.
  22. Kumar, Chethan GN. 2018. "Artificial Intelligence: Definition, Types, Examples, Technologies." Medium, August 31. Accessed 2019-05-26.
  23. Marr, Bernard. 2016. "A Short History of Machine Learning -- Every Manager Should Read." Forbes, February 19. Accessed 2020-07-24.
  24. Marr, Bernard. 2019. "The Biggest Business and Social Challenges For AI." Accessed 2019-05-26.
  25. Moravec, Hans P. 1990. "The Stanford Cart and the CMU Rover." Chapter in Cox I.J., Wilfong G.T. (eds), Autonomous Robot Vehicles, Springer, New York. Accessed 2019-05-26.
  26. NVidia. 2019. "Deep Learning For Retail." NVidia Corporation. Accessed 2019-05-26.
  27. Patrizio, Andy. 2018. "Big Data vs. Artificial Intelligence." Datamation, May 30. Accessed 2019-05-26.
  28. Prabhu, Manish. 2018. "Security and Privacy in Artificial Intelligence and Machine Learning — Part 1: Lay of the Land." Towards Data Science, via Medium, July 29. Accessed 2019-05-26.
  29. Puget, JeanFrancois. 2016. "What Is Machine Learning?" IBM Community. Accessed 2019-05-26.
  30. Ruuse, Liisi. 2017. "Artificial Intelligence: Everything You Want to Know." Scoro, August 04. Accessed 2019-05-26.
  31. Shetty, Sunith. 2018. "What is PyTorch and how does it work?" Packt Publishing Limited. Accessed 2019-05-26.
  32. TensorFlow. 2019. "Why TensorFlow." TensorFlow Case Studies. Accessed 2019-05-26.
  33. Tiempo. 2019. "Artificial Intelligence’s (AI) Biggest Challenges in Technology [2019]." Tiempo Development, March 19. Accessed 2019-05-26.
  34. Wislow, Eva. 2017. "Top 5 ways to use artificial intelligence (AI) in human resources." Big Data Made Simple, Crayon Data, October 24. Accessed 2019-05-26.

Further Reading

  1. Anuradha C. 2018. "How is Big Data empowering Artificial Intelligence: 5 essentials you need to know." YourStory, February 27. Accessed 2019-05-26.
  2. Dickson, Ben. 2018. "What is the AI winter?" TechTalks. Accessed 2019-05-26.
  3. Redmore, Seth. 2019. "Machine Learning for Natural Language Processing." Accessed 2019-05-26.
  4. Shaleynikov, Anton. 2018. "10 Best Frameworks and Libraries for AI." Accessed 2019-05-26.
  5. Delipetrev, Blagoj, Chrisa Tsinaraki, and Uroš Kostić. 2020. "Historical Evolution of Artificial Intelligence." EUR 30221 EN, Joint Research Centre, European Commission. doi: 10.2760/801580. Accessed 2020-11-22.

Cite As

Devopedia. 2020. "Artificial Intelligence." Version 9, November 22. Accessed 2020-11-24.