Industrial Revolution and Artificial Intelligence

DOI: 10.17577/IJERTCONV5IS01087




Chanda Chouhan, Department of Information Technology, Atharva College of Engineering, Maharashtra, India

Supriya Mandhare, Department of Computer Engineering, Atharva College of Engineering, Maharashtra, India

Snehal Kathale, Department of Information Technology, Atharva College of Engineering, Maharashtra, India

Abstract: Of the many assorted and riveting challenges we face today, the most intense and important is how to understand and shape the new technology revolution, which entails nothing less than a transformation of humanity. We are at the beginning of a revolution that is fundamentally changing the way we live, work, and relate to one another. In its scale, scope and complexity, the fourth industrial revolution is unlike anything humankind has experienced before. AI-driven automation will continue to create wealth and expand the economy in the coming years, but, while many will benefit, that growth will not be costless: it will be accompanied by changes in the skills that workers need to succeed in the economy and by structural changes in the economy itself. Aggressive policy action will be needed to help workers adjust to these changes.

Keywords: AI, problem solving, critical thinking.

  1. INTRODUCTION

    The fourth industrial revolution is conceptualized as an upgrade on the third revolution and is marked by a fusion of technologies spanning the physical, digital and biological worlds. In his article "The Fourth Industrial Revolution: what it means, how to respond", Schwab says that three things about the ongoing transformation mark it out as a new phase rather than a prolongation of the current revolution: velocity, scope, and systems impact. The speed of change is utterly unprecedented, it is disrupting almost every industry in every country, and it heralds the transformation of entire systems of production, management, and governance. There are both opportunities and challenges: like the earlier industrial revolutions, it can lift global incomes and improve lives worldwide, and the supply-side miracle due to technological innovation will lead to long-term gains in efficiency and productivity. AI is the science and engineering of making intelligent machines, especially intelligent computer programs, where intelligence is the computational part of the ability to achieve goals in the world.

    An intelligent machine can be a machine that mimics the way humans think, feel, move and make decisions. It could also act in conjunction with a human to complement and improve their ability to do those things. There are many possible approaches to the challenge, and the definition has never had a static solution. Even the name 'Artificial Intelligence' has been subject to argument, as some researchers feel it sounds unscientific. They argue the word 'artificial' suggests lesser or fake intelligence, more like science fiction than academic research. They prefer to use terms like computational neuroscience, or to emphasize the particular subset of the field they work in, such as semantic logic or machine learning. This paper does

    not attempt to come up with a precise characterization of the field. Instead, it examines what Artificial Intelligence has been so far by leading the reader through an admittedly non-comprehensive collection of projects and paradigms. Unlike many fields, Artificial Intelligence has not had a linear progression, and its research and breakthroughs have not grown toward an easily identified sun. The path of AI, rather, resembles the intertwining World Wide Web, spiraling out and looping back in many directions.

    Recent progress in Artificial Intelligence (AI) has brought renewed attention to questions about the automation driven by these advances and its impact on the economy. The current wave of progress and enthusiasm for AI has been driven by three mutually reinforcing factors: the availability of big data from sources including e-commerce, businesses, social media, science, and government, which provided raw material for dramatically improved machine learning approaches and algorithms, which in turn relied on the capabilities of more powerful computers. During this period, the pace of improvement has surprised even AI experts. [1]

  2. LITERATURE SURVEY

The industrial revolution is divided into three phases: machines and labor (1760-1840), the technical revolution (1840-1920), and the technological revolution (1920 onwards).

  1. First phase: machines and labor (1760-1840): The industrial revolution changed the existence of humans and accelerated the rhythm of history. It was not very fast; it occurred slowly and needed time. Technological inventions: the precursors of the industrial revolution were watches; watchmaking gave British engineers the idea of constructing machines of high precision.

  2. Second phase: the technical revolution (1840-1920): The technical progress and economic development created by the industrial revolution began to transform the habits and relationships between people. New communication and transportation methods shortened distances.

  3. Third phase: the technological revolution (1920 onwards): We can locate this phase as beginning around 1920. Around this time, aviation and space exploration received a big push. The media of communication, such as radio, television, film, the telephone and informatics, developed, as did the means of transportation. Some of the characteristics of this time led it to be called the era of Industrial Automation, or the empire of programmed machines.

    Figure (1) stages of revolutions

    The Fourth Industrial Revolution has the potential to raise global income levels and improve the quality of life for populations around the world. To date, those who have gained the most from it have been consumers able to afford and access the digital world; technology has made possible new products and services that increase the efficiency and pleasure of our personal lives. Ordering a cab, booking a flight, buying a product, making a payment, listening to music, watching a film, or playing a game: any of these can now be done remotely.

    In the future, technological innovation will also lead to a supply-side miracle, with long-term gains in efficiency and productivity. Transportation and communication costs will drop, logistics and global supply chains will become more effective, and the cost of trade will diminish, all of which will open new markets and drive economic growth.[2]

    The Man vs. Machine debate is never far from the news agenda, but the next machine age is no longer the future; it is inarguably the present. The Fourth Industrial Revolution, the overarching and most imperative theme at Davos this year, is being driven by the automation and augmentation of knowledge-based work. By creating new ways to deploy virtual labor to automate knowledge-based tasks, we are restructuring the way that humans and machines live and work in tandem to create a better, stronger digital economy.

    AI and automation are quickly being used to enhance human creativity through cognitive technology, enabling innovation across a broad range of industries and bringing with them three fundamental changes: new opportunities through infinite data, efficiencies through self-learning, and the ability to bring machine interaction even closer to human interaction. In isolation, each of these is a momentous change as a standalone topic, but we are seeing them happening simultaneously. This means that the collective impact has created, and will continue to create, challenges that will need to be mitigated in advance. [3]

  3. APPLICATIONS

Complicated activities, including making medical diagnoses, predicting when machines will fail or gauging the market value of certain assets, involve thousands of data sets and non-linear relationships between variables. In these cases, it is difficult to use the data we have to best effect to optimise our predictions. In other cases, including recognising objects in images and translating languages, we cannot even develop rules to describe the features we are looking for. What if we could transfer the difficulty of making complex predictions (the data optimisation and feature specification) from the programmer to the program? This is the promise of modern artificial intelligence.
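As a minimal illustrative sketch of this idea in Python (the sensor features, the synthetic failure-time data and the choice of a random forest are assumptions made for the example, not taken from the paper), the program, rather than the programmer, searches the data for a non-linear relationship that would be awkward to encode by hand:

import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split

# Invented sensor readings for a machine-failure example: temperature, vibration
# and load interact non-linearly to determine the hours until the machine fails.
rng = np.random.default_rng(0)
X = rng.uniform(0, 1, size=(2000, 3))
hours_to_failure = (500
                    - 300 * X[:, 0] * X[:, 1]
                    - 100 * np.sin(3 * X[:, 2])
                    + rng.normal(0, 10, 2000))

X_train, X_test, y_train, y_test = train_test_split(
    X, hours_to_failure, test_size=0.25, random_state=0)

# The programmer never writes down the non-linear relationship;
# the model searches for it in the training data.
model = RandomForestRegressor(n_estimators=200, random_state=0)
model.fit(X_train, y_train)
print(f"R^2 on held-out sensor readings: {model.score(X_test, y_test):.3f}")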

  4. ROLE OF AI

Figure (2) navigation of revolution

Figure (3) evolution of AI

Artificial Intelligence (AI)-driven machines are changing the way we structure new jobs and human effort. In the last two decades, we have seen significant changes in how people and businesses connect through technology and IT. Estimates of the Internet's contribution to GDP vary, but the general consensus is that globally it is worth more than sectors such as education, agriculture and energy. Artificial Intelligence (AI) is helping to clear an easier path to the future; in short, impending digital technology has huge implications for the very structure of the global economy, with the pros far outweighing the cons.

Machine learning: offloading optimization

Basic AI has existed for decades, via rules-based programs that deliver elementary displays of intelligence in specific contexts. Progress, however, has been limited, because algorithms to tackle many real-world problems are too complex for people to program by hand.

Machine learning (ML) is a sub-set of AI. All machine learning is AI, but not all AI is machine learning (Figure 1, above). Interest in AI today reflects enthusiasm for machine learning, where advances are rapid and significant. Machine learning lets us tackle problems that are too complex for humans to solve by shifting some of the burden to the algorithm. As AI pioneer Arthur Samuel wrote in 1959, machine learning is 'the field of study that gives computers the ability to learn without being explicitly programmed'. The goal of most machine learning is to develop a prediction engine for a particular use case. An algorithm will receive information about a domain (say, the films a person has watched in the past) and weigh the inputs to make a useful prediction (the probability of the person enjoying a different film in the future). By giving computers the ability to learn, we mean passing to the algorithm the task of optimisation: weighing the variables in the available data to make accurate predictions about the future. Sometimes we can go

further, offloading to the program the task of specifying the features to consider in the first place. Machine learning algorithms learn through training. An algorithm initially receives examples whose outputs are known, notes the difference between its predictions and the correct outputs, and tunes the weightings of the inputs to improve the accuracy of its predictions until they are optimised. The defining characteristic of machine learning algorithms, therefore, is that the quality of their predictions improves with experience. The more data we provide (usually up to a point), the better the prediction engines we can create (Figures 2 and 3, below; note that the size of data sets required is highly context-dependent, and we cannot generalise from the examples below).
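As a minimal sketch of the training procedure just described (the film-preference data, the feature names and the logistic form are illustrative assumptions, not taken from the paper), the following Python snippet repeatedly compares its predictions with known outcomes and tunes the input weightings to shrink the difference:

import numpy as np

# Made-up training examples: for each film, (share of similar films already watched,
# average rating the viewer gave that genre); y records whether the viewer enjoyed it.
rng = np.random.default_rng(1)
X = rng.uniform(0, 1, size=(200, 2))
y = (0.8 * X[:, 0] + 0.6 * X[:, 1] + rng.normal(0, 0.1, 200) > 0.7).astype(float)

w = np.zeros(2)        # the weightings of the inputs, tuned during training
b = 0.0
learning_rate = 0.5

def predict(X, w, b):
    # Probability that the viewer enjoys each film (a simple logistic prediction engine).
    return 1.0 / (1.0 + np.exp(-(X @ w + b)))

for epoch in range(500):
    p = predict(X, w, b)
    error = p - y                                  # difference between predictions and known outputs
    w -= learning_rate * (X.T @ error) / len(y)    # tune the weightings to reduce that difference
    b -= learning_rate * error.mean()

accuracy = ((predict(X, w, b) > 0.5) == y).mean()
print(f"Training accuracy after tuning the weights: {accuracy:.2f}")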

There are more than 15 approaches to machine learning, each of which uses a different algorithmic structure to optimise predictions based on the data received. One approach, deep learning, is delivering breakthrough results in new domains, and we explore this below. But there are many others which, although they receive less attention, are valuable because of their applicability to a broad range of use cases. Some of the most effective machine learning algorithms beyond deep learning include: [4]

  • random forests that create multitudes of decision trees to optimise a prediction;

  • Bayesian networks that use a probabilistic approach to analyse variables and the relationships between them; and

  • support vector machines that are fed categorised examples and create models to assign new inputs to one of the categories.

Each approach has its advantages and disadvantages and combinations may be used (an ensemble approach). The algorithms selected to solve a particular problem will depend on factors including the nature of the available data set. In practice, developers tend to experiment to see what works.
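The sketch below illustrates this kind of experimentation with scikit-learn (the synthetic data set and the model settings are assumptions made for the example; a Gaussian naive Bayes classifier stands in for the broader family of Bayesian, probabilistic models, since scikit-learn does not ship a full Bayesian-network learner):

from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier, VotingClassifier
from sklearn.model_selection import cross_val_score
from sklearn.naive_bayes import GaussianNB
from sklearn.svm import SVC

# Illustrative synthetic data standing in for a real, labelled business data set.
X, y = make_classification(n_samples=1000, n_features=20, n_informative=8,
                           random_state=0)

forest = RandomForestClassifier(n_estimators=200, random_state=0)  # many decision trees
bayes = GaussianNB()                                # a simple probabilistic (Bayesian) model
svm = SVC(probability=True, random_state=0)         # assigns inputs to one of the categories
ensemble = VotingClassifier(                        # a combination of the three approaches
    estimators=[("rf", forest), ("nb", bayes), ("svm", svm)], voting="soft")

# In practice, developers try each candidate on the available data
# and keep whichever predicts best.
for name, model in [("random forest", forest), ("naive Bayes", bayes),
                    ("support vector machine", svm), ("voting ensemble", ensemble)]:
    scores = cross_val_score(model, X, y, cv=5)
    print(f"{name}: mean cross-validated accuracy {scores.mean():.3f}")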

Use cases of machine learning vary according to our needs and imagination. With the right data we can build algorithms for myriad purposes, including: suggesting the products a person will like based on their prior purchases; anticipating when a robot on a car assembly line will fail; predicting whether an email was mis-addressed; estimating the probability of a credit card transaction being fraudulent; and many more. Even with general machine learning (random forests, Bayesian networks, support vector machines and more), it is difficult to write programs that perform certain tasks well, from understanding speech to recognising objects in images. Why? Because we cannot specify the features to optimise in a way that is practical and reliable. If we want to write a computer program that identifies images of cars, for example, we cannot specify the features of a car for an algorithm to process that will enable correct identification in all circumstances. Cars come in a wide range of shapes, sizes and colours. Their position, orientation and pose can differ. Background, lighting and myriad other factors affect the appearance of the object. There are too many variations to write a set of rules. Even if we could, it would not be a scalable solution. We would need to write a program for every type of object we wanted to identify. [5]
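To make the contrast concrete, here is a small sketch under stated assumptions: scikit-learn's built-in 8x8 handwritten-digit images stand in for the car-image example, and a modest neural network is used as the learner. No rules about loops, strokes or edges are ever written by hand; the model learns the distinguishing features from labelled pixels.

from sklearn.datasets import load_digits
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

# Labelled 8x8 images of handwritten digits: pixels in, digit labels out.
digits = load_digits()
X_train, X_test, y_train, y_test = train_test_split(
    digits.data, digits.target, test_size=0.25, random_state=0)

# A small neural network learns which pixel patterns distinguish the digits;
# the programmer never specifies features such as "a zero has a closed loop".
model = MLPClassifier(hidden_layer_sizes=(64,), max_iter=1000, random_state=0)
model.fit(X_train, y_train)
print(f"Accuracy on unseen digit images: {model.score(X_test, y_test):.3f}")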

  5. CONCLUSION

Although the opportunities this presents should generate excitement rather than fear, the socio-economic impact cannot be ignored. We are moving into a world where the traditional way in which wealth is distributed will shift into uncharted territory, and it rests on key policy makers to ensure that existing arbitrary measurements are brought into this new machine age to reflect the needs of an increasingly automated society. Although the conversations taking place in Davos this week will spark a global call to action, they are only the starting point of a much larger imperative in which industry leaders and technology pioneers must continue to create the right platforms and solutions that will drive success for all constituents in today's digital economy. [6]

REFERENCES

[1] projects.csail.mit.edu/films/aifilms/AIFilms

[2] https://pdfs.semanticscholar.org/b2ad/d02d066a1b91810284e549d80fc71df357d3.pdf

[3] https://www.whitehouse.gov/sites/whitehouse.gov/files/documents/Artificial-Intelligence-Automation-Economy.PDF

[4] https://medium.com/mmc-writes/the-fourth-industrial-revolution-a-primer-on-artificial-intelligence-ai-ff5e7fffcae1#.6dh0qeelm

[5] www.psfk.com/…/how-the-fourth-industrial-revolution-will-impact-our-everyday-live..

[6] enap2016.weebly.com/uploads/5/5/1/9/55194205/cappa_fv_2016-1.pdf
