How should engineers and technology leaders approach their work and reconcile the impact that robotics will have on production line workers? What’s more, as robots are introduced to handle repetitive procedures (from routine surgeries to meal preparation and even hotel room service delivery), what ethical considerations arise for engineers who develop and facilitate the adoption of robotics and automated systems?

To explore these questions (and others as part of a series of articles), Engineering360 Editorial Director David Wagman reached out to Massoud Amin, doctor of science and professor of electrical and computer engineering at the University of Minnesota.

(Read Part 1 of this series, "In conversation: A framework for assessing new technology.")

Amin is widely credited as the father of the smart electric power grid and is a leader in cyber-physical security who directed all security-related R&D for North American utilities after the 9/11 tragedies. Amin also directed the Technological Leadership Institute at the University of Minnesota from 2003 until late 2018. He is an IEEE fellow who chaired IEEE Smart Grid, and he is also an ASME fellow. He holds degrees from the University of Massachusetts-Amherst and from Washington University.

Massoud Amin: As the late Peter Drucker, the father of modern management science, noted: “If you can’t measure it, you can’t improve it.” Or, put another way, “you can’t manage what you can’t measure.”

Dr. Massoud Amin, University of Minnesota

To be sure, one cannot know whether he or she is successful unless success is defined and also tracked. As engineers, we learn that if we know what to sense/measure, and if we can do so with safety and performance/cost constraints in mind, then we can manage it.

However, while working for the U.S. Air Force in the 1980s and 1990s, and then at the Electric Power Research Institute (EPRI) from 1998 to 2003, I learned a corollary: if you value/price it, you can manage it even better!

The importance of knowing what and how to measure (and also how to attach value) extends to nearly all aspects of our lives — from toxic air and water emissions, to global technology and everyday business developments.

However, as my colleague Michael Wright points out in a forthcoming article, the inherent complexity of our emerging technologies, combined with exponential rates of adoption, means that the consequences of our engineering and development decisions are no longer entirely obvious. His message is a sobering one: We may no longer be capable of seeing the results of our actions over time; as a result, we truly are blind when it comes to scaling up technology.

Let's explore this further.

Science and technology as fuel

In 2003, my mentor at EPRI, Dr. Chauncey Starr, and I introduced the concept and methodologies of what we termed Global Transition Dynamics. The concept holds that politics, society, health, the environment, education and social development are propelled and held together by judicious advancements in science and technology.

Robot based on a design by Leonardo da Vinci. Source: Erik Möller. Leonardo da Vinci. Mensch - Erfinder - Genie exhibit, Berlin 2005

Engineering and science play critical roles in transforming societies; they enhance the quality of life, grow economies and serve societies. Understanding this critical role allows us to assess broad technology megatrends such as the information revolution (including computational sciences, software, hardware, VLSI, GPS, the internet, Wi-Fi, 5G and beyond, AI, ML and quantum computing), materials advances and the new genetics.

Assessment is crucial. To be sure, businesses, organizations and institutions must make hard, deliberate choices. They cannot try to do everything because then nothing will result. Outcomes become boring and normal; I call this “vanilla” or “peanut butter” innovation.

To avoid bland outcomes, Starr and I used a macroeconomic rationale that says gross domestic product (GDP) is a function of investments in and availability of pertinent capabilities in research and development (R&D), physical capital and human capital.
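One way to make that rationale concrete is a stylized aggregate production function; the form below is an illustrative assumption, not necessarily the specific model used in the EPRI work. Here GDP is written as output Y, driven by R&D-dependent productivity A(R), physical capital K and human capital H:

```latex
% Stylized, illustrative production function (assumed form, not from the source):
%   Y = GDP, R = R&D investment, K = physical capital, H = human capital
\[
  Y = A(R)\,K^{\alpha}H^{1-\alpha}, \qquad 0 < \alpha < 1,
  \qquad \frac{\partial A}{\partial R} > 0 .
\]
```

Under a form like this, output rises with all three inputs, which is the sense in which GDP is a function of investment in R&D, physical capital and human capital.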

However, success is tied not only to a good assessment of new technologies, but also to an understanding of business trends, ideology, principles, culture, policy and what are known as bifurcation points. A bifurcation point is the moment at which the behavior of a complex dynamic system changes qualitatively as it evolves over time.

For example, at what point does a business take off, remain at a “normal” growth rate for its sector, become a platform technology, an exponentially adopted technology or even decline?

Most importantly, our communities and the world face unprecedented challenges in scaling of technologies in an ethical and beneficial way. As the German-American philosopher Hans Jonas pointed out half a century ago, “technological progress requires ethical progress or we risk the destruction of society.”

The why?

Speed, innovation, anxiety, revolution, brand strategy and exponential adoption rates are some of the terms that Michael Wright, author of The New Business Normal, uses to characterize today’s business climate. If one understands the business climate and applies a macroeconomic rationale, then a greater chance exists of reaching a successful decision. Wright and I have collaborated for the last 15 years, and we recently have focused on identifying the scaling risks inherent in the rapid adoption of emerging technologies.

Engineers and scientists have a responsibility to share that understanding not just with policy makers but also with concerned citizens. Source: BMW Werk Leipzig

Wright notes that for the first time in history we may be on the cusp of discovering how to build models that will allow us to explore the ramifications of our decisions.

However, as Jonas noted nearly 50 years ago, from a moral perspective we also are responsible both for the decisions and the outcomes. This presents an ongoing challenge to identify what constitutes acceptable moral behavior as well as our subsequent ethical responsibility as engineers to others.

This ethical responsibility has long been recognized. Engineers and scientists have special social responsibilities because, by virtue of their training and specialization, they know far more about the dangers of technology, and how to reduce them, than does the average layperson. They have a responsibility to share that understanding not just with policy makers but also with concerned citizens.

Diagrams and equations

A few examples illustrate this point and show that ethical questions are not limited to our era. In February 1931, Albert Einstein spoke at Caltech and told his audience, as reported the next day by The New York Times, that “concern for man himself and his fate must always constitute the chief objective of all technological endeavors.” Einstein cautioned the audience to “never forget this in the midst of your diagrams and equations.”

And in 1949, a committee of science advisors chaired by J. Robert Oppenheimer — and all of them veterans of the World War II nuclear project — questioned the morality of developing an even more powerful atomic bomb.

More recently, the use of genetic technologies without proper scientific procedures and ethical conduct has raised tangible concerns.

These examples illustrate that, ultimately, each of us is responsible to our fellow human beings and not just to the political leadership of the day or self-interests.

The how?

The approach that Chauncey Starr and I developed at EPRI uses frameworks, dashboards and assessment methodologies to identify business opportunities for industry, academia, non-governmental organizations or government.

The methodology first looks at dynamic scenarios coupled with technology needs in order to identify opportunities. Second, it looks at the same scenarios coupled with candidate strategies or applications. Third, the two views are fused to identify where needs and potential applications converge.
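A minimal sketch of that fusion step appears below; the scenarios, needs and applications named here are purely hypothetical placeholders, not examples taken from the EPRI methodology.

```python
# Illustrative sketch of the three-step fusion (all labels are hypothetical).

# Step 1: dynamic scenarios mapped to the technology needs they create.
scenario_needs = {
    "rapid electrification": {"grid-scale storage", "wide-area sensing"},
    "distributed generation": {"wide-area sensing", "edge control"},
}

# Step 2: the same scenarios mapped to candidate strategies or applications.
scenario_applications = {
    "rapid electrification": {"grid-scale storage", "demand response"},
    "distributed generation": {"edge control", "microgrid markets"},
}

# Step 3: fuse the two views; opportunities lie where needs and applications overlap.
opportunities = {
    scenario: scenario_needs[scenario] & scenario_applications.get(scenario, set())
    for scenario in scenario_needs
}

for scenario, overlap in sorted(opportunities.items()):
    print(f"{scenario}: {sorted(overlap)}")
```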

George Day’s Real-Win-Worth assessment also may be applied. Alongside several other filters, it asks whether the opportunity is real, whether we can win with it and whether the effort is worth it.
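Read as a screen, those three questions gate one another in sequence. The sketch below is a bare-bones illustration of that gating, not Day’s actual scoring method.

```python
# A toy Real-Win-Worth gate (illustrative only; Day's framework uses richer criteria).
def real_win_worth(is_real: bool, can_win: bool, worth_it: bool) -> str:
    """Return a go/no-go read from the three core questions, checked in order."""
    if not is_real:
        return "No-go: the market and product opportunity is not real."
    if not can_win:
        return "No-go: we are unlikely to win against competitors and execution risk."
    if not worth_it:
        return "No-go: the expected return does not justify the risk and effort."
    return "Go: the opportunity passes the Real-Win-Worth screen."

print(real_win_worth(is_real=True, can_win=True, worth_it=False))
```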

This sort of analysis is critical as we understand over time that technology is both the cause of many of the world's problems and the best hope for their cure. Social impacts and ethics are indeed key parts of the "value/cost" that we measure.

All the classical economists of the late 18th and early 19th centuries, including Adam Smith, David Ricardo and John Stuart Mill, stressed the importance of technological change. Thomas Malthus's pessimistic prediction that runaway population growth would lead to increasing misery, with the world's population expanding more rapidly than the food supply, proved wrong because technological change stayed ahead of population growth.

Today's prosperity is closely linked to technological advances. A high level of technology is critical to the world’s continued economic growth. It provides the knowledge to convert factors of production into goods and services. It also gives rise to an efficient division of labor, improves productivity and permits capital accumulation.

The Russian economist Nikolai Kondratiev (1892-1938) maintained that economic progress took place in long waves, each lasting about half a century. Each wave had periods of prosperity, recession, depression and recovery. Technological progress was not linear.

The Austrian economist Joseph Schumpeter connected technological innovations to these waves of economic growth. Schumpeter’s argument was that technological change was a series of explosions in which one set of technologies replaced another. The first period (1782-1845) brought major innovations in steam power and textiles; the second (1845-1892) brought major innovations in railroads, iron, coal and construction. The third period (1892-1948) featured major innovations in electrical power, automobiles, chemicals and steel.

Observations of the obvious

More recently, the American sociologist Daniel Bell proposed that the period since 1973 has been one of post-industrialism. In a post-industrial era, ideas matter more than the material forces that dominated previous periods of economic growth. In this era, information technology (IT) is the dominant force. Medical technology and genetics also have important roles to play, as does alternative energy. Other technologies worth including are artificial intelligence, the material sciences and nanotechnology.

One key question is how to balance it all, particularly in a “merit-based” competitive system of rewards/promotion that “values” inventions and innovations. The balancing problem may be even more challenging as robotics and artificial intelligence flourish and begin to exhibit the very human characteristics of self-reproduction and intelligence.

“We are at a point of departure in human existence,” Michael Wright argues, one that is being accelerated by exponential growth in technologies. These are technologies that “extend beyond themselves autonomously and that have no cohesive direction or envisioned constraints modeled or employed.”

As these technologies are adopted at ever-faster rates, we will continue to face significant ethical challenges as engineers and scientists.

“Our current ethics are outdated," Wright says in a compelling argument, "because they are based on observations of the obvious.”

(Read Part 3 of this series: "In conversation: A framework for designing for disruptive technologies.")