When Peter Drucker first met IBM CEO Thomas J. Watson in the 1930s, the legendary management thinker and journalist was somewhat baffled. "He began talking about something called data processing," Drucker recalled, "and it made absolutely no sense to me. I took it back and told my editor, and he said that Watson was a nut, and threw the interview away."

Things that change the world always arrive out of context for the simple reason that the world hasn't changed yet. So we always struggle to see how things will look in the future. Visionaries compete for our attention, arguing for their theory of how things will fit together and impact our lives. Billions of dollars are bet on competing claims.

This is especially true today, with artificial intelligence making head-spinning advances. But we also need to ask: What if the future looked exactly like the past? Certainly, there's been no lack of innovation since Drucker met Watson. How did those technologies impact the economy and shape our lives? If we want to know what to expect from the future, that's where we should start.

The First Productivity Paradox

Thomas J. Watson [Photo: IBM]

Watson would, of course, build IBM into an industrial giant. But it was his son Thomas Watson Jr. who would transform the industry in 1964 with the $5 billion gamble (nearly $50 billion in today's dollars) on the System/360, a platform that would dominate the computing world for two decades. It was, essentially, the Apple iPhone and Microsoft Windows of its time, combined.

Just as the elder Watson had foreseen, data processing became central to how industry functioned. In the 1970s and '80s, business investment in computer technology was increasing by more than 20% per year. Yet, strangely, productivity growth was falling. Economists coined the term "productivity paradox" to describe the contradiction.

The productivity paradox dumbfounded economists because it violated a basic principle of how a free market economy is supposed to work. If profit-seeking businesses continue to make substantial investments, you'd expect to see a return. Yet with IT investment in the '70s and '80s, firms continued to increase their spending with negligible measurable benefit.

A paper by researchers at the University of Sheffield in England sheds some light on what happened. First, productivity measures were largely developed for an industrial economy, not an information economy. Second, the value of those investments, while substantial, was a small portion of total capital investment. Third, businesses weren't necessarily investing to improve productivity, but to survive in a more demanding marketplace.

By the late 1990s, however, that began to change. Increased computing power, combined with the rise of the internet, triggered a new productivity boom. Many economists hailed a new economy of increasing returns, in which the old rules no longer applied. The mystery of the productivity paradox, it seemed, had been solved. We just needed to wait for the technology to hit critical mass and deliver us to the promised land.

The Second Productivity Paradox

By the turn of the century, the digital economy was going full steam. While old industrial companies like Exxon Mobil, General Motors, and Walmart still topped the Fortune 500, new economy upstarts like Google, Apple, and Amazon were growing quickly and, after a brief dot-com bust, would challenge the incumbents for dominance. By 2004, things were humming again.
Social media was ramping up, and Tim O'Reilly proclaimed the new era of Web 2.0. A few years later, Apple launched the iPhone and that, combined with the new 4G standard, ushered in the era of the mobile internet. New cloud computing services such as Amazon Web Services and Microsoft Azure would make vast computing power available to anyone with a credit card.

Yet as economist Robert Gordon has pointed out, by 2006 it had become clear that productivity was slumping again and, despite some blips here and there, it hasn't recovered since. For all of the hype coming out of Silicon Valley, we've spent the past 20 years in the midst of a second productivity paradox.

Clearly, things have qualitatively changed over the past two decades. We are no longer tethered to our desks at work. A teenager with a smartphone in a developing country has more access to information today than a professional working at a major institution did back then. It is, to paraphrase Robert Solow's famous quip, as if we can see the digital age everywhere but in the productivity statistics.

Searching For Utopia . . . And Finding So-So Technologies

Business pundits claim that things have never moved faster, but the evidence shows exactly the opposite. In fact, we've been in a productivity slump for over half a century. Data also shows that industries have become more concentrated, not more competitive, over the past 25 years. U.S. corporate profits have roughly tripled as a percentage of GDP in that same time period.

So what gives? The techno-optimists keep promising us some sort of utopia, with a hypercompetitive marketplace yielding productivity gains so vast that our lives will be utterly transformed for the better. But the data says otherwise. How do we reconcile the visions of the Silicon Valley crowd with the hard analysis of the economists?

Some of the same factors behind the first productivity paradox are still at play. According to Statista, the digital economy makes up only about 9% of GDP. An analysis by the Federal Reserve Bank found that while AI is having a huge impact on some tasks, such as computing and math, it's not having much of an effect at all on things like personal services, office and administrative work, and blue-collar labor.

Part of the answer may also lie in what economists Daron Acemoglu and Pascual Restrepo call "so-so technologies," such as self-checkouts in supermarkets, screen ordering at airport bars, and automated customer service systems. These produce meager productivity gains and often put a greater burden on the consumer.

The simple truth is that our economy is vast, and digital technology plays only a limited role in most of it. Next time you're checking your smartphone in traffic, ask yourself: Is your chatbot making your rent any cheaper? Is it getting you through traffic any faster? Or making your trip to the doctor any less expensive?

Innovation Should Serve People, Not The Other Way Around

In his 1954 essay, "The Question Concerning Technology," German philosopher Martin Heidegger described technology as akin to art, in that it reveals truths about the nature of the world, brings them forth, and puts them to some specific use. In the process, human nature and its capacity for good and evil are also revealed. He offers the example of a hydroelectric dam, which uncovers a river's energy and channels it into electricity.
In much the same sense, the breakthrough technologies of today, like the large language models that power our AI chatbots, the forces of entanglement and superposition that drive quantum computing, and technologies like CRISPR and mRNA that fuel tomorrow's miracle cures, were not built so much as they were revealed.

In another essay, "Building Dwelling Thinking," Heidegger explains that what we build for the world depends on how we interpret what it means to live in it. The relationship is, of course, reflexive. What we build depends on how we wish to dwell, and that act, in and of itself, shapes how we build further.

As we go through yet another hype cycle, we need to keep in mind that we're not just building for the future, but also for the present, which will look very much like the past. While it is, of course, possible that we are on the brink of some utopian age in which we unlock so much prosperity that drudgery, poverty, and pain become distant memories, the most likely scenario is that most people will continue to struggle.

The truth is that innovation should serve people, not the other way around. To truly build for the world, you need to understand something about how people live in it. Breakthrough innovation happens when people who understand technical solutions are able to collaborate with people who understand real-world problems. Just like in the past, that's what we need more of now.
Worldwide, an estimated 440 million people were exposed to a wildfire encroaching on their home at some point between 2002 and 2021, new research shows. That's roughly equivalent to the entire population of the European Union, and the number has been steadily rising, up 40% over those two decades.

With intense, destructive fires often in the news, it can seem like more land is burning. And in parts of the world, including western North America, it is. Globally, however, our team of fire researchers also found that the total area burned actually declined by 26% over those two decades.

How is that possible? We found the driving reasons for those changes in Africa, which has the vast majority of all land burned but where the total burned area has been falling. Agricultural activities in Africa are increasingly fragmenting wildland areas that are prone to burning. A cultivated farm field and roads can help stop a fire's spread. But more farms and development in wildland areas also means more people can be exposed to wildfires.

Drawing on our expertise in climate and wildfire sciences and geospatial modeling, we analyzed global wildfire activity over the past two decades. The results highlight some common misperceptions and show how the fire risk to humans is changing.

Global burned area down, intense fires up

Wildfire is a natural process that has existed for as long as vegetation has covered the Earth. Occasional fires in a forest are healthy. They clear out dead wood and leaf and branch litter, leaving less fuel for future fires to burn. That helps to keep wildfires from becoming too intense. However, intense fires can also pose serious threats to human lives, infrastructure, and economies, particularly as more people move into fire-prone areas.

North and South America have both experienced a rise in intense wildfires over the past two decades. Some notable examples include the 2018 Camp Fire in California and the 2023 record-breaking Canadian wildfires, which generated widespread smoke that blanketed large parts of Canada and the eastern United States, and even reached Europe.

The increase in intense wildfires aligns with the intensification of fire weather around the world. Heat, low humidity, and strong winds can make wildfires more likely to spread and harder to control. The number of days conducive to extreme fire behavior and new fire ignitions has increased by more than 50% over the past four decades globally, elevating the odds that the amount of land burned in a particular region sets a new record.

But fire weather is not the only influence on wildfire risk. The amount of dry vegetation, and whether it's in a continuous stretch or broken up, influences fire risk. So do ignition sources, such as vehicles and power lines in wildland areas. Human activities can start fires and fuel climate change, which further dries out the land, amplifying wildfire activity. Fire suppression practices that don't allow low-intensity fires to burn can lead to the accumulation of flammable vegetation, raising the risk of intense fires.

North America is a fraction of total burned area

In recent years, a growing number of wildfire disasters in North America, Europe, and Australia have captured global attention. From the deadly 2025 Los Angeles fires to the devastating 2019-2020 Australian bushfires and the 2018 wildfire in Athens, Greece, flames have increasingly encroached upon human settlements, claiming lives and livelihoods.
However, wildfire exposure isn't limited to these high-profile regions; we simply hear more about them. The United States, Europe, and Australia collectively account for less than 2.5% of global human exposure to wildfire. Human exposure to fire occurs when people's homes fall directly within the area burned by a wildfire.

In stark contrast, Africa alone accounts for approximately 85% of all wildfire exposures and 65% of the global burned area. Remarkably, just five central African countries (the Democratic Republic of Congo, South Sudan, Mozambique, Zambia, and Angola) experience half of all global human exposure to wildfires, even though they account for less than 3% of the global population. These countries receive sufficient moisture to support plant growth, yet they are dry enough that trees and plants burn in frequent fires that in some places occur multiple times per year.

Regional trends and drivers of wildfire

We found that wildfire exposure increased across all continents except Europe and Oceania, but the underlying drivers of the increase varied by region. In Africa, agricultural expansion has led to more people living in fire-prone areas. In North America, particularly the United States, intensifying fire weather (the hot, dry, windy conditions conducive to spreading fires) has led to increasingly uncontrollable wildfires that threaten human settlements. In South America, a combination of rising drought frequency and severity, intensifying heat waves, and agricultural expansion has amplified wildfire intensity and increased the population in fire-prone regions. In Asia, growing populations in fire-prone areas, combined with more days of fire-friendly weather, led to increased human exposure to wildfires. In contrast, Europe and Oceania have seen declining wildfire exposures, largely due to more people moving to cities and fewer living in rural, fire-prone zones.

What to do about it

Communities can take steps to prevent destructive wildfires from spreading. For example, vegetation management, such as prescribed fires, can avoid fueling intense fires. Public education, policy enforcement, and engineering solutions, such as vegetation reduction and clearance along roads and power lines, can help reduce human-caused ignitions. As climate change intensifies fire weather and people continue to move into fire-prone zones, proactive mitigation will be increasingly critical.

Mojtaba Sadegh is an associate professor of civil engineering and a senior fellow at the United Nations University Institute for Water, Environment and Health at Boise State University. John Abatzoglou is a professor of engineering at the University of California, Merced. Seyd Teymoor Seydi is a researcher in remote sensing at Boise State University.

This article is republished from The Conversation under a Creative Commons license. Read the original article.
When it comes to starting a new job, first impressions matter. That's especially true of the impression a company makes on new hires during onboarding. According to new research, onboarding procedures such as welcoming new employees and training them can have long-term impacts. The quality of an employee's onboarding can affect their long-term motivation and even how long they plan to stick around.

A new survey from Software Finder of 1,010 employees who were hired within the last two years found that employees' onboarding experiences varied greatly. While almost half (46%) found onboarding procedures to be welcoming, and about a third (34%) said their onboarding was well-structured, many described the experience in negative terms. Nearly a third (29%) said the process was disorganized, 26% described it as rushed, and 21% called it underwhelming.

Shockingly, only 28% of new hires said the onboarding process prepared them for their role. In fact, two-thirds (67%) of respondents said the procedures didn't accurately represent their responsibilities or the company as a whole.

A bad onboarding experience can affect how long employees want to stay with the company, the survey found. Nearly half (48%) of employees who had a bad onboarding experience said they wanted to leave the company within six months. Employees with positive onboarding experiences felt differently: nearly 4 in 10 (39%) said an effective onboarding actually increased their desire to stick around long term, and over half (55%) said they'd want to stay at the job long term. By comparison, only 10% of new hires with negative onboarding experiences felt similarly.

In fact, 77% of employees who had a positive onboarding experience said they felt more connected to the company afterward. Likewise, 61% of employees said onboarding has an impact on their future work ethic and engagement.

Interestingly, employees seem open to reengagement. Seven out of 10 said they'd favor a re-onboarding experience after their first six months at a new job to help them align more fully with the company. That might be good news for companies that got it wrong the first time around.