Saturday, March 30, 2013

Continue Preparation for the Ever Emergent Knowledge and Creative Economy (Via Agile Adaptability!)


Robots Aren't the Problem: It's Us

Everyone has an opinion about technology. Depending on whom you ask, it will either: a) Liberate us from the drudgery of everyday life, rescue us from disease and hardship, and enable the unimagined flourishing of human civilization; or b) Take away our jobs, leave us broke, purposeless, and miserable, and cause civilization as we know it to collapse.
The first strand of thinking reflects "techno-utopianism"—the conviction that technology paves a clear and unyielding path to progress and the good life. George F. Gilder's 2000 book Telecosm envisions a radiant future of unlimited bandwidth in which "liberated from hierarchies that often waste their time and talents, people will be able to discover their most productive roles." Wired's Kevin Kelly believes that, although robots will take away our jobs, they will also "help us discover new jobs for ourselves, new tasks that expand who we are. They will let us focus on becoming more human than we were."
The technology critic Evgeny Morozov dubs today's brand of technology utopianism "solutionism," a deep, insidious kind of technological determinism in which issues can be minimized by supposed technological fixes (an extreme example he gives: "smart" contact lenses that edit the homeless out of view). We latch on to such fixes because they enable us to displace our anxieties about our real-world distress, the New Yorker staff writer George Packer explains: "When things don't work in the realm of stuff, people turn to the realm of bits." Morozov points to a future in which dictators and governments increasingly use technology (and robots) to watch over us; Packer worries about "the politics of dissolution," the way information technology erodes longstanding identities and atomizes us.
On the other side stand the growing ranks of "techno-pessimists." Some say that technology's influence is greatly overstated, seeing instead a petering out of innovation and its productive forces. According to the George Mason University economist Tyler Cowen, for example, America and other advanced nations are entering a prolonged "great stagnation," in which the low-hanging fruit of technological advance has largely been exhausted and the rates of innovation and economic growth have slowed. Robert J. Gordon, an economist at Northwestern University, adds statistical ammunition to this argument in his much-talked-about paper, "Is U.S. Economic Growth Over?" Computers and biotechnology have advanced at a phenomenal clip, he demonstrates, but they have created only a short-lived revival of growth. Today's innovations do not have the kind of world-shaking impact that the invention of modern plumbing or the introduction of self-propelled vehicles did (they're "pipsqueaks" by comparison)—and they are more likely to eliminate than to add jobs.
Another techno-dystopian strand sees the "rise of the robots" as a threat not just to blue-collar jobs but also to knowledge work. "To put it bluntly, it seems that high-skill occupations can be mechanised and outsourced in much the same way as car manufacturing and personal finance," Tom Campbell, a novelist and consultant in the creative sector, blogs, pointing to commercial software that already analyzes legal contracts or diagnoses disease.
The dustbin of history is littered with dire predictions about the effects of technology. They frequently come to the fore in periods in which economies and societies are in the throes of sweeping transformation—like today.
During the upheaval of the Great Depression, the late Harvard University economist Alvin Hansen, often called the "American Keynes," said that our economy had exhausted its productive forces and was doomed to a fate of secular stagnation in which the government would be constantly called upon to stoke demand to keep it moving. Of course we now know from the detailed historical research of Alexander J. Field that the 1930s were, in the title of his 2008 paper, "The Most Technologically Progressive Decade of the Century," when technological growth outpaced the high-tech innovations of the 1980s, 1990s, and 2000s.
As the late economist of innovation Christopher Freeman long ago argued, innovation slows down during the highly speculative times leading up to great economic crises, only to surge forward as the crisis turns toward recovery. While data are scanty so early into our current recovery cycle, a new, detailed report from the Brookings Institution shows a considerable uptick in patented innovations over the last couple of years.
More than 100 years ago, during an earlier depression, H.G. Wells's The Time Machine imagined a distant future when humanity had degenerated into two separate species—the dismal Morlocks, the descendants of the working class, who lived underground and manned the machines, and the ethereal Eloi, their former masters, who had devolved to a state of abject dependency. A little more than half a century later, Kurt Vonnegut's Player Piano depicted a world in which "any man who cannot support himself by doing a job better than a machine" is shipped off to the military or assigned to do menial work under the auspices of the government.
This either-or dualism misses the point, for two reasons.
The obvious one is the simple fact that technology cuts both ways. In their influential book Race Against the Machine, Erik Brynjolfsson and Andrew McAfee, both at the Massachusetts Institute of Technology, point out how technology eliminates some jobs but upgrades others. Similarly, Scott Winship, an economist with Brookings, recently noted in an article in Forbes that "technological development will surely eliminate some specific jobs." But the productivity gains from those developments, he added, "will lower the cost of goods and produce more discretionary income, which people will use to pay other people to do things for them, creating new jobs."
What economists dub "skill-biased technical change" is, in fact, causing both the elimination of formerly good-paying manufacturing jobs and the creation of high-paying new jobs. As a result, work is being bifurcated—into high-pay, high-skill knowledge jobs and low-pay, low-skill service jobs.
The second and more fundamental problem with the debate between utopians and dystopians is that technology, while important, is not deterministic. As the great theorists of technology, economic growth, and social development Karl Marx and Joseph Schumpeter argued—and modern students of technological innovation have documented—technology is embedded in the larger social and economic structures, class relationships, and institutions that we create. All the way back in 1858, in Grundrisse, Marx noted: "Nature builds no machines, no locomotives, railways, electric telegraphs, self-acting mules, etc. These are products of human industry." Technological innovation, he went on, "indicates to what degree general social knowledge has become a direct force of production, and to what degree, hence, the conditions of the process of social life itself have come under the control of the general intellect and been transformed in accordance with it."
In his landmark 1990 book on economic progress from classical antiquity to the present, The Lever of Riches, the economic historian Joel Mokyr also distinguishes homo economicus, "who makes the most of what nature permits him to have," from the Promethean homo creativus, who "rebels against nature's dictates." He places emphasis, like Schumpeter perhaps, on human beings' underlying creative ability to mold technology by building institutions, forging social compacts, making work better, building societies. Technology does not force us into a preordained path but enables us, or, more to the point, forces us to make choices about what we want our future to be like.
We do not live in the world of The Matrix or the Terminator movies, where the machines are calling the shots. When all is said and done, human beings are technology's creators, not its passive objects. Our key tasks during economic and social transformations are to build new institutions and new social structures and to create and put into effect public policies that leverage technology to improve our jobs, strengthen our economy and society, and generate broader shared prosperity.
Our current period is less defined by either the "end of technology" or the "rise of robots" than by deep and fundamental transformations of our economy, society, and class structures. The kinds of work that Americans do have changed radically over the course of the last two centuries, particularly during major economic crises, like the Panic and Depression of 1873; the Great Depression of the 1930s; the Crash of 2008. Each shift has been hugely disruptive, eliminating previously dominant forms of employment and work, while generating entirely new ones.
In 1800 more than 40 percent of American workers made their livings in farming, fishing, or forestry, while less than 20 percent worked in manufacturing, transportation, and the like. By 1870, the share of workers engaged in those agricultural jobs had dropped to just 10 percent; during those same decades, the ranks of blue-collar manufacturing workers had risen to more than 60 percent.
That was not a smooth change, to say the very least. Rural people feared—often rightly—that their friends and family who were moving to the cities were dooming themselves to immiseration and brutal exploitation, working 16-hour days for subsistence wages. When labor began to organize for better conditions, management hit back hard—in some cases unleashing armed Pinkertons on strikers. The Panic of 1873 and the Long Depression that followed it began as a banking crisis precipitated by insolvent mortgages and complex speculative instruments, and it brought the entire economy to a virtual standstill. But the technological advances perfected and put into place during that decade of economic stagnation—everything from telephones to streetcars—created the powerhouse industrial cities that underpinned a vast industrial expansion.
The battles, and the terrible working conditions, continued well into the 1930s, when my father went to work in a Newark, N.J., factory at age 13. Nine people in his family had to work—both parents, both grandparents, and several siblings—to make one family wage. The Industrial Revolution had been going on for more than a century before a new social compact was forged—a product of worker militancy, enlightened self-interest on the part of owners and management, and pressure from the government—that brought safety, dignity, and security to blue-collar work. It was this compact that buttressed the great age of productivity in the post-World War II era. When he returned from the war, my father's job in the very same factory he had previously worked in had been transformed into a good, high-paying occupation, the kind we pine for today, which enabled him to buy a home and support a family.
But beginning around 1950, when Kurt Vonnegut was working for General Electric and writing Player Piano, the share of working-class jobs began to fall precipitously. It wasn't just automation that was doing it—our whole economy was shifting again, and our society was changing with it. There was the civil-rights movement and later the anti-war youth movement, feminism, and gay rights. People began to rebel against the enforced conformity of corporate life. A new ethos was bubbling up, in Haight-Ashbury and Woodstock through music and art and fashion, and in Silicon Valley with computers and high tech. Some economists began to talk about how the industrial economy was transitioning to a service economy; others, like the sociologist Daniel Bell, saw the rise of a postindustrial economy powered by science, technology, and a new technocratic elite. The pioneering theorist Peter Drucker dubbed it a "knowledge economy."
Almost a decade ago, in my book The Rise of the Creative Class, I called it a "creative economy," because creativity, not knowledge, has become the fundamental factor of production. Our economy uses technology, but it is not principally powered by it. Its motive force is creativity. Economic and social progress result from the interweaving of several distinctive, related strands of creativity: innovative or technological creativity, entrepreneurship or economic creativity, and civic or artistic creativity.
The key organizing unit of the postindustrial creative economy is no longer the factory or the giant corporation. It is our communities and our cities. Cities are the organizing or pivot point for creativity, its great containers and connectors. Unlike the services we produce, the technologies we create, or the knowledge and information that is poured into our heads, creativity is an attribute we all share. It is innate in every human being. But it is also social, it lives among us: We make each other creative. With their dense social networks, cities push people together and increase the kinetic energy among them. If the powerhouse cities of the industrial era depended on their locations near natural resources or transportation centers, our great cities today turn on the people who live in them—they are where we combine and recombine our talents to generate new ideas and innovations.
Like the Industrial Revolution, the rise of the knowledge-driven, creative economy has transformed the composition of the work force, with harrowing consequences. The picture is brutally clear: Working-class employment has declined by 50 percent in the last half century. Blue-collar workers made up 40 percent of the work force in 1980; they are just 20 percent of the work force today. In just the one decade between 2000 and 2010, the United States shed more than 5.7 million production jobs.
As the working class, like the agricultural class before it, has faded, two new socioeconomic classes have arisen: the creative class (40 million strong in the United States, roughly a third of the work force) and the even larger service class (60 million strong and growing, about 45 percent of the work force). If the creative class is growing, the service class is growing even faster. Last year the U.S. Bureau of Labor Statistics published a list of the fastest-growing occupational categories in the United States, projected out to 2020. Most of the top 10 were in the service sector. The two fastest-growing jobs, which are expected to grow by roughly 70 percent by 2020, were personal-care aides and home health aides. The former, which pays a median of just $19,640, will add more than 600,000 jobs; the latter, which pays $20,560, will grow by more than 700,000 jobs. There was only one clearly creative-class job in the top 10—biomedical engineer (an $81,540-a-year job).
Our current economic circumstance is not simply the product of faceless technology; it is also informed and structured by socioeconomic class. The creative class is highly skilled and educated; it is also well paid. Creative-class jobs average more than $70,000 in wages and salaries; some pay much more. Service-class jobs in contrast average just $29,000. The service class makes up 45 percent of the work force but earns just a third of wages and salaries in the United States; the creative class accounts for just a third of employment but earns roughly half the wages and salaries.
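As a quick sanity check of how those rounded shares hang together, here is a minimal back-of-the-envelope calculation (a sketch in Python using only the rounded figures quoted above, so the result is approximate and is not Florida's own computation):

    # Rounded shares quoted above: the creative class is about a third of
    # employment and earns roughly half of all wages; the service class is
    # about 45 percent of employment and earns roughly a third of all wages.
    creative_emp, creative_wages = 1 / 3, 1 / 2
    service_emp, service_wages = 0.45, 1 / 3

    # Each class's average wage relative to the overall average wage.
    creative_rel = creative_wages / creative_emp   # ~1.5x the overall average
    service_rel = service_wages / service_emp      # ~0.74x the overall average

    print(f"creative-to-service average wage ratio: {creative_rel / service_rel:.1f}x")
    # ~2.0x, broadly in line with the quoted averages of $70,000+ versus $29,000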
The divide goes even deeper. Add the ranks of the unemployed, the displaced, and the disconnected to those tens of millions of low-wage service workers, and the population of postindustrialism's left-behinds surges to as many as two-thirds of all Americans. That produces a much larger, and perhaps more permanent, version of the economic, social, and cultural underclass that Michael Harrington long ago dubbed "the other America." Only this time, it's a clear majority.
The effects of class extend far beyond our work and incomes to virtually every facet of our social lives. One class is not only wealthier and better educated than the other; its members are also healthier and happier, live in places with better services and resources of all sorts, and pass their advantages on to their children.
To blame technology for all this is to miss the point. Instead of looking at technology as a simple artifact that imposes its will on us, we should look at how it affects our social and economic arrangements—and how we have failed to adapt them to our circumstances.
If nearly half the jobs that our economy is creating are low paid and unskilled and roughly two-thirds of our population is being left behind, then we need to create new and better social and economic structures that improve those jobs. That means more than just raising wages (though that has to be done), but actively and deliberately improving jobs. We did it before with factory jobs, like my father's.
We have to do it again, this time with low-wage, low-skill service work. That isn't charity or an entitlement—it's tapping workers' intelligence and capabilities as a source of innovation and productivity improvements.
My own research, and that of others, has identified two sets of skills that increase pay and improve work. Cognitive skills have to do with intelligence and knowledge; social skills involve the ability to mobilize resources, manage teams, and create value. These skills literally define high-wage knowledge work: When you add more of them to that work, wages go up. But here's the thing: When those skills are added to service work, wages increase at a steeper rate than they do in creative jobs.
Paying workers better also offers substantial benefits to the companies that employ them and to the economy writ large. While that may seem counterintuitive, detailed academic research backs it up. Zeynep Ton of MIT's Sloan School of Management argues that the notion that keeping wages low is the only way to achieve low prices and high profits is badly mistaken: "The problem with this very common view is that it assumes that an employee working at a low-cost retailer can't be any more productive than he or she currently is. It's mindless work so it doesn't matter who does it. If that were true, then it really wouldn't make any sense to pay retail workers any more than the least you can get away with."
In a study published in the Harvard Business Review, Ton finds that the retail companies that invest the most in their lowest paid workers "also have the lowest prices in their industries, solid financial performance, and better customer service than their competitors." As she has pointed out, the companies and jobs provide a powerful model that can be extended to other service-based jobs like those in hospitals, restaurants, banks, and hotels. Upgrading service jobs in this way, she says, "could help provide the kind of economic boost the economy needs."
We can't simply write off the tens of millions of workers who toil in dead-end service jobs, or the millions more who are unemployed and underemployed. The key to a broadly shared prosperity lies in new social and economic arrangements that more fully engage, not ignore and waste, the creative talents of all of our people.
Just as we forged a new social compact in the 1930s, 40s, and 50s that saw manufacturing workers as a source of productivity improvements and raised their wages to create a broad middle class to power growth, we need a new social compact—a Creative Compact—that extends the advantages of our emergent knowledge and creative economy to a much broader range of workers. Every job must be "creatified"; we must harness the creativity of every single human being.
I'm optimistic, even in the face of deep economic, social, and political troubles, because the logic of our future economic development turns on the further development and engagement of human creativity.
As in the past, it won't be technology that defines our economic future. It will be our ability to mold it to our needs.
Richard Florida is director of the Martin Prosperity Institute at the University of Toronto's Rotman School of Management and Global Research Professor at New York University.

    Post Industrial Revolution & Robotics (Prepare Everyone for Mankind's Mastery of Machines & Competitiveness)


    The New Industrial Revolution

    A coming wave of robots could redefine our jobs. Will that redefine us?

    Baxter is a new type of worker, who is having no trouble getting a job these days, even in a tight economy. He's a little slow, but he's easy to train. And companies don't hire him, they buy him—he even comes with a warranty.
    Baxter is a robot, not a human, though human workers in all kinds of industries may soon call him a colleague. His plastic-and-metal body consists of two arms loaded with sensors to keep his lifeless limbs from accidentally knocking over anyone nearby. And he has a simulated face, displayed on a flat-panel computer monitor, so he can give a frown if he's vexed or show a bored look if he's waiting to be given more to do.
    Baxter is part of a new generation of machines that are changing the labor market worldwide—and raising a new round of debate about the meaning of work itself. This robot comes at a price so low—starting at just $22,000—that even businesses that never thought of replacing people with machines may find that prospect irresistible. It's the brainchild of Rodney Brooks, who also designed the Roomba robot vacuum cleaner, which succeeded in bringing at least a little bit of robotics into millions of homes. One computer scientist predicts that robots like Baxter will soon toil in fast-food restaurants topping pizzas, at bakeries sliding dough into hot ovens, and at a variety of other service-sector jobs, in addition to factories.
    I wanted to meet this worker of the future and his robot siblings, so I spent a day at this year's Automate trade show here, where Baxter was one of hundreds of new commercial robots on display. Simply by guiding his hands and pressing a few buttons, I programmed him to put objects in boxes; I played blackjack against another robot that had been temporarily programmed to deal cards to show off its dexterity; and I watched demonstration robots play flawless games of billiards on toy-sized tables. (It turns out that robots are not only better at many professional jobs than humans are, but they can best us in our hobbies, too.)
    During a keynote speech to kick off the trade show, Henrik Christensen, director of robotics at Georgia Tech, outlined a vision of a near future when we'll see robots and autonomous devices everywhere, working side by side with humans and taking on a surprisingly diverse set of roles. Robots will load and unload packages from delivery trucks without human assistance—as one company's system demonstrated during the event. Robots will even drive the trucks and fly the cargo planes with our packages, Christensen predicted, noting that Google has already demonstrated its driverless car, and that the same technology that powers military drones can just as well fly a FedEx jet. "We'll see coast-to-coast package delivery with drones without having a pilot in the vehicle," he asserted.
    Away from the futuristic trade floor, though, a public discussion is growing about whether robots like Baxter and other new automation technologies are taking too many jobs. Similar concerns have cropped up repeatedly for centuries: when combines first arrived on farms, when the first machines hit factory assembly lines, when computers first entered businesses. A folk tune from the 1950s called "The Automation Song" could well be sung today: "Now you've got new machines for to take my place, and you tell me it's not mine to share." Yet new jobs have always seemed to emerge to fill the gaps left by positions lost to mechanization. There may be few secretaries today, but there are legions of social-media managers and other new professional categories created by digital technology.
    Still, what if this time is different? What if we're nearing an inflection point where automation is so cheap and efficient that human workers are simply outmatched? What if machines are now leading to a net loss of jobs rather than a net gain? Two professors at the Massachusetts Institute of Technology, Andrew McAfee and Erik Brynjolfsson, raised that concern in Race Against the Machine: How the Digital Revolution Is Accelerating Innovation, Driving Productivity, and Irreversibly Transforming Employment and the Economy (Digital Frontier Press, 2011). A recent report on 60 Minutes featured the book's thesis and quoted critics concerned about the potential economic crisis caused by robots, despite the cute faces on their monitors.
    But robots raise an even bigger question than how many jobs are left over for humans. A number of scholars are now arguing that all this automation could make many goods and services so cheap that a full-time job could become optional for most people. Baxter, then, would become a liberator of the human spirit rather than an enemy of the working man.
    That utopian dream would require resetting the role work plays in our lives. If our destiny is to be freed from toil by robot helpers, what are we supposed to do with our days?
    To begin to tackle that existential question, I decided to invite along a scholar of work to the Automate trade show. And that's how my guest, Burton J. Bledstein, an expert on the history of professionalism and the growth of the modern middle class, got into an argument with the head of a robotics company.
    It happened at the booth for Adept Technology Inc., which makes a robot designed to roam the halls of hospitals and other facilities making deliveries. The latest model—a foot-tall rolling platform that can be customized for a variety of tasks—wandered around the booth, resembling something out of a Star Wars film except that it occasionally blasted techno music from its speakers. Bledstein was immediately wary of the contraption. The professor, who holds an emeritus position at the University of Illinois at Chicago, explained that he has an artificial hip and didn't want the robot to accidentally knock him down. He needn't have worried, though; the robot is designed to sense nearby objects and keep a safe distance.
    The company's then-CEO, John Dulchinos, assured us that on the whole, robots aren't taking jobs—they're simply making life better for human employees by eliminating the most-tedious tasks. "I can show you some very clear examples where this product is offloading tasks from a nurse that was walking five miles a day to allow her to be able to spend time with patients," he said, as the robot tirelessly circled our feet. "I think you see that in a lot of the applications we're doing, where the mundane task is done by a robot which has very simple capability, and it frees up people to do more-elaborate and more-sophisticated tasks."
    The CEO defended the broader trend of companies' embracing automation, especially in factory settings where human workers have long held what he called unfulfilling jobs, like wrapping chicken all day. "They look like zombies when they walk out of that factory," he said of such workers. "It is a mind-numbing, mundane task. There is absolutely no satisfaction from what they do."
    "That's your perception," countered Bledstein. "A lot of these are unskilled people. A lot of immigrants are in these jobs. They see it as work. They appreciate the paycheck. The numbness of the work is not something that surprises them or disturbs them."
    "I guess we could just turn the clock back to 1900, and we can all be farmers," retorted Dulchinos.
    But what about those displaced workers who can't find alternatives, asked Bledstein, arguing that automation is happening not just in factories but also in clerical and other middle-class professions changed by computer technology. "That's kind of creating a crisis today. Especially if those people are over 50, those people are having a lot of trouble finding new work." The professor added that he worried about his undergraduate students, too, and the tough job market they face. "It might be a lost generation, it's so bad."
    Dulchinos acknowledged that some workers are struggling during what he sees as a transitional period, but he argued that the solution is more technology and innovation, not less, to get to a new equilibrium even faster.
    This went on for a while, and it boiled down to competing conceptions of what it means to have a job. In Bledstein's seminal book, The Culture of Professionalism, first published in 1976, he argues that Americans, in particular, have come to define their work as more than just a series of tasks that could be commodified. Bledstein tracks a history of how, in sector after sector, middle-class workers sought to elevate the meaning of their jobs, whether they worked as athletes, surgeons, or funeral directors: "The professional importance of an occupation was exaggerated when the ordinary coffin became a 'casket,' the sealed repository of a precious object; when a decaying corpse became a 'patient' prepared in an 'operating room' by an 'embalming surgeon' and visited in a 'funeral home' before being laid to rest in a 'memorial park.'"
    The American dream involves more than just accumulating wealth, the historian argues. It's about developing a sense of personal value by connecting work to a broader social mission, rather than as "a mechanical job, befitting of lowly manual laborer."
    Today, though, "there's disillusionment with professions," Bledstein told me, noting that the logic of efficiency is often valued more than the quality of service. "Commercialism has just taken over everywhere." He complained that in their rush to reduce production costs, some business leaders are forgetting that even manual laborers have skills and knowledge that can be tough to simulate by machine. "They want to talk about them as if these people are just drones," he said as we took a break in the back of the exhibit hall, the whir of robot motors almost drowning out our voices. "Don't minimize the extent of what quote-unquote manual workers do—even ditch diggers."
    In Genesis, God sentences Adam and Eve to hard labor as part of the punishment for the apple incident. "Cursed is the ground because of you; through painful toil you will eat food from it all the days of your life" was the sentence handed down in the Garden of Eden. Yet Martin Luther argued, as have other prominent Christian leaders since, that work is also a way to connect with the divine.
    People's relationship to work has been complex from the start, and its cultural resonance has shifted over time. Today many people's identities are tied up in their jobs. "Beyond mere survival, we create ourselves in our work," writes Al Gini, a professor of business ethics at Loyola University Chicago, in his 2001 book, My Job, My Self.
    But Gini points to earlier periods when attitudes were quite different. The ancient Greeks, for instance, used slaves for most labor and "regarded work as a curse, a drudgery, and an activity to be conducted with a heavy heart." Their view, he writes, was that "work by its very nature inhibited the use of reason and thereby impeded the search for the ultimate ends of life."
    Aristotle never worked a day in his life.
    Burton Bledstein, a historian of professionalism and the middle class, at the Automate trade show
    Today Jeremy Rifkin is among those who make a case for what he calls "rethinking work." Rifkin, president of the Foundation on Economic Trends and a senior lecturer at the University of Pennsylvania's Wharton School of Business, is best known for his 1995 best seller, The End of Work. In his most recent book, The Third Industrial Revolution, he says that a reshaping of society made possible by a variety of trends, including automation systems and green technology, could leave people more time for what he calls "deep play."
    He imagines robots' making manufacturing so cheap and efficient that most people will simply be able to work less to meet their basic needs. He says we will then be free to start new kinds of nonprofit activities that link us with other people in new ways, helping us lead more-fulfilling lives.
    "Why is it that being a productive worker is the highest value of being alive on this planet?" Rifkin asks. "The real mission of the human race is to learn how to begin to integrate ourselves into a single biosphere," he says, arguing that the Internet can bring about a true global village.
    "What we have to come to grips with now is that the most productive and efficient human being is not going to be as productive and efficient in a physical or intellectual way as the automated technology that's coming," he says.
    Work won't go away completely, in his view, but the workweeks for many will greatly decrease. "The average work day in forager or hunter-gatherer society is three to four hours—the rest is leisure or play," he says. In the robot age, "I think a five-to-six-hour day makes sense."
    Frithjof Bergmann, an emeritus professor of philosophy at the University of Michigan at Ann Arbor, goes further in his proposals for a radical restructuring of society that would bring about what he calls a "New Work system."
    He envisions a social structure in which large-scale manufacturing plants disappear, replaced by a series of neighborhood centers with advanced 3-D printers that can make a variety of goods on demand. People would spend part of their week doing self-service tasks to maintain their own lives—like homemade manufacturing and urban gardening—and spend a couple of days a week at what he calls a "Paid Calling," some task uniquely suited to each worker. That way "the impulse for the work arises from within me" and comes "from the very heart and core of my soul," as he put it in an essay, published in 2000, titled "Ecology and New Work."
    Bergmann runs an organization in Flint, Mich., called the Center for New Work, to advance this vision, and he argues that the current economic recession provides an opportunity to phase in his ideas, some of which he has been promoting since the 1980s. "I spent just now two weeks teaching people in Detroit how to make the best possible use of 3-D printers," he told me. "You can use fabricators like you're already using urban gardening, so you do have the opportunity to spend much more time than you had in the past to do things that are to your taste."
    Even some of the roboticists here at the Automate show believe that their inventions could lead to a rebooted work environment. One of them is Gary R. Bradski, a machine-vision scientist on leave from Stanford University to help start a company called Industrial Perception—the one demonstrating a robot that can unload boxes from a delivery truck without human assistance.
    "You're going to see in the next five to 10 years a significant increase in automation and robotics within the health-care space,"
    Bradski said he could imagine a world in which everyone owns shares of manufacturing companies where almost all of the work is done by robots, with those shares providing a "baseline" income to all. Those who want finer things or experiences could do extra work—by inventing or designing things. He notes that teenagers without jobs have no trouble filling their days, and that people could spend time with "storytelling and play and coming up with new ideas that some 3-D printer can implement."
    Utopian visions of machines eliminating the need for work date back to the earliest days of labor-saving devices, said Edward Granter, a lecturer at the University of Manchester's business school. "People have been writing about utopias like that since the time of St. Thomas More," he said, referring to the Renaissance social philosopher who made the term famous. Granter published a book in 2009, Critical Social Theory and the End of Work, that tracks the history of such utopian ideas. During the 1930s, some experts even interpreted the Great Depression as an "indication that technology was at the stage where people were being permanently eliminated from the production process," and some saw the prospect of a more leisured future as "replete with a certain promise," he writes.
    In an interview, Granter praised the latest versions of these ideas and noted that such visions are helpful reminders that the idea of work could be different. But he said that if history is a guide, we're unlikely ever to be freed from working.
    What is most surprising about the latest round of automation technology is that it is affecting not just working-class jobs but desk jobs as well, he said. Software that helps in legal research, or "document discovery," is replacing some lawyers, for instance, and plenty of other information workers, including tax preparers and copy editors, are at risk of being elbowed out by computer programs that can do part of their jobs. One researcher has even developed a software program that writes books automatically, drawing on facts posted in public-domain resources on the Internet.
    "We were supposed to be the elite," Granter said. "But information workers became even more precarious than industrial workers."
    What do the people who work with robots like Baxter think of their new co-workers? I called up a hospital that bought one of the Adept robot couriers to find out.
    "At first, when we were trialing the robot, there was a bit of resistance," said Jeremy Angell, coordinator of support services at CentraCare Health System's St. Cloud Hospital, in Minnesota. Angell supervises a robot courier named Rocky, who is custom-made to hold several vials in carefully marked slots, and whose job is to carry those specimens from nurses to lab technicians and back again. Some lab technicians worried that it would be cumbersome to figure out which sample was which when this rolling shelf pulled up.
    The assistants who had previously made the deliveries liked Rocky from the start, though. Carrying specimens around had been a hassle that left less time to do other tasks, like responding to phone requests from nurses and other hospitals that use the lab.
    Angell said no one at the hospital had lost a job because of Rocky. But the robot allowed the laboratory to handle more work without hiring the two full-time assistants that had previously been planned. "We did not have to bring in someone to do a menial task," he said.
    One of the laboratory assistants, Lynn Balaski, explained that she uses Rocky only during the busiest times, and that when things are slow she still prefers to hand-deliver the samples. "He's there when I need him, which isn't all the time," she said.
    Lab workers jokingly pretend that Rocky is more than just plastic and programming, and find themselves responding playfully to his preprogrammed jokes or comments about the weather—all recorded by Angell. But the robot's comic timing is so bad that the sheer ineptness makes Balaski laugh.
    The recent federal health-care-reform law has led more hospitals to consider bringing in these kinds of courier robots, said Sandy Agnos, a product manager for Swisslog Healthcare Solutions, which helped customize Rocky. "You're going to see in the next five to 10 years a significant increase in automation and robotics within the health-care space," she said. "You have hospitals that are being forced to cut staff, and then you have constrained resources where people have to multitask."
    Warehouses are the front lines of human-robot relations, though. Amazon, the online-retailing giant, has been a high-profile adopter of automation technology, bringing in fleets of sophisticated rolling robots to carry shelves from a storage area to "pick workers," who take what the robots bring and drop those items into boxes. The company declined my request to interview one of the humans who work with so many robotic colleagues. But another company that operates warehouses using the same robots connected me to a manager in its facility in Devens, Mass.
    "Out on the floor, we've kind of just become used to having them here," said the manager, Brian Lemerise, a senior director for the company, called Quiet Logistics. He said the robots, made by Kiva Systems, eliminate the need for humans to walk miles a day fetching items in a storage area the size of two football fields. That trekking was "a non-value-added use of time," he said. One robot can do the work of one and a half people, and because the company can afford more of them than it could human workers, packages ship faster.
    Lemerise said that because the robots had been around since the company's beginnings, about four years ago, employees see them as helping make their jobs possible rather than as threatening them.
    Ana Santana, a 27-year-old pick worker, said that she previously worked in a warehouse where she had to walk to items herself, and that she preferred leaving that part to the robots.
    "I feel like somebody's helping me," she said. Now she goes to the gym to get her exercise. "I know I can do half an hour and I'm done," she added with a laugh.
    The rolling robots are also much quieter than the system of conveyor belts that moves items around other warehouses, she said, so she can talk with two other human co-workers at stations near hers.
    The robotic system constantly adjusts the pace at which it brings items to the human pick workers, always making sure to have about 200 seconds' worth of work on deck, no more, no less. That means if a worker slows down, the robot sends less work over. Some workers try to see if they can outrun their mechanical partners, said Lemerise.
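    To make that pacing rule concrete, here is a minimal sketch of such a feedback loop (written in Python; the class names, the time estimates, and the way the 200-second target is applied are illustrative assumptions based only on the description above, not Kiva's actual software):

        from collections import deque
        from dataclasses import dataclass, field

        TARGET_SECONDS = 200  # keep roughly 200 seconds' worth of work on deck

        @dataclass
        class Tote:
            estimated_pick_seconds: float  # rough time a picker needs for this item

        @dataclass
        class Station:
            pending: deque = field(default_factory=deque)

            def backlog_seconds(self) -> float:
                # How much work is already waiting at this pick station.
                return sum(t.estimated_pick_seconds for t in self.pending)

        def dispatch(stations, incoming):
            # Top up each station until about 200 seconds of work is queued.
            # If a worker slows down, the backlog stays full and nothing new is sent;
            # if a worker speeds up, the backlog drains and more totes are routed over.
            for station in stations:
                while incoming and station.backlog_seconds() < TARGET_SECONDS:
                    station.pending.append(incoming.popleft())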
    Santana said she had no fear that robots could eventually replace her. "Humans need to be involved in orders," she said. "The robots cannot pack the orders, cannot pick them. They just make our jobs easier."
    One reason for all the fuss about Baxter and Rocky taking jobs may be a longstanding tendency to personify robots.
    "With robots, it feels a little more like it's replacing a person," said Benjamin F. Jones, an associate professor at Northwestern University's business school who specializes in innovation. "A robot is one-to-one, almost. But one combine harvester is probably replacing 100 people."
    Still, the question of whether robots are helping or hurting the work force has become a serious policy issue. Georgia Tech's Christensen, the keynote speaker at the trade show and a leading pro-robot spokesman, has argued to the Obama administration that new robot workers can help bring back manufacturing jobs to the United States that have moved overseas. Administration officials were skeptical at first, he acknowledged: "You're about killing jobs, why would we talk to you," he remembered being told. But he said they "got convinced," and he pointed to a recent decision by Apple to move more production of its computers to the United States because automation made it cheap enough. The professor recently helped update a white paper sponsored by the National Science Foundation laying out a "National Robotics Roadmap" for the country.
    And Jeremy Rifkin, who writes about moving to an era of "deep play," is an adviser to the European Union.
    Bledstein said he may write something more about automation and how it has changed the middle class, and he mentioned that he would continue to teach and do research as long as he can. He wants to keep working. He thinks every professional does, as long as the work is meaningful. "People I know who have really retired, they have really deteriorated quickly," he said. "Work is far more than just a practical category. It's fundamental. We need work."
    By the end of a day at the Automate trade show, my feet were tired, and I was coming down with a cold. As I trudged out, I was struck by how steady and relentless the robots on display appeared, with some moving as many as 300 objects per minute in an endless loop. They weren't going to stop unless someone hit the off switch.
    Jeffrey R. Young is a senior editor at The Chronicle.

    BOREDOM (The Pregnant Pause for Creativity!)


    The iPhone Killed My Creativity

    I love my iPhone. I take it with me everywhere. But I am starting to fear it may be killing my creativity. 
    Numerous studies and much accepted wisdom suggest that time spent doing nothing, being bored, is beneficial for sparking and sustaining creativity. With our iPhone in hand - or any smartphone, really - our minds, always engaged, always fixed on that tiny screen, may simply never get bored. And our creativity suffers.
    Peter Toohey, author of Boredom: A Lively History, told the New York Times that boredom is the experience of "wanting to, but being unable to engage in satisfying activity." No wonder those of us with smartphones are able to avoid boredom so easily. We can always engage in some satisfying activity, no matter how trivial - snap a picture of our meal, play a quick game of Angry Birds, check in on Foursquare or leave a tip.
    We may be helpless, despite knowing the deleterious effects of these devices. Consider that Apple's latest marketing campaign perfectly captures the breadth of functions and fun the iPhone readily delivers to its millions of users. There is so much anyone can do with this magical device, so simply, so quickly, from any place, at any time. The problem is that this may not be a good thing. At least, not always.  

    Mobile Blinders

    Eradicating boredom and banishing downtime has its upside, of course. In a recent Bloomberg report, major advertisers, including Coca-Cola and Hearst, expressed their concern over lowered "impulse" sales at the grocery store checkout aisles. This is not at all surprising. Staring into their smartphones, with their "mobile blinders" on, people are less inclined to buy gum, candy or those trashy magazines.
    "For years, publishers could count on bored shoppers waiting in the checkout line to pick up a magazine, get engrossed in an article, and toss it into their cart alongside the milk and eggs. Then came 'mobile blinders.' These days, consumers are more likely to send a quick text and check their Facebook feed than to read a magazine or develop a momentary craving for the gum or candy on display." 
    Awesome. Score one for the smartphone!
    But this victory comes at a cost. Spending so much time texting and updating, tweeting and watching, calling and playing at every free moment, from every location, never alone with our thoughts, never allowing our thoughts to drift, impacts our creativity, which in turn can limit our full potential.
    Edward de Bono, a business consultant and the self-described "father of lateral thinking," has authored numerous works on creative thinking. De Bono calls moments of boredom "creative pauses," which allow the mind to drift and open the person to new forms of input and understanding.
    Boredom may be even more important for children than adults. Spending so much time on gadgets may "short circuit the development of creative capacity" in children, according to educational expert Dr. Teresa Belton. Other education experts similarly suggest that a child's imagination and creativity are ultimately aided through bouts of boredom.
    Earlier this year, Science Omega examined the benefits of boredom.
    "Psychologists from the University of Central Lancashire (UCLan) have conducted research into the potential upsides of boredom and found that the time we spend daydreaming could improve our creative ability."
    The lead researcher on the UCLan study, Dr. Sandi Mann, emphasized boredom's role in society:
    "I do strongly believe that we shouldn’t be afraid of boredom and that we all – adults, children, workers, non-workers – need a little bit of boredom in our lives. Of course I’m not saying we should make people attend boring meetings for the sake of it, but allowing staff downtime where they can daydream and let their minds wander could possibly lead to benefits for an organisation."

    Short-term Gain

    There is the possibility, of course, that by killing our boredom smartphones are freeing up time for better, more productive or uplifting pursuits. For example, psychology professor Gary Marcus distinguishes between the two primary types of pursuits we use to defeat boredom.
    "Boredom is the brain’s way to tell you you should be doing something else. But the brain doesn’t always know the most appropriate thing to do. If you’re bored and use that energy to play guitar and cook, it will make you happy. But if you watch TV, it may make you happy in the short term, but not in the long term."
    So much of what we do on our smartphones, however, is decidedly short-term: a few moments playing a game while we stand in line, a minute to scan Instagram as the person in front of us at the grocery store pulls out their checkbook. 
    A study last year by UK carrier O2 examined the amount of time the typical user spends each day on their smartphone. It's a lot - more than two hours a day, every day. Most of that is spent browsing the Internet, on social networking sites, playing games, listening to music, calling, emailing and texting - and not, for example, learning a new language.

    No Off Switch

    At work, employees are often encouraged to 'think outside the box.' The assumption is that such thinking will lead to creativity, innovation, and newer, better solutions to existing or expected problems. Spending so much time with our heads focused inside the box - staring at our smartphones - may mean, however, that we are ultimately limiting our creativity. There is no time freed up to see the larger picture, to make connections where they previously never existed, to allow our brains to rest, to see and hear and accept alternatives.
    Though I confess I hope I am wrong about all of this.
    I spend far more time than the average user with my eyes staring into that small, bright and highly receptive screen. I am not sure I am able to shut it off, even now. 

    MOTION LEADERSHIP In Action (Purposeful Action in Education)

    iNACOL Competency Learning Report (Student-Centered Mastery)

    iNACOL Competency Learning (Student-Centered Mastery)
    http://www.competencyworks.org/wp-content/uploads/2013/02/inacol_cw_issuebrief_building_mastery_final.pdf

    Friday, March 29, 2013

    STEM & Engineering



    Bioengineering. Mechanical engineering. Environmental engineering. Aerospace engineering.
    Not exactly standard fare in elementary school, but several million children have been exposed to such fields through the fast-growing Engineering Is Elementary program since it was launched in 2004 by the Museum of Science in Boston. Its overarching goal is to "foster engineering and technological literacy among all elementary-aged children."
    Students design windmills, water filters, knee braces, and parachutes. They learn to think like an engineer and to tackle problems the way engineers do. Along the way, they explore relevant concepts in science and other disciplines.
    EIE curricular units are being used by about 45,000 teachers nationwide this year, more than triple the figure five years ago.
    Delaware is offering it to all public elementary schools, with support from the state's federal Race to the Top grant, said Christine M. Cunningham, the program's founder and director.
    Some school systems, including the Lakota district in Ohio, use it in all their elementary schools.
    Jennifer L. Haynes, a 2nd grade teacher at Woodland Elementary School in Liberty Township, Ohio, part of the Lakota district, got started last fall with the windmill unit.
    Students use concepts related to air and weather as they learn how windmills convert wind into energy. As part of the unit, they construct and test sails made of different materials and shapes to catch the wind. Then, they design, create, test, and improve their own windmills.
    Ms. Haynes appreciates the way the EIE program gets her students to think through problems, especially when a device they design doesn't work as expected the first time.
    "They have to stop and think and ask: 'I wonder what it was that I used that didn't work?' " she said. "They really do learn perseverance. ... In that mistake, they will learn something else that will make it better."
    Faye Harp, a curriculum specialist for the 17,000-student Lakota district, sees many benefits for children.
    "They are utilizing science concepts they're learning about, but also building those 21st-century skills: thinking critically, problem-solving, communication, collaboration," she said.

    'Go Wild and Have Fun'

    Teachers typically implement one or two units each school year, said Ms. Cunningham. A given unit typically takes one or two weeks to complete, with roughly 45 to 50 minutes per day spent on it, she said. There are 20 units in all.
    "Each unit is designed to integrate with a topic commonly taught in elementary science," she said. Those include ecosystems, energy, the human body, magnetism, and electricity.
    In addition, EIE staff, in collaboration with classroom teachers, recently developed math lessons for each unit and have mapped them against the Common Core State Standards.
    The Museum of Science also developed a high school engineering course, Engineering the Future. And it's planning to publicly roll out an after-school program for the middle grades later this year, called Engineering Everywhere.
    The Minneapolis district uses the elementary program systemwide, targeting grades 3-5. It chose units that "reinforced and extended concepts we already address in science," said Joseph F. Alfano, the 32,000-student district's K-5 coordinator for STEM, or science, technology, engineering, and mathematics.
    A bonus with the program, he said, is that as teachers come to understand the "instructional pathway" for engineering design, they discover engineering-design opportunities of their own that fit with the district's math and science curriculum.
    "It's super hands-on," said Amber Ringwelski, a 4th grade teacher at Pillsbury Community School in Minneapolis, of the EIE curriculum. "Students are really solving problems."
    She recently taught a unit in which students explore the properties of magnets and design a maglev transportation system. (Maglev trains are levitated by magnets.)
    "The kids love it," she said of the program. "They're used to us saying, step-by-step, this is what you're supposed to do. But it's not about that. It's about them designing, to go wild and have fun."
    The big takeaway for kids, she said, is about the engineering-design process: "Asking a question, imagining all the possibilities, designing something, creating something, and then going back and making it better."