We've seen how many blue-collar and service industry jobs are threatened by automation. The continuing impact of the post-2008 recession has also driven many 'traditional' jobs offshore, and they won't be coming back.
Now the Financial Times reports that the jobs of researchers and analysts are threatened by advances in artificial intelligence systems.
If Daniel Nadler is right, a generation of college graduates with well-paid positions as junior researchers and analysts in the banking industry should be worried about their jobs. Very worried.
Mr Nadler’s start-up, staffed with ex-Google engineers and backed partly by money from Google’s venture capital arm, is trying to put them out of work.
. . .
Warren – the name given to the system, in homage to investor Warren Buffett – is part of a new army of “smart” machines that are threatening to invade office life. These computers do not just collect and process information; they draw inferences, answer questions and recommend actions, too.
The threat to jobs stretches beyond the white-collar world. Advances in artificial intelligence (AI) also make possible more versatile robots capable of taking over many types of manual work. “It’s going to decimate jobs at the low end,” predicts Jerry Kaplan, a Silicon Valley entrepreneur who teaches a class about AI at Stanford University. Like others working in the field, he says he is surprised by the speed at which the new technologies are moving out of the research labs.
. . .
The impact of IT and automation on the world of work – and dire warnings about the job destruction they might cause – are as old as the technology itself. But the convergence of a number of tech trends has made the threat more immediate.
As a result, 47 per cent of jobs in the US are now at risk from computerisation, according to a prediction last year from Carl Benedikt Frey and Michael Osborne of Oxford University. McKinsey, the management consultancy, has estimated that by 2025, productivity gains from automating "knowledge work", ranging from clerical to professional services, could be equivalent to the output of 40 per cent of the current jobs in those areas.
One long-familiar tech trend is the relentless fall in the cost of computing power. According to Erik Brynjolfsson and Andrew McAfee, academics at the Massachusetts Institute of Technology, these incremental advances in computing have combined to make great leaps. Their book, The Second Machine Age, has stirred up angst this year about what the coming smart machines will do to job levels.
A second factor is the availability of vast bodies of digital data. Feeding off that information, advanced pattern-recognition systems – using a technique known as machine learning – are able to draw deductions that earlier machines could not attempt.
. . .
New ways of interacting with computers, making it easier for non-technical humans to work with machines on complex tasks, are a third part of the AI revolution. As with many other aspects of technology, smartphones have been the forerunner.
Services such as the Siri question-and-answer feature on Apple’s iPhone and the Google Now service that tries to anticipate a user’s information needs may be the forerunners of similar user-friendly systems that will come to dominate the office.
This has fed two visions of the future of work. In one, the machines take on many of the boring parts of a job, setting humans free to supply the more advanced – and satisfying – brain work. The other vision is less harmonious: the machines leave many human workers on the scrap heap altogether.
There's much more at the link (which may expire or disappear behind a paywall after a short time). Disturbing, but essential reading if you're in the workforce, or have children who will be entering it in due course.
I can remember the initial impact of artificial intelligence (AI) systems on information technology (IT) systems and computer software development in the 1980's. I'd risen through the ranks as a computer operator, programmer, end-user computing assistant, systems analyst, and eventually project leader and manager. I'd grown used to terms such as 'systems engineering' and the like; but the first artificial intelligence project management and control systems were a revelation. They imposed and facilitated a 'building block' approach to systems development, breaking down everything into small functional units that could be used and re-used; they enforced common documentation standards, making it easier to understand why someone had done something in a particular way, and modify it in future when that person was no longer available; and they imposed a new structure on projects, forcing conformity to standards in a way that programmers had frequently evaded in the past. They were the harbinger of big changes to come as 'object-oriented programming' began to take over from older methods and tools.
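That 'building block' approach survives today as ordinary object-oriented practice. As a rough, hypothetical illustration (in modern Python rather than any actual 1980s tool), a single small, documented unit can be written once and reused unchanged across different systems:

```python
# Hypothetical sketch of the 'building block' idea: a small,
# self-documenting unit that can be reused across projects.

class Report:
    """Reusable building block: collects records and formats them as text."""

    def __init__(self, title):
        self.title = title
        self.rows = []

    def add_row(self, *fields):
        """Append one record (any number of fields) to the report."""
        self.rows.append(fields)

    def render(self):
        """Return the title followed by one line per record."""
        lines = [self.title]
        for row in self.rows:
            lines.append(" | ".join(str(field) for field in row))
        return "\n".join(lines)


# The same block serves two unrelated 'systems' without modification:
sales = Report("Monthly Sales")
sales.add_row("Widgets", 120)

payroll = Report("Payroll Summary")
payroll.add_row("J. Smith", 2500)

print(sales.render())
print(payroll.render())
```

The point is not the code itself but the discipline it embodies: a documented interface, a single well-defined responsibility, and reuse instead of copy-and-paste — exactly what those early project management systems forced on reluctant programmers.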
AI today seems light years more advanced than those primitive systems that first changed the IT world. If it's at last bringing the same changes to other white-collar fields, it's not before time . . . but it's likely to have a similar impact on the workforce in those fields. In the 1970's and 1980's, to be an IT specialist was to be very well paid and have endless opportunities for advancement. Today, programmers and entry-level systems analysts are ten a penny. Most of those jobs have been outsourced to India and the Far East - at Third World rates of pay, too.
Makes me glad I'm now a self-employed writer . . . at least I can't outsource me!