Determining whether to outsource non-essential business processes used to be a relatively simple cost-benefit analysis. If the new processes reduced costs and got the job done as well as the older (more expensive) ones, the decision was pretty much a no-brainer.
During the great outsourcing migration at the start of the millennium, it seemed every company wanted to be first to jump on the bandwagon and source technical development and quality assurance teams from Eastern Europe and Southeast Asia. It was relatively easy to find teams with specialized skillsets, such as expertise in developing touchscreen apps or natural user interfaces. The low overhead promised massive savings over hiring domestically, yet with few exceptions, outsourcing that far offshore failed to deliver the piles of money companies were expecting. Nowhere else in the world could companies find such a skilled workforce at such low wages, so what went wrong?
Outsourcing has technically been an accepted business practice since ancient times, when artisans hired others to produce some or all of the components of their products. Offshoring for IT projects, however, began to gain wider acceptance in the late 1990s, as companies launched accelerated (and often belated) programs to address the looming Y2K problem and to take advantage of the so-called "dot-com" bubble and its seemingly limitless opportunities to embrace modern technology.