17 Stocks to Buy for the Dawn of Global AI Dominance

(Source: investorplace.com)  

AI stocks will power significantly higher over the next decade


June 23, 2021 | By Luke Lango, InvestorPlace Senior Investment Analyst


The year is 1950. The month is October. Alan Turing – the generational genius who cracked the Enigma code and helped end World War II – has just introduced a novel concept.

It’s called the “Turing Test,” and it is aimed at answering the fundamental question: Can machines think?

The world laughs. Machines? Think for themselves? Not possible.

But the Turing Test sets in motion decades of research into the emerging field of Artificial Intelligence.

It is research conducted in some of the most prestigious labs in the world, by some of the smartest people in the world, collectively working to create a new class of computers and machines that can, indeed, think for themselves.

Fast forward 70 years.

AI is everywhere.

AI is in your phones. What do you think powers Siri? Or how does a phone recognize your face?

AI is in your applications. How does Google Maps know directions and optimal routes? How does it make real-time changes based on traffic? How does Spotify create hyper-personalized playlists for you? Or Netflix recommend movies?

AI is on your computers. How does Google suggest personalized search items for you? How do websites use chatbots that seem like real humans?

As it turns out, the world shouldn’t have laughed back in 1950.

The great Alan Turing ended up creating a robust foundation upon which seven decades of groundbreaking research has compounded… year after year… ultimately resulting in self-thinking computers and machines not just being a “thing” – but being everything today.

Make no mistake. This decades-in-the-making “AI Revolution” is just getting started.

That’s because the machine learning (ML) and natural language processing (NLP) models upon which AI is built are informed by data.

Basically, the more data they have, the better the models get, and the more capable the AI becomes.

In the AI world, data is everything.

The volume and granularity of data globally is exploding right now, mostly because every object in the world is becoming a data-producing device.

Dumb phones became smartphones, and started producing bunches of phone usage data.

Dumb cars became smart cars, and started producing bunches of driving data.

Dumb apps became smart apps, and started producing bunches of consumer preference data.

Dumb watches became smart watches, and started producing bunches of fitness data.

Get the point?

As we’ve sprinted into the “Smart World” – where every object is a data-producing smart device – the amount and speed of data that AI algorithms have access to has exploded, making those AI algos more capable than ever…

Why else do you think AI has started popping up everywhere in recent years? It’s because 90% of the world’s data was generated in the last two years alone.

More data. Better ML and NLP models. Smarter AI.

It’s that simple.

And guess what? The world isn’t going to take any steps back in terms of this “smart” pivot. No. We love our smartphones, and smart cars, and smartwatches too much.

Instead, society is going to accelerate in this transition. Globally, the world produces about 2.5 exabytes of data per day today. By 2025, that number is expected to rise to 463 exabytes.

Let’s go back to our process…


More data. Better ML and NLP models. Smarter AI.

Thus, as the volume of data produced daily soars more than 185X over the next five years, ML and NLP models will get 185X better (more or less), and AI machines will get 185X smarter (more or less).

Folks… the AI Revolution is just getting started.

As my friends in the AI and robotics fields like to remind me: Most things a human does, a machine will be able to do better, faster, and cheaper. If not now, then soon.

Given the advancements AI has made over the past few years with the help of data – and the huge flood of data set to come online over the next few years – I’m inclined to believe them.

Eventually – and inevitably – the world will be run by hyperefficient and hyperintelligent AI.

I’m not alone in thinking this. Gartner predicts that 69% of routine office work will be fully automated by 2024, while the World Economic Forum has said that robots will handle 52% of current work tasks by 2025.

The AI Revolution is coming – and it’s going to be the biggest revolution you’ve ever seen in your lifetime.

As a hypergrowth investor, you need to be invested in this emerging technological megatrend that promises to change the world forever.

But, alas, the question remains: What AI stocks should you start buying right now?

You could play it safe, and go with the blue-chip tech giants, all of whom are making inroads with AI and represent low-risk, low-reward plays on the AI Revolution. I’m talking Microsoft (MSFT), Alphabet (GOOG), Amazon (AMZN), Adobe (ADBE), and Apple (AAPL).

Or, you could go for the “picks-and-shovels” chip-makers that are making the processing units upon which AI is developed. Those stocks offer a little more upside potential, but are still in the mid-risk, mid-reward category. Nvidia (NVDA) stands out for its world-class GPUs, and Advanced Micro Devices (AMD) comes to mind as the upstart in the industry with a lot of room to grow.

Or, you could go bigger with AI-specific software companies. These are smaller, more specialized, and singularly focused tech companies that have a lot to gain in the event their AI efforts pay off.


In this category, you can go for the mid-cap stocks, which come with less risk but less reward potential. Sound like a match for you? Consider these names:

  • Enterprise AI software developer C3.ai (AI), who is creating modular enterprise AI applications for use specifically in the oil and gas industries;
  • Low-code application platform provider Appian (APPN), who is leveraging AI to enable businesses to make automatable apps without doing any coding;
  • Data-science firm Palantir (PLTR), who is using AI to bring Batman-like technology to the real world;
  • Music streaming pioneer Spotify (SPOT), who is combining AI with its huge music dataset to improve new music discovery and create hyperpersonalized playlists for subscribers;
  • Database firm Snowflake (SNOW), who is creating the foundational database upon which ML and AI models can be built with ease;
  • Game-engine maker Unity Software (U), who is enabling its developers to use AI to make more compelling gameplay.

Those are great AI stocks to buy today. But are they the best? Are they the ones that will score you 10X returns?

No. Instead, if you’re looking for 10X returns in the AI Revolution, you need to look in the small-cap world, where you have some early-stage companies pioneering potentially game-changing AI technologies that will, in time, unlock significant economic value.

I’m talking companies you’ve never heard of, but which are working on some very promising tech…

Companies like BigBear.ai (GIG) who is using AI to improve military operations, Stem (STEM) who is using AI to save you money on your electricity bill, LivePerson (LPSN) who is using AI to create virtual chatbots that don’t suck, and Editas (EDIT) who is using AI to cure blindness.

These small-cap AI stocks do have 10X upside potential in the AI Revolution.

So, go ahead and buy them, and if you want more picks, keep reading…

Because in my ultra-exclusive newsletter subscription service, Innovation Investor, I’m creating the ultimate portfolio of emerging megatrends and the individual companies leading them. These are stocks with the potential to score 10X-plus gains over the near- to long-term.

In fact, I have more than 40 hypergrowth stocks that could score investors Amazon-like returns over the next few months and years.

These stocks include the world’s most exciting autonomous vehicle startup, a world-class “Digitainment” stock creating the building blocks of the metaverse, a company that we fully believe is a “Tesla-killer,” and stocks emerging as leaders in artificial intelligence and machine learning.

Why to Invest in ARKQ, Artificial Intelligence

(Source: etftrends.com)  

TOM LYDON | JUNE 10, 2021


Artificial intelligence (AI) may be the most disruptive of all the disruptive technologies. At the very least, AI’s depth and rapid evolution are fast making it a foundation in myriad industries – a status that carries with it an assortment of investment implications.

A plethora of exchange traded funds offer AI exposure in varying forms, but one of the dominant forces in that group is the ARK Autonomous Technology & Robotics ETF (CBOE: ARKQ). The actively managed ARKQ isn’t a dedicated AI fund, but it features exposure to industries AI intersects, including 3D printing, autonomous transportation, energy storage, robotics, and space exploration.

As is the case with so many disruptive technologies, hardware and semiconductors are the backbones of AI, and that’s not going to change anytime soon. For investors considering ARKQ, that’s a meaningful point because by some estimates, the AI hardware market alone could be worth $150 billion in four years.


ARKQ YTD Performance (chart)


ARKQ: Access to AI Leaders

With any disruptive market segment, stock selection is meaningful. On a related note, while past performance isn’t a guarantee of future returns, ARKQ is higher by almost 138% for the three years ending June 8, while the S&P 500 Growth Index is up “just” 73% during that period. That speaks to ARK’s stock-picking acumen.

“In our view, NVIDIA continues to be a thought leader,” Oppenheimer analyst Rick Schafer said in a research piece cited by CNBC. “Within the semiconductor space, NVIDIA was one of the first to recognize the AI market opportunity and notably pivoted its offerings to address the market more quickly than peers.”

NVIDIA is at the forefront of graphics processing unit (GPU) acceleration, which is a key component in technologies ranging from data centers to personal computers and supercomputers. On the deployment side of the AI equation, NVIDIA is establishing prowess in autonomous vehicles and self-learning machines. GPUs also backstop gaming consoles and cryptocurrency mining activities.

Oppenheimer’s Schafer highlights several other semiconductor names poised to benefit from AI expansion, including NXP Semiconductors (NXPI). MCU products produced by NXP are used by automotive and industrial machine clients as well as purveyors of Internet of Things (IoT) concepts.

NVIDIA and NXP combine for 4.43% of ARKQ’s roster, as of June 8. The ARK fund is up 6.67% year-to-date.

4 Robotics Stocks To Watch Amid Rising Shifts To Automation

(Source: stockmarket.com)  

Could these robotics stocks be the hidden gems of 2021?

By Amos C | June 7, 2021


These 4 Robotics Stocks Could Be Worth Your Attention


One key factor when it comes to investing in the stock market is knowing which sectors are thriving and which are not. Robotics stocks may have fallen under the radar of many investors, and that could prove to be a missed opportunity. Robots date back to the 1960s, and their technology has been improving ever since. Sure, there are fears of robots taking over the world, especially if you have watched The Terminator. However, in the last ten years, many companies have turned to robots to improve efficiency, decrease equipment costs, and tackle the issue of rising labor costs. Therefore, it is no surprise that some investors may be looking for top robotics stocks to buy.

Many major industries are set to gain from investments in automation. For example, Tesla Inc (NASDAQ: TSLA) aggressively pushes for automation to aid in the production of its electric vehicles. In May, Tesla, along with Comau Robotics, planned to set up a new series of automation equipment at its Fremont factory in Northern California. In the past, the company admitted that it had overlooked the importance of having humans on the manufacturing line. Even so, automation plays an important role in meeting delivery goals, especially with the rising demand for electric vehicles. So, do you share the sentiment that robots could play a big role in our future? If you do, here is a list of 4 top robotics stocks to watch in the stock market today.


Best Robotics Stocks To Watch This Week

  • UiPath Inc (NYSE: PATH)
  • Brooks Automation, Inc (NASDAQ: BRKS)
  • ABB Ltd (NYSE: ABB)
  • Raytheon Technologies Corp (NYSE: RTX)

UiPath Inc


To kick off the list, we have an enterprise automation software company, UiPath. It offers an end-to-end platform for automation, combining Robotic Process Automation (RPA) solutions with a suite of capabilities that enable every organization to scale digital business capabilities. The company aims to have a robot for every person, hoping to transform the way humans work.


PATH stock chart (Source: TD Ameritrade TOS)


A fortnight ago, the company announced the launch of the UiPath Automation Awards 2021, its third annual competition designed to crown the most promising talents in enterprise software automation. UiPath aims to enable the development of creative business ideas and foster the capacity to scale early-stage companies and entrepreneurial ventures from Central and Eastern Europe and Turkey. With the fast pace of innovation in software automation and a huge pool of untapped potential, the company is keen to help emerging players in the space reach their full potential.

Given that the company has only been publicly traded since April 21, its first earnings report is highly anticipated. The company is scheduled to report its fiscal first-quarter 2021 results on Tuesday. This would give investors a clearer indication of the direction of the company. After all, the company’s stock did rise as high as $90.00 in May. So, would you be buying PATH stock ahead of its earnings report?


Brooks Automation, Inc


Another top robotics stock would be Brooks Automation. It is a provider of automation solutions for various applications and markets. Essentially, it operates through two segments: Brooks Semiconductor Solutions Group and Brooks Life Science Systems. Its product offerings include robots and integrated automation systems. BRKS stock has been on an upward trajectory over the past year, more than doubling in price within that period.


BRKS stock chart (Source: TD Ameritrade TOS)


In May, the company reported its second-quarter financial results. Revenue came in at $287 million, an increase of 30% year-over-year. Of that, Life Sciences contributed $130 million and Semiconductor Solutions contributed $157 million. Meanwhile, operating income was $31 million, more than double the figure from the prior-year quarter. This record-level revenue is yet another testament to its strength and continued momentum in both its segments.

In fact, the company also announced its intention to separate the two segments into two separate independent publicly traded companies. The company believes that the separation will better position each of them to extend their advantages in the markets they serve. Given all this, would BRKS stock not be a top robotics stock to watch now?


ABB Ltd


ABB is a holding company that operates through four segments. These include Electrification Products, Robotics and Motion, Industrial Automation, and Power Grids. The company pushes the boundaries of technology by connecting software to its electrification, robotics, automation, and motion portfolio. ABB stock is yet another robotics stock that has been making waves in the stock market. It has risen by almost 60% over the past year.


ABB stock chart (Source: TD Ameritrade TOS)


Last week, the company announced that it will be strengthening its commitment to reduce carbon emissions. It aims to reduce its emissions and to achieve carbon neutrality in its own operations by 2030. Given the hype surrounding clean and green energy, it is a significant step for companies to commit to such initiatives.

Financially, ABB also reported first-quarter earnings back in April that surpassed analysts’ expectations. Revenues totaled $6.91 billion, up 11% from the prior-year quarter. This upside is due to growth across most of its segments. Also, operational earnings before interest, taxes and amortization (EBITA) in the quarter increased by a whopping 50.8% to $959 million. Hence, with its impressive financial figures, would you consider adding ABB stock to your watchlist?


Raytheon Technologies Corp


Rounding out the list, we have the aerospace and defense company Raytheon. The company engages in providing advanced systems and services for commercial, military, and government customers worldwide. Furthermore, it also has a new rover on Mars with a long-term plan of sending people there. The company’s stock has risen over 30% year-to-date.


RTX stock chart (Source: TD Ameritrade TOS)


In May, the company announced that it will collaborate with GLOBALFOUNDRIES® (GF®), the global leader in feature-rich semiconductor manufacturing, to develop and commercialize a new gallium nitride on silicon (GaN-on-Si) semiconductor. This is in hopes of enabling game-changing radio frequency performance for 5G and 6G mobile and wireless infrastructure applications. This demonstrates its goal of making high-performance communications tech available at an affordable cost.

Aside from that, Raytheon is also retrofitting OxyTruck mobile oxygen filling stations to help with the COVID-19 crisis. The trucks can transport approximately 270,000 liters of oxygen. The company delivered these trucks to India to help address its alarming COVID-19 situation and the challenges of delivering oxygen to semi-urban and remote areas. All things considered, would you bet on RTX stock as the economy recovers?

How PepsiCo uses AI to create products consumers don’t know they want

(Source: venturebeat.com)  

Sage Lazzaro | June 28, 2021


If you imagine how a food and beverage company creates new offerings, your mind likely fills with images of white-coated researchers pipetting flavors and taste-testing like mad scientists. This isn’t wrong, but it’s only part of the picture today. More and more, companies in the space are tapping AI for product development and every subsequent step of the product journey.

At PepsiCo, for example, multiple teams tap AI and data analytics in their own ways to bring each product to life. It starts with using AI to collect intel on potential flavors and product categories, allowing the R&D team to glean the types of insights consumers don’t report in focus groups. It ends with using AI to analyze how those data-driven decisions played out.

“It’s that whole journey, from innovation to marketing campaign development to deciding where to put it on shelf,” Stephan Gans, chief consumer insights and analytics officer at PepsiCo, told VentureBeat. “And not just like, ‘Yeah, let’s launch this at the A&P.’ But what A&P. Where on the shelf in that particular neighborhood A&P.”


A new era of consumer research

When it comes to consumer research, Gans likes to say that “seeing is the new asking.” Historically, this stage of product development has always been based on asking people questions: Do you like this? Why don’t you like this? What would you like? But participants’ answers aren’t as telling as we’d like to think. They might not really care because they’re paid to be there, or they might just be trying to be nice. They might also be sincere in the moment, but it doesn’t mean they’ll still be excited about the product three years after launch.

“People will give you all sorts of answers,” Gans said. “It’s just not very close to what is ultimately driving their buying behavior.”

To uncover more telling insights PepsiCo can channel into product roadmaps, the company uses a tool called Tastewise, which deploys algorithms to uncover what people are eating and why. Also used by Nestlé, General Mills, Dole, and other major consumer packaged goods companies (CPGs), the AI-driven tool analyzes massive quantities of food data online. Specifically, Tastewise says its tool has monitored more than 95 million menu items, 226 billion recipe interactions, and 22.5 billion social posts, among other consumer touchpoints.

By collecting data from all these different sources — which represent what people are voluntarily talking about, searching for, and ordering in their daily lives — Gans says his team “can get a really good idea as to what people are more and more interested in.” For example, it was findings from the tool that gave PepsiCo the idea to incorporate seaweed into a flavored savory snack. The company brought it to market as Off The Eaten Path, and long story short, Gans said it’s been a top seller since.

“If you would’ve asked consumers, ‘tell me what your favorite flavors are and let us know what you think would be a great flavor for this brand,’ nobody would have ever come up with seaweed. People don’t associate that typically with a specialty snack from a brand. But because of the kind of listening and the outside-in work that we did, we were able to figure that out through the AI that’s embedded in that tool,” he said.


Data-driven social prediction


Taking another angle to insights, PepsiCo also leans heavily on Trendscope, a tool it developed in conjunction with Black Swan Data. Rather than analyze menus and recipes, it focuses exclusively on social conversations around food on Twitter, Reddit, blogs, review boards, and more. The tool considers context and whether or not the conversation is relevant to the business; it measures not only the volume of specific conversations, but how they grow over time. Gans says this allows the team to do what they call “social prediction.”

“Because we have done this over and over and over again now, we can actually predict which of the topics are going to stick and which are just going to kind of fizzle out,” he said.

The pandemic, for example, caused a massive spike in interest around immunity. By using Trendscope, PepsiCo determined that specifically for beverages, the interest is here to stay. About six months ago, the company acted on that insight when it launched a new line of its Propel sports drinks infused with immunity ingredients.


From idea to a shelf near you


Once the products are developed, there’s still plenty for AI and machine learning to do. Jeff Swearingen, who heads up PepsiCo’s demand accelerator (DX) initiative, said the company uses the technology in agriculture and manufacturing, which has helped reduce water consumption. Sales and marketing, his domain, also leans heavily on AI. He said the company started “moving very quickly” in 2015 by building big internal datasets. One covers 106 million U.S. households, and for about half of those, he says the company has first-party data at the individual level. There’s additionally a store dataset of 500,000 U.S. retail outlets, as well as a retail output dataset, he says. Both his and Gans’ teams use the data to engage core consumers in “uniquely personalized ways,” from customizing retail environments to online ads.

For the launch of Mountain Dew Rise Energy, for example, PepsiCo determined which consumers would be more likely than average to enjoy the drink, and then narrowed in further to determine a core target. The store data then enabled the company to figure out exactly which retailers those core consumers were likely to shop at and reach them with highly targeted “everything.” This includes digital media campaigns and content, as well as assortment, merchandising, and presentation.

“If you go back five years, if you were to walk into those 50,000 [targeted] stores, the assortment, presentation, merchandising, all of those things would probably look like the other 450,000,” Swearingen said, using sample numbers to make the point. “Now in those 50,000 stores, we’re able to truly celebrate this product in a way that recognizes the shopper that’s walking in that store.”

In regards to marketing, PepsiCo also uses AI to do quality control on massive amounts of personalized digital ads. Specifically, the company partnered with CreativeX to build algorithms that check each piece of advertising to make sure it meets an evolving set of “golden rules,” like that the brand logo is visible or the message still comes across with sound off. Gans said using AI is the only way they can do proper quality control when “you may end up making 1,000 [ads] to reach 1,000 different segments of consumers.” The company has invested “a ton” of resources into AI, he said, and will be investing more in the years to come.

Five years ago, the company was still relying on traditional broadcast advertising, according to Swearingen, who added that the new AI-enabled efforts are much more efficient. “There’s so much waste, number one, and you’re not customizing the message to those people that really love this proposition,” he said of the traditional route. “And now we’re able to do that.”


Maintaining human connections


When it comes to customer relations, PepsiCo, like many companies, is tapping natural language processing (NLP) to more efficiently help anyone who may call with a question, suggestion, or complaint. “Through a simple NLP-driven system, we can make sure that the person that you end up talking to already has the content that is relevant for you,” Gans said, noting that talking to a robot for 45 minutes would be “AI gone very wrong.”

It’s a good example of how the company is working to keep humans in the AI loop, which Gans said is “literally [his] favorite topic.” He feels that in integrating these technologies, it’s easy to become overly reliant on the data, which can’t always speak to people’s actual motivations. As an example, he referenced a recent Pepsi ad, which focuses on the shared human emotions of the pandemic and doesn’t feature any products.

“I’m always making sure there is both a data-driven and a human empathy perspective brought to commercial decision making,” Gans said. “That is a key role and the ongoing challenge for my team.”

Google Cloud: COVID-19 accelerates AI use by manufacturers

(Source: techrepublic.com)  

by Allen Bernard in Cloud on June 9, 2021


76% of executives say they have embraced "digital enablers" like artificial intelligence, data analytics and cloud.

Image: Google Cloud


According to new research from Google Cloud, manufacturers around the globe have accelerated their use of digital technologies due to the COVID-19 pandemic. Of the 1,154 senior manufacturing executives polled for the report, "Google Cloud Industries: Artificial Intelligence acceleration among Manufacturers," 76% said they have embraced "digital enablers" such as artificial intelligence, data analytics and cloud.

"We used to count the number of AI and machine learning projects at Ford. Now it's so commonplace that it's like asking how many people are using math. This includes an AI ecosystem that is fueled by data, and that powers a 'digital network flywheel,'" said Bryan Goodman, director of artificial intelligence and cloud, Ford Global Data & Insight and Analytics, in the report.

Most respondents (64%) are using AI in their day-to-day operations to assist with business continuity (38%), increase efficiency (38%) and help employees do their jobs better (34%), the report said. Sixty-six percent of this group said their reliance on AI is growing.

The top five areas where AI is being deployed include:

  • Quality inspections (39%)
  • Supply chain management (36%)
  • Risk management (36%)
  • Product and/or production line quality checks (35%), and
  • Inventory management (34%).
"AI also applies to many other use cases, from powering connected factories to assisting with predictive maintenance," the report said. Custom machine learning models "can predict machine events that, left unchecked, could cause unscheduled downtime and negatively impact production schedules. In construction, AI can help builders reduce critical errors that lead to delays – while optimizing energy consumption, and supporting complex logistics and scheduling tasks."

The top three industries deploying AI today are automotive/OEMs (76%), automotive suppliers (68%) and makers of heavy machinery (67%). AI usage is increasing rapidly in the metals (75%), industrial assembly (72%) and heavy machinery (69%) sectors.

The top five countries whose industries have embraced AI are Italy (80%), Germany (79%), France (71%), the U.K. (66%) and the U.S. (64%). South Korea (85%), Japan (83%), and the U.S. (81%) are all increasing their use of AI faster than the rest of the world.

There are some headwinds, however. A lack of skilled personnel to implement AI (25%), not having the proper IT infrastructure in place (23%) and cost (21%) are the top three reasons cited by survey respondents for not deploying AI, the report said.

"The key to widespread AI adoption lies in its ease of deployment and use," the report said. "As AI becomes more pervasive in solving real-world problems for manufacturers, we see a shift from 'pilot purgatory' to the 'golden age of AI'. The industry is no stranger to innovation—from the days of mass production to lean manufacturing, six sigma, and more recently, enterprise resource planning. And now, AI promises to deliver even more innovation."

About the Survey

The survey was conducted online by The Harris Poll on behalf of Google Cloud, from Oct. 15 to Nov. 4, 2020, and included 1,154 senior manufacturing executives in France, Germany, Japan, South Korea, the U.K. and the U.S. who work at companies with more than 500 employees.

Artificial Intelligence Has Caused A 50% To 70% Decrease In Wages—Creating Income Inequality And Threatening Millions Of Jobs

(Source: forbes.com)  

Jun 18, 2021 | Jack Kelly, Senior Contributor

The middle and working classes have seen a steady decline in their fortunes. Sending jobs to foreign countries, the hollowing out of the manufacturing sector, pivoting toward a service economy and the weakening of unions have been blamed for the challenges faced by a majority of Americans.

There’s an interesting, compelling and alternative explanation. According to a new academic research study, automation technology has been the primary driver of U.S. income inequality over the past 40 years. The report, published by the National Bureau of Economic Research, claims that 50% to 70% of changes in U.S. wages since 1980 can be attributed to wage declines among blue-collar workers who were replaced or degraded by automation.

Artificial intelligence, robotics and new sophisticated technologies have caused a wide chasm in wealth and income inequality. It looks like this issue will accelerate. For now, college-educated, white-collar professionals have largely been spared the fate of degreeless workers. People with a postgraduate degree saw their salaries rise, while “low-education workers declined significantly.” According to the study, “The real earnings of men without a high-school degree are now 15% lower than they were in 1980.”

Much of the changes in U.S. wage structure, according to the paper, were caused by companies automating tasks that used to be done by people. This includes “numerically-controlled machinery or industrial robots replacing blue-collar workers in manufacturing or specialized software replacing clerical workers.”

Artificial intelligence systems are ubiquitous. AI-powered digital voice assistants tell you whatever you want to know just by being asked a question. Instead of a live person addressing a problem, a corporate chatbot forces you to engage with it. The technology is remarkable. It helps diagnose cancer and health issues. Banks use sophisticated software to check for fraud and bad behaviors. Driverless automobiles, newsfeeds, social media and job applications are all controlled by AI.

The World Economic Forum (WEF) concluded in a recent report, “A new generation of smart machines, fueled by rapid advances in AI and robotics, could potentially replace a large proportion of existing human jobs.” Robotics and AI will cause a serious “double-disruption,” as the pandemic pushed companies to fast-track the deployment of new technologies to slash costs, enhance productivity and be less reliant on real-life people. The WEF asserts automation will slash about 85 million jobs by 2025. In a dire prediction, WEF said, “While some new jobs would be created as in the past, the concern is there may not be enough of these to go round, particularly as the cost of smart machines falls over time and their capabilities increase.”

Management consulting giant PricewaterhouseCoopers reported, “AI, robotics and other forms of smart automation have the potential to bring great economic benefits, contributing up to $15 trillion to global GDP by 2030.” However, it will come with a high human cost. “This extra wealth will also generate the demand for many jobs, but there are also concerns that it could displace many existing jobs.”

Concerns of new technologies disrupting the workforce and causing job losses have been around for a long time. On one side, the argument is automation will create new and better jobs and erase the need for physical labor. The counterclaim is that people without the appropriate skills will be displaced and not have a home in the new environment.

Amazon, Google, Microsoft, Apple, Zoom and other tech giants greatly benefited financially during the pandemic. The virus outbreak accelerated trends, including choosing technology over people. There’s still a need for humans. For example, although Amazon invested heavily in automation for its warehouses, the online retail giant still needed to hire over 300,000 workers during the pandemic. This brings up another important overlooked issue: the quality of a job. Proponents of AI say that there’s nothing to worry about, as we’ve always successfully dealt with new technologies. You may have a job, but what is the quality of it?

To remain relevant, you will have to learn new skills to stay ahead of the curve. Bloomberg reported, “More than 120 million workers globally will need retraining in the next three years due to artificial intelligence’s impact on jobs, according to an IBM survey.” The number of individuals who will be impacted is immense.

The world’s most advanced cities aren’t ready for the disruptions of artificial intelligence, claims Oliver Wyman, a management consulting firm. It is believed that over 50 million Chinese workers may require retraining as a result of AI deployment. The U.S. will need to retool 11.5 million people with the skills required to survive in the workforce. Millions of workers in Brazil, Japan and Germany will need assistance with the changes wrought by AI, robotics and related technology.

For those who may be left behind, there’s a call for offering a universal basic income (UBI). This idea gained national attention when it became a major part of Democratic candidate Andrew Yang’s 2020 presidential campaign. Yang’s policy was to lift people out of poverty or help them through rough patches with a guaranteed monthly income. Supporters say it gives people needed financial security to find good jobs and avoid debt. Critics have argued free money would be a disincentive to work, creating a society dependent on the state.

According to a Wells Fargo research report, robots will eliminate 200,000 jobs in the banking industry within the next 10 years. This has already adversely impacted highly paid Wall Street professionals, including stock and bond traders. These are the people who used to work on the trading floors at investment banks and trade securities for their banks, clients and themselves. It was a very lucrative profession until algorithms, quant-trading software and programs disrupted the business and rendered their skills unnecessary—compared to the fast-acting technology.

There is no hiding from the robots. Well-trained and experienced doctors will be pushed aside by sophisticated robots that can perform delicate surgeries more precisely and read X-rays more efficiently and accurately to detect cancerous cells that can’t be readily seen by the human eye.

Truck and cab drivers, cashiers, retail sales associates and people who work in manufacturing plants and factories have been and will continue to be replaced by robotics and technology. Driverless vehicles, kiosks in fast-food restaurants and self-serve phone-scan checkout at stores will soon eliminate most minimum-wage and low-skilled jobs.

The rise of artificial intelligence will make even software engineers less sought after. That’s because artificial intelligence will soon write its own software, according to Jack Dorsey, the tech billionaire boss of Twitter and Square. That will put some beginner-level software engineers in a tough spot. When discussing how automation will replace jobs held by humans, Dorsey told Yang on an episode of the Yang Speaks podcast, “We talk a lot about the self-driving trucks and whatnot.” He added, “[AI] is even coming for programming [jobs]. A lot of the goals of machine learning and deep learning is to write the software itself over time, so a lot of entry-level programming jobs will just not be as relevant anymore.”

When management consultants and companies that deploy AI and robotics say we don’t need to worry, we need to be concerned. Companies—whether they are McDonald’s, introducing self-serve kiosks and firing hourly workers to cut costs, or top-tier investment banks that rely on software instead of traders to make million-dollar bets on the stock market—will continue to implement technology and downsize people, in an effort to enhance profits.

This trend has the potential to adversely impact all classes of workers. In light of the study’s spotlight on the dire results of AI, including lost wages and the rapid growth in income inequality, it's time to seriously talk about how AI should be managed before it's too late.

Enterprise ML — Why getting your model to production takes longer than building it

(Source: towardsdatascience.com)  

A Gentle Guide to the complexities of model deployment, and integrating with the enterprise application and data pipeline. What the Data Scientist, Data Engineer, ML Engineer, and ML Ops do, in Plain English.

Ketan Doshi

Jun 28  ·  11 min read


Let’s say we’ve identified a high-impact business problem at our company, built an ML (machine learning) model to tackle it, trained it, and are happy with the prediction results. This was a hard problem to crack that required much research and experimentation. So we’re excited about finally being able to use the model to solve our user’s problem!

However, what we’ll soon discover is that building the model itself is only the tip of the iceberg. The bulk of the hard work to actually put this model into production is still ahead of us. I’ve found that this second stage can take up to 90% of the time and effort for the project.

So what does this stage consist of? And why does it take so much time? That is the focus of this article.

Over several articles, my goal is to explore various facets of an organization’s ML journey as it goes all the way from deploying its first ML model to setting up an agile development and deployment process for rapid experimentation and delivery of ML projects.

In order to understand what needs to be done in the second stage, let’s first see what gets delivered at the end of the first stage.

What does the Model Building and Training phase deliver?

Models are typically built and trained by the Data Science team. When the model is ready, we have model code in Jupyter notebooks along with trained weights.

  • The model is often trained using a static snapshot of the dataset, perhaps in a CSV or Excel file.
  • The snapshot was probably a subset of the full dataset.
  • Training is run on a developer’s local laptop, or perhaps on a VM in the cloud.
In other words, the development of the model is fairly standalone and isolated from the company’s application and data pipelines.
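To make this deliverable concrete, here is a minimal sketch of what such a standalone training script often looks like. It assumes scikit-learn and pandas, a hypothetical CSV snapshot named churn_snapshot.csv, and a binary "churned" label; none of these specifics come from the article.

```python
# Minimal sketch of the typical "model building" deliverable: train on a
# static CSV snapshot and save the fitted weights to disk. The file name,
# columns, and model choice are hypothetical.
import joblib
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

df = pd.read_csv("churn_snapshot.csv")  # static snapshot, not live data
X, y = df.drop(columns=["churned"]), df["churned"]
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42
)

model = LogisticRegression(max_iter=1000)
model.fit(X_train, y_train)
print("holdout accuracy:", model.score(X_test, y_test))

joblib.dump(model, "model_weights.joblib")  # the artifact handed to the next stage
```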

Real-time Inference and Retraining in Production (Image by Author)


What does “Production” mean?

When a model is put into production, it operates in two modes:
  • Real-time Inference — perform online predictions on new input data, on a single sample at a time
  • Retraining — for offline retraining of the model nightly or weekly, with a current refreshed dataset
The requirements and tasks involved for these two modes are quite different. This means that the model gets put into two production environments:
  • A Serving environment for performing Inference and serving predictions
  • A Training environment for retraining


Real-time Inference is what most people would have in mind when they think of “production”. But there are also many use cases that do Batch Inference instead of Real-time.

  • Batch Inference — perform offline predictions nightly or weekly, on a full dataset
Batch Inference and Retraining in Production (Image by Author)


For each of these modes separately, the model now needs to be integrated with the company’s production systems — business application, data pipeline, and deployment infrastructure. Let’s unpack each of these areas to see what they entail.

We’ll start by focusing on Real-time Inference, and after that, we’ll examine the Batch cases (Retraining and Batch Inference). Some of the complexities that come up are unique to ML, but many are standard software engineering challenges.

Inference — Application Integration

A model usually is not an independent entity. It is part of a business application for end users, e.g. a recommender model for an e-commerce site. The model needs to be integrated with the interaction flow and business logic of the application.

The application might get its input from the end user via a UI and pass it to the model. Alternatively, it might get its input from an API endpoint, or from a streaming data system. For instance, a fraud detection algorithm that approves credit card transactions might process transaction input from a Kafka topic.

Similarly, the output of the model gets consumed by the application. It might be presented back to the user in the UI, or the application might use the model’s predictions to make some decisions as part of its business logic.

Inter-process communication between the model and the application needs to be built. For example, we might deploy the model as its own service accessed via an API call. Alternatively, if the application is also written in the same programming language (e.g. Python), it could just make a local function call to the model code.

This work is usually done by the Application Developer working closely with the Data Scientist. As with any integration between modules in a software development project, this requires collaboration to ensure that assumptions about the formats and semantics of the data flowing back and forth are consistent on both sides. We all know the kinds of issues that can crop up, e.g. if the model expects a numeric ‘quantity’ field to be non-negative, will the application do the validation before passing it to the model? Or is the model expected to perform that check? In what format is the application passing dates, and does the model expect the same format?
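As one hedged illustration of the "model as its own service" option, the sketch below exposes the trained model behind a small Flask endpoint and performs the non-negative 'quantity' check discussed above. The endpoint path, field names, and validation split between caller and service are illustrative assumptions.

```python
# Minimal sketch: the model deployed as its own service behind an HTTP API,
# with the kind of input validation the application and model must agree on.
# The endpoint, field names, and rules are hypothetical.
import joblib
from flask import Flask, jsonify, request

app = Flask(__name__)
model = joblib.load("model_weights.joblib")  # artifact from the training stage

@app.route("/predict", methods=["POST"])
def predict():
    payload = request.get_json(force=True)
    quantity = payload.get("quantity")
    # Decide explicitly who validates what: here the service rejects missing
    # or negative quantities instead of trusting the caller.
    if not isinstance(quantity, (int, float)) or quantity < 0:
        return jsonify(error="'quantity' must be a non-negative number"), 400
    # Feature order must match what the model was trained on.
    features = [[quantity, payload.get("price", 0.0)]]
    return jsonify(prediction=int(model.predict(features)[0]))

if __name__ == "__main__":
    app.run(host="0.0.0.0", port=8080)
```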

Real-time Inference Lifecycle (Image by Author)


Inference — Data Integration

The model can no longer rely on a static dataset that contains all the features it needs to make its predictions. It needs to fetch ‘live’ data from the organization’s data stores.

These features might reside in transactional data sources (e.g. a SQL or NoSQL database), or they might be in semi-structured or unstructured datasets like log files or text documents. Perhaps some features are fetched by calling an API, either an internal microservice or application (e.g. SAP) or an external third-party endpoint.

If any of this data isn’t in the right place or in the right format, some ETL (Extract, Transform, Load) jobs may have to be built to pre-fetch the data to the store that the application will use.

Dealing with all the data integration issues can be a major undertaking. For instance:

  • Access requirements — how do you connect to each data source, and what are its security and access control policies?
  • Handle errors — what if the request times out, or the system is down?
  • Match latencies — how long does a query to the data source take, versus how quickly do we need to respond to the user?
  • Sensitive data — is there personally identifiable information that has to be masked or anonymized?
  • Decryption — does data need to be decrypted before the model can use it?
  • Internationalization — can the model handle the necessary character encodings and number/date formats?
  • and many more…
This tooling gets built by a Data Engineer. For this phase as well, they would interact with the Data Scientist to ensure that the assumptions are consistent and the integration goes smoothly, e.g. is the data cleaning and pre-processing done by the model enough, or do any more transformations have to be built?
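To illustrate the "handle errors" and "match latencies" items above, a feature-fetch helper might look like the following sketch. The service URL, timeout budget, and fallback value are assumptions made for illustration, not details from the article.

```python
# Sketch: fetch one 'live' feature from an internal service with a strict
# timeout and a documented fallback, so inference can still respond when the
# source is slow or down. URL, timeout, and default are hypothetical.
import requests

FEATURE_URL = "http://features.internal/api/v1/customers/{cid}/ltv"

def fetch_lifetime_value(customer_id: str, timeout_s: float = 0.2) -> float:
    try:
        resp = requests.get(FEATURE_URL.format(cid=customer_id), timeout=timeout_s)
        resp.raise_for_status()
        return float(resp.json()["lifetime_value"])
    except (requests.RequestException, KeyError, ValueError):
        # Source down, too slow, or malformed: fall back to a neutral default
        # rather than failing the whole prediction request.
        return 0.0
```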

Inference — Deployment


It is now time to deploy the model to the production environment. All the factors that one considers with any software deployment come up:

  • Model Hosting — on a mobile app? In an on-premise data center or on the cloud? On an embedded device?
  • Model Packaging — what dependent software and ML libraries does it need? These are typically different from your regular application libraries.
  • Co-location — will the model be co-located with the application? Or as an external service?
  • Model Configuration settings — how will they be maintained and updated?
  • System resources required — CPU, RAM, disk, and most importantly GPU, since that may need specialized hardware.
  • Non-functional requirements — volume and throughput of request traffic? What is the expected response time and latency?
  • Auto-Scaling — what kind of infrastructure is required to support it?
  • Containerization — does it need to be packaged into a Docker container? How will container orchestration and resource scheduling be done?
  • Security requirements — credentials to be stored, private keys to be managed in order to access data?
  • Cloud Services — if deploying to the cloud, is integration with any cloud services required, e.g. Amazon Web Services (AWS) S3? What about AWS access control privileges?
  • Automated deployment tooling — to provision, deploy and configure the infrastructure and install the software.
  • CI/CD — automated unit or integration tests to integrate with the organization’s CI/CD pipeline (see the smoke-test sketch below).
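For the CI/CD item, a minimal automated smoke test could look like the sketch below, using pytest against the hypothetical Flask service from earlier (assumed to live in a module named serve.py).

```python
# Sketch: a pytest smoke test for the hypothetical /predict endpoint, suitable
# for wiring into a CI/CD pipeline. Assumes the Flask app sketched earlier
# lives in a module named serve.py.
import pytest
from serve import app

@pytest.fixture
def client():
    return app.test_client()

def test_predict_accepts_valid_input(client):
    resp = client.post("/predict", json={"quantity": 3, "price": 9.99})
    assert resp.status_code == 200
    assert "prediction" in resp.get_json()

def test_predict_rejects_negative_quantity(client):
    resp = client.post("/predict", json={"quantity": -1})
    assert resp.status_code == 400
```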
The ML Engineer is responsible for implementing this phase and deploying the application into production. Finally, you’re able to put the application in front of the customer, which is a significant milestone!
However, it is not yet time to sit back and relax 😃. Now begins the ML Ops task of monitoring the application to make sure that it continues to perform optimally in production.

Inference — Monitoring


The goal of monitoring is to check that your model continues to make correct predictions in production, with live customer data, as it did during development. It is quite possible that your metrics will not be as good.

In addition, you need to monitor all the standard DevOps application metrics just like you would for any application — latency, response time, throughput as well as system metrics like CPU utilization, RAM, etc. You would run the normal health checks to ensure uptime and stability of the application.

Equally importantly, monitoring needs to be an ongoing process, because there is every chance that your model’s evaluation metrics will deteriorate with time. Compare your evaluation metrics to past metrics to check that there is no deviation from historical trends.

This can happen because of data drift.
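One common way to detect such drift is a two-sample statistical test that compares live feature values against the training distribution. The sketch below uses SciPy's Kolmogorov-Smirnov test; the significance threshold and the simulated data are purely illustrative.

```python
# Sketch: flag data drift by comparing the live distribution of a feature
# against its distribution at training time. The 0.05 threshold and the
# simulated data are illustrative choices, not universal rules.
import numpy as np
from scipy.stats import ks_2samp

def feature_drifted(train_values, live_values, alpha=0.05) -> bool:
    statistic, p_value = ks_2samp(train_values, live_values)
    return p_value < alpha  # small p-value: the distributions likely differ

rng = np.random.default_rng(0)
train = rng.normal(10, 2, size=5000)   # 'quantity' as seen during training
live = rng.normal(12, 2, size=5000)    # shifted mean simulates drift
print(feature_drifted(train, live))    # True: time to investigate or retrain
```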

Inference — Data Validation


As time goes on, your data will evolve and change — new data sources may get added, new feature values will get collected, new customers will input data with different values than before. This means that the distribution of your data could change.

So validating your model with current data needs to be an ongoing activity. It is not enough to look only at evaluation metrics for the global dataset. You should evaluate metrics for different slices and segments of your data as well. It is very likely that as your business evolves and as customer demographics, preferences, and behavior change, your data segments will also change.
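A minimal sketch of such slice-level evaluation follows, assuming a pandas DataFrame with hypothetical 'segment', 'y_true', and 'y_pred' columns.

```python
# Sketch: evaluate the model per segment instead of only globally, so a
# regression in one slice isn't hidden by a healthy overall average.
# Column names are hypothetical.
import pandas as pd
from sklearn.metrics import accuracy_score

def accuracy_by_segment(df: pd.DataFrame) -> pd.Series:
    return df.groupby("segment").apply(
        lambda g: accuracy_score(g["y_true"], g["y_pred"])
    )

df = pd.DataFrame({
    "segment": ["new", "new", "loyal", "loyal", "loyal"],
    "y_true":  [1, 0, 1, 1, 0],
    "y_pred":  [1, 1, 1, 1, 0],
})
print(accuracy_by_segment(df))  # e.g. the 'new' slice lags the 'loyal' slice
```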

The data assumptions that were made when the model was first built may no longer hold true. To account for this, your model needs to evolve as well. The data cleaning and pre-processing that the model does might also need to be updated.

And that brings us to the second production mode — that of Batch Retraining on a regular basis so that the model continues to learn from fresh data. Let’s look at the tasks required to set up Batch Retraining in production, starting with the development model.


Retraining Lifecycle (Image by Author)


Retraining — Data Integration


When we discussed Data Integration for Inference, it involved fetching a single sample of the latest ‘live’ data. On the other hand, during Retraining, we need to fetch a full dataset of historical data. Also, this Retraining happens in batch mode, say every night or every week.

Historical doesn’t necessarily mean “old and outdated” data — it could include all of the data gathered until yesterday, for instance.

This dataset would typically reside in an organization’s analytics stores, such as a data warehouse or data lake. If some data isn’t present there, you might need to build additional ETL jobs to transfer that data into the warehouse in the required format.
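As a sketch of what such a batch retraining job might look like, the snippet below pulls the refreshed dataset from a warehouse with SQL and writes a date-stamped model artifact. The connection string, table, and column names are assumptions for illustration.

```python
# Sketch: nightly retraining job that pulls the full refreshed dataset from
# the warehouse and saves a date-stamped model artifact. The connection
# string, table, and column names are hypothetical.
from datetime import date

import joblib
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sqlalchemy import create_engine

engine = create_engine("postgresql://ml_jobs@warehouse:5432/analytics")
df = pd.read_sql("SELECT quantity, price, churned FROM training_features", engine)

model = LogisticRegression(max_iter=1000)
model.fit(df[["quantity", "price"]], df["churned"])

# Version the artifact so serving can roll back if the new model regresses.
joblib.dump(model, f"model_weights_{date.today():%Y%m%d}.joblib")
```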


Retraining — Application Integration


Since we’re only retraining the model by itself, the whole application is not involved. So no Application Integration work is needed.


Retraining — Deployment


Retraining is likely to happen with a massive amount of data, probably far larger than what was used during development.

You will need to figure out the hardware infrastructure needed to train the model — what are its GPU and RAM requirements? Since training needs to complete in a reasonable amount of time, it will need to be distributed across many nodes in a cluster, so that training happens in parallel. Each node will need to be provisioned and managed by a Resource Scheduler so that hardware resources can be efficiently allocated to each training process.

The setup will also need to ensure that these large data volumes can be efficiently transferred to all the nodes on which the training is being executed.

And before we wrap up, let’s look at our third production use case — the Batch Inference scenario.

Batch Inference


Often, the Inference does not have to run ‘live’ in real-time for a single data item at a time. There are many use cases for which it can be run as a batch job, where the output results for a large set of data samples are pre-computed and cached.

The pre-computed results can then be used in different ways depending on the use case, e.g.:

  • They could be stored in the data warehouse for reporting or for interactive analysis by business analysts.
  • They could be cached and displayed by the application to the user when they log in next.
  • Or they could be cached and used as input features by another downstream application.
For instance, a model that predicts the likelihood of customer churn (i.e. they stop buying from you) can be run every week or every night. The results could then be used to run a special promotion for all customers who are classified as high-risk. Or they could be presented with an offer when they next visit the site.
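Continuing the churn example, a weekly batch-scoring job might look like the sketch below; the tables, columns, and artifact name are illustrative assumptions.

```python
# Sketch: weekly batch inference for churn. Score every customer offline and
# cache the results in a warehouse table for downstream use (promotions,
# offers). Tables, columns, and the artifact name are hypothetical.
import joblib
import pandas as pd
from sqlalchemy import create_engine

engine = create_engine("postgresql://ml_jobs@warehouse:5432/analytics")
model = joblib.load("model_weights_20210628.joblib")

customers = pd.read_sql("SELECT customer_id, quantity, price FROM customers", engine)
customers["churn_risk"] = model.predict_proba(customers[["quantity", "price"]])[:, 1]

# Pre-computed scores, ready for the promotions team or the web app's cache.
customers[["customer_id", "churn_risk"]].to_sql(
    "churn_scores", engine, if_exists="replace", index=False
)
```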

A Batch Inference model might be deployed as part of a workflow with a network of applications. Each application is executed after its dependencies have completed.

Many of the same application and data integration issues that come up with Real-time Inference also apply here. On the other hand, Batch Inference does not have the same response-time and latency demands. But, it does have high throughput requirements as it deals with enormous data volumes.

Conclusion


As we have just seen, there are many challenges and a significant amount of work to put a model in production. Even after the Data Scientists ready a trained model, there are many roles in an organization that all come together to eventually bring it to your customers and to keep it humming month after month. Only then does the organization truly get the benefit of harnessing machine learning.

We’ve now seen the complexity of building and training a real-world model, and then putting it into production. In the next article, we’ll take a look at how the leading-edge tech companies have addressed these problems to churn out ML applications rapidly and smoothly.
