This article originally appeared on CIO.com. Steven Norton co-authored the piece.
You have heard the hype: Data is the “new oil” that will power next-generation business models and unlock untold efficiencies. For some companies, this vision is realized only in PowerPoint slides. At Western Digital, it is becoming a reality. Led by Steve Phillpott, Chief Information Officer and head of the Digital Analytics Office (DAO), Western Digital is future-proofing its data and analytics capabilities through a flexible platform that collects and processes data in a way that enables a diverse set of stakeholders to realize business value.
As a computer Hard Disk Drive (HDD) manufacturer and data storage company, Western Digital already has tech-savvy stakeholders with an insatiable appetite for leveraging data to drive improvement across product development, manufacturing and global logistics. The nature of the company’s products requires engineers to model out the most efficient designs for new data storage devices, while also managing margins amid competitive market pressures.
Over the past few years, as Western Digital worked to combine three companies into one, an effort that required ensuring both data quality and interoperability, Steve and his team had a clear call to action to develop a data strategy that could:
To achieve these business outcomes, the Western Digital team focused on:
The course of this analytics journey has already shown major returns by enabling the business to improve collaboration and customer satisfaction, accelerate time to insight, improve manufacturing yields, and ultimately save costs.
Driving cultural change management and education
Effective CIOs have to harness organizational enthusiasm to explore the art of the possible while also managing expectations and instilling confidence that the CIO’s recommended course of action is the best one. With any technology trend, the top of the hype cycle brings promise of revolutionary transformation, but the practical course for many organizations is more evolutionary in nature. “Not everything is a machine learning use case,” said Steve, who started by identifying the problems the company was trying to solve before focusing on the solution.
Steve and his team then went on a roadshow to share the company’s current data and analytics capabilities and future opportunities. The team shared the presentation with audiences of varying technical aptitude to explain the ways in which the company could more effectively leverage data and analytics.
Steve recognized that while the appetite to strategically leverage data was strong, there simply were not enough in-house data scientists to achieve the company’s goals. There was also the added challenge of competing with silos of analytics capabilities across various functional groups. Steve’s team would ask, “Could we respond as quickly as the functional analytics teams could?”
To successfully transform Western Digital’s analytics capabilities, Steve had to develop an ecosystem of partners, build out and enable the needed skill sets, and provide scalable tools to unlock the citizen data scientist. He also had to show his tech-savvy business partners that he could accelerate the value to the business units and not become a bureaucratic bottleneck. By implementing the following playbook, Steve noted, “we proved we can often respond faster than the functional analytics teams because we can assemble solutions more dynamically with the analytics capability building blocks.”
Achieving quick wins through incremental value while driving solution to scale
Steve and his team live by the mantra that “success breeds opportunity.” Rather than ask for tens of millions of dollars and inflate expectations, a team in IT called the High-Performance Computing group pursued a quick win to establish credibility. After identifying hundreds of data sources, the team prioritized use cases that met the sweet spot of being solvable while clearly exhibiting incremental value.
For example, the team developed a machine learning application called DefectNet to detect test fail patterns on the media surface of HDDs. Initial test results showed promise of detecting and classifying images by spatial patterns on the media surface. Process engineers then could trace patterns relating to upstream equipment in the manufacturing facility. From the initial idea prototype, the solution was grown incrementally to scale, expanding into use cases in metrology anomaly detection. Now every media surface in production goes through the application for classification, and the solution serves as a platform that is used for image classification applications across multiple factories.
A similar measured approach was taken while developing a digital twin for simulating material movement and dispatching in the factory. An initial solution focused on mimicking material moves within Western Digital’s wafer manufacturing operations. The incremental value realized from smart dispatching created support and momentum to grow the solution through a series of learning cycles. Once again, a narrowly focused prototype became a platform solution that now supports multiple factories. One advantage of this approach: deployment to a new factory reuses 80% of the already developed assets, leaving only 20% for site-specific customization.
Developing a DAO hybrid operating model
After earning credibility that his team could help the organization, Steve established the Digital Analytics Office (DAO), whose mission statement is to “accelerate analytics at scale for faster value realization.” Composed of data scientists, data engineers, business analysts, and subject matter experts, this group sought to provide federated analytics capabilities to the enterprise. The DAO works with business groups, who also have their own data scientists, on specific challenges that are often related to getting analytics capabilities into production, scaling those capabilities, and ensuring they are sustainable.
The DAO works across functions to identify where disparate analytics solutions are being developed for common goals, using different methodologies and achieving varying outcomes. Standardizing on an enterprise-supported methodology and machine learning platform gives business teams faster time to insight and higher value.
To gain further traction, the DAO organized a hackathon that included 90 engineers broken into 23 teams that had three days to mock up a solution for a specific use case. A judging body then graded the presentations, ranked the highest value use cases, and approved funding for the most promising projects.
In addition to using hackathons to generate new demand, business partners can also bring a new idea to the DAO. Those ideas are presented to the analytics steering committee to determine business value, priority and approval for new initiatives. A new initiative then iterates in a “rapid learning cycle” over a series of sprints to demonstrate value back to the steering committee, and a decision is made to sustain or expand funding. This allows Western Digital to place smart bets, focusing on “singles rather than home runs” to maintain momentum.
Building out the data science skill set
“Be prepared and warned: the constraint will be the data scientists, not the technology,” said Steve, who recognized early in Western Digital’s journey that he needed to turn the question of building skills on its head.
The ideal data scientist is driven by curiosity and can ask “what if” questions that look beyond a single dimension or plane of data. They can understand and build algorithms and have subject matter expertise in the business process, so they know where to look for breadcrumbs of insight. Steve found that these unicorns represented only 10% of data scientists in the company, while the other 90% had to be paired with subject matter experts to combine the theoretical expertise with the business process knowledge to solve problems.
While pairing people together was not impossible, it was inefficient. In response, rather than ask how to train or hire more data scientists, Steve asked, “How do we build self-service machine learning capabilities that only require the equivalent of an SQL-like skill set?” Western Digital began exploring Google’s and Amazon’s AutoML capabilities, in which machine learning is used to generate additional machine learning models. The vision is to abstract away the more sophisticated skills involved in developing algorithms so that business process experts can be trained to conduct data science exploration themselves.
Designing and future-proofing technology
Many organizations take the misguided step of formulating a data strategy solely around technology. The limitation of that approach is that companies risk over-engineering solutions with a slow time to value; by the time products are in market, the solution may be obsolete. Steve recognized this risk and guided his team to develop a technology architecture that provides the core building blocks without locking in on a single tool. This fit-for-purpose approach allows Western Digital to future-proof its data and analytics capabilities with a flexible platform. The three core building blocks of this architecture are:
Designing and future-proofing technology: Collecting data
The first step is to be able to collect, store and make data accessible in a way that is tailored to each company’s business model. Western Digital, for example, has significant manufacturing operations that require sub-second latency for on-premises data processing at the edge, while other capabilities can rely on cloud-based storage for the core business. Across this spectrum, Western Digital ingests 80 to 100 trillion data points into its analytics environment daily, with more analytical compute power pushing to the edge. The company also optimizes where it stores data, decoupling the data and technology stack, based on the frequency with which the data must be analyzed. If the data is needed only a few times a year, the best low-cost option is to store it in the cloud. Western Digital’s common data repository spans processes across all production environments and is structured in a way that can be accessed by various types of processing capabilities.
Further, as Western Digital’s use cases became more latency dependent, it was evident that the company required core cloud-based big data capabilities closer to where the data was created. Western Digital wanted to enable its user community by providing a self-service architecture. To do this, the team developed and deployed a Platform as a Service (PaaS) called the Big Data Platform Edge Architecture in Western Digital’s factories, using cloud-native technologies and DevOps best practices.
Future-proofing technology: Process & govern data
With the data primed for analysis, Western Digital offers a suite of tools that allow its organizations to extract, govern, and maintain master data. From open-source Hadoop to massively parallel processing, NoSQL and TensorFlow, data processing capabilities are tailored to the complexity of the use case and the volume, velocity, and variety of data.
While these technologies will evolve over time, the company will continually need to sustain data governance and quality. At Western Digital, everyone is accountable for data quality. To foster that culture, the IT team established a data governance group that identifies, educates and guides data stewards in the execution of data quality delivery. With clear ownership of data assets, the trust and value of data sets is scalable.
Beyond ensuring ownership of data quality, the data governance group also manages platform decisions, such as how to structure the data warehouse, so that the multiple stakeholders are set up for success.
Future-proofing technology: Realize value
Data applied in context transforms numbers and characters into information, knowledge, insight, and ultimately action. In order to realize the value of data in the context of business processes – either looking backward, in real time, or into the future – Western Digital developed four layers of increasingly advanced capabilities:
By codifying the analytical service offerings in this way, business partners can use the right tool for the right job. Rather than tell people exactly what tool to use, the DAO focuses on enabling the fit-for-purpose toolset under the guiding principle that whatever is built should have a clear, secure, and scalable path to launch with the potential for re-use.
This platform reusability dramatically accelerates time to scale and business impact.
Throughout this transformation, Steve Phillpott and the DAO have helped Western Digital evolve its mindset as to how the company can leverage data analytics as a source of competitive advantage. The combination of a federated operating model, new data science tools, and a commitment to data quality and governance has allowed the company to define its own future, focused on solving key business problems no matter how technology trends change.
Asia Miles was established as a loyalty program under Cathay Pacific 20 years ago. Though it is still owned by Cathay, Asia Miles now works with a large number of airlines and a broad ecosystem of other partners to add value to frequent travelers, each with different ideas of what a great loyalty program should yield.
Michael Yung is the Head of Digital Product and Technology at Asia Miles, and over the past five years, he has helped grow the company and continue its evolution into a digital leader in the loyalty space. He explains the evolution of the company from a largely call center-based business to one that serves customers across a wide array of digital formats. He describes the different types of customers Asia Miles serves. Yung also talks about the diverse team he has built. Lastly, he details his team’s creative use of blockchain for marketing campaigns using smart contracts. I caught up with Yung recently at Adobe Summit in Las Vegas, and we covered all of these topics and more.
(To read future interviews like this one, please follow me on Twitter @PeterAHigh.)
Peter High: Please provide a brief overview of Asia Miles’ business.
Michael Yung: Asia Miles is the loyalty rewards program of the Hong Kong-based airline Cathay Pacific Airways. Similar to any loyalty rewards program, our members can earn miles by flying, traveling, shopping, dining, or even by having a mortgage with our banking partners. Our members can redeem their miles for many rewards such as hotel stays or laptops. We set up our program in 1999, so we are celebrating our 20th anniversary this year. Over the 20 years, we have accumulated over 11 million members, we have about 700 partners around the world to serve those members, and we are the leading loyalty program in Asia.
Craig Richardville has been named Senior Vice President and Chief Information Officer of SCL Health, a $2.6 billion faith-based nonprofit healthcare organization based in Denver, Colorado that operates eight care sites and more than 100 physician clinics in addition to home health care, hospice and other facilities.
His responsibilities include leading all aspects of the health system’s information technology strategy and operations, including enterprise systems and applications, information security, core infrastructure and leading the system’s digital transformation and information automation.
“I look forward to working with our executive team, medical group leaders and IT associates to further simplify and modernize our technology ecosystem so that we can be as efficient as possible in advancing patient care through analytics and high-performance business and clinical systems,” Richardville said. “Some key focus areas will include looking at how to apply big data and artificial intelligence technology to help to efficiently and effectively engage patients and staying proactive with cyber security to protect our organization,” he added.
“Given a ten percent chance of a 100 times payoff, you should take that bet every time. But you’re still going to be wrong nine times out of 10.” –Jeff Bezos
Leading organizations like Amazon, Walmart, Uber, Netflix, Google X, Intuit and Instagram have all vigorously embraced the philosophy that rapid experimentation is the most efficient and effective path to meeting customer needs. In an interview with Metis Strategy’s Peter High, entrepreneur Peter Diamandis explains that the most nimble and innovative companies like Uber and Google X “are running over 1,000 experiments per year and are creating a culture that allows for rapid experimentation and constant failure and iteration.”
Traditional strategic planning taught us to study all the pieces on the chess board, develop a multi-year roadmap, and then launch carefully sculpted new products or services. Executives believed that there was only one chance to “get it right,” which often left organizations allowing perfect to be the enemy of the good.
However, in the digital era, decision velocity is more important than perfect planning.
Accelerating decision velocity through experimentation
The most successful organizations cede the hubris of believing they will always be able to perfectly predict customer or user demands, and instead let data—not opinions—guide decision making. The data that informs decision making is derived from a series of experiments that test a hypothesis against a small but representative sample of a broader population.
The experiment should examine three questions:
And then lead to one of three conclusions:
Often, experiments fall into the second category, in which case organizations demonstrate enough viability to iterate on the idea to further hone and enhance the product-market fit. The key is to gain this insight early and course-correct as necessary. It is easy to correct being two degrees off course every ten feet, but being two degrees off course over a mile will cause you to miss your target considerably (+/- 0.35 feet vs. +/- 184 feet).
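The figures in parentheses follow from basic trigonometry. As a quick sanity check (the helper function below is hypothetical, for illustration only):

```python
import math

def drift(angle_deg: float, distance_ft: float) -> float:
    """Lateral drift after traveling `distance_ft` while `angle_deg` off course."""
    return distance_ft * math.tan(math.radians(angle_deg))

print(round(drift(2, 10), 2))    # two degrees over ten feet: ~0.35 ft
print(round(drift(2, 5280), 1))  # two degrees over a mile (5,280 ft): ~184.4 ft
```

The same two-degree error that is trivial at ten feet compounds to a miss of roughly 184 feet over a mile, which is the argument for checking course early and often.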
One simple example: Macy’s was evaluating whether to build a feature that would allow customers to search for a product based on a picture taken with their smartphone. Other competitors had developed something similar, but before Macy’s invested significant sums of money, the retailer wanted to know if the idea was viable.
To test the idea, Macy’s placed a “Visual Product Search” icon on its homepage and monitored the click-through behavior. While Macys.com did not yet have the capability to allow for visual search, tens of thousands of customers clicked through, and Macy’s was able to capture emails of those that wanted to be notified when the feature was ready.
This was enough to begin pursuing the idea further. Yasir Anwar, the former CTO at Macy’s, said teams are “given empowerment to go and test what is best for our customers, to go and run multiple experiments, to test with our customers, (and) come back with the results.”
To accelerate decision velocity, we recommend that all companies develop a framework to create a “Business Experimentation Lab” similar to the likes of Amazon and Walmart. This Business Experimentation Framework (BEF) should outline how people with the right mindset, enabled by technology (though sometimes technology is not necessary), can leverage iterative processes to make more well-informed, yet faster decisions. Doing so frees organizations from entrenched, bureaucratic practices and provides mechanisms for rapidly determining the best option for improving customer experiences out of a list of possibilities.
A Business Experimentation Framework is crucial to:
While nearly every department can introduce some flavor of experimentation into their operating model, a core component and example in eCommerce is A/B testing, or split testing. A/B testing is a way to compare two versions of a single variable, and determine which approach is more effective.
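How does a team decide which of the two versions "won"? A common approach is a two-proportion z-test on the conversion rates of the variants. The sketch below is illustrative only; the function and traffic numbers are assumptions, not any particular retailer's tooling:

```python
import math

def ab_test(conv_a: int, n_a: int, conv_b: int, n_b: int):
    """Two-proportion z-test: is variant B's conversion rate different from A's?"""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)          # pooled rate under the null
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal CDF
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Hypothetical traffic: 400 conversions per 10,000 visitors for A, 470 for B.
z, p = ab_test(400, 10_000, 470, 10_000)
print(f"z = {z:.2f}, p = {p:.4f}")  # p < 0.05 suggests B's lift is real
```

With these made-up numbers, the difference clears the conventional 0.05 significance bar, so a team would ship B; with smaller samples the same observed lift might not, which is why test governance includes minimum sample sizes and run times.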
At a recent meetup at Walmart’s Bay Area office, eCommerce product and test managers discussed the investments, processes, and roles required to sustain A/B testing velocity while ensuring clean, accurate, and controllable experiments. Walmart began its journey toward mass A/B testing with a top-down decree, “What we launch is what we test,” and is now able to run roughly 25 experiments at any given time, having grown the number of tests per year from 70 in 2016 to 253 in 2017.
To enable A/B testing at this velocity and quality, Walmart developed a Test Proposal process that organizes A/B tests and provides metrics for test governance, so teams can quickly make decisions at the end of a test. A Test Proposal defines:
To facilitate the lasting adoption of a Business Experimentation Framework, organizations must staff critical roles like test managers, development engineers, and test analysts. Walmart, for instance, has created the following roles to enable the launch and analysis of 250 tests per year:
Institutionalizing a bias for experimentation is not easy. We have seen several barriers to adopting a Business Experimentation Framework, such as:
Typically, enthusiasm for experimentation gains momentum with one beachhead department. That department develops a test-approval process that is supported by the tools and data necessary to test, analyze, learn, and make accurate go/no-go decisions.
Here is a blueprint for introducing a test-first culture:
If done well, establishing a Business Experimentation Framework will allow organizations to figure out what matters to most customers, within a limited amount of time, for a limited cost, and with a risk-reward tradeoff that will ultimately play to their favor.
As Bezos said, “We all know that if you swing for the fences, you’re going to strike out a lot, but you’re also going to hit some home runs. The difference between baseball and business, however, is that baseball has a truncated outcome distribution. When you swing, no matter how well you connect with the ball, the most runs you can get is four. In business, every once in a while, when you step up to the plate, you can score 1,000 runs. This long-tailed distribution of returns is why it’s important to be bold. Big winners pay for so many experiments.”
10/01/2018
By Peter High. Published on Forbes
Steve Randich has been a CIO many times over at organizations like the Chicago Stock Exchange, the NASDAQ, and with Citibank prior to taking on his current post as CIO of Financial Industry Regulatory Authority, Inc., better known as FINRA, which is a non-governmental organization that regulates brokerage firms and exchange markets. When asked about the evolution of the CIO role, he indicates that he has not seen much evolution, but that may be because he was a strategic leader, driving innovation from the CIO post long before others were presumptuous enough to think to do so.
At FINRA, Randich’s innovations center around leveraging artificial intelligence and machine learning to better surveil markets and broker-dealers. He also has led one of the most dramatic implementations of the public cloud. So extensive is the implementation that Amazon Web Services considers FINRA a best case example of the use of its technology. Randich has become an evangelist of the public cloud, citing it as the single technology across his career that actually gets cheaper as you use it more.
In this interview, he offers insights into all of the above and more.
(To listen to a podcast version of this interview, please visit this link. To read future articles like this one, please follow me on Twitter @PeterAHigh.)
Peter High: Could you give a rundown of FINRA’s operation and your role as the Chief Information Officer?
Steve Randich: FINRA goes back to 1937, when it was known as the National Association of Securities Dealers. After mergers involving other regulatory functions and stock exchanges, the company became known as FINRA ten years ago. FINRA, which employs roughly 3,500 people, focuses on protecting investors by surveilling both the markets and broker-dealer activities. As CIO, I run a 1,100-person organization, which focuses on building surveillance systems that assist our staff in examining firms and regulating the markets.
High: How are the new weapons in the IT arsenal implemented into FINRA’s strategy?
Randich: FINRA processes enormous amounts of data as the company handles upwards of fifty billion transactions on a daily basis, including all of the quotes, orders, and trades collectively across the equity markets in the United States. Additionally, the company looks at the historical data to identify trends over time, which exposes market manipulation, insider trading, and fraud in the markets. Several years ago, FINRA decided to use open-source, big data technologies on public cloud platforms to handle the large amounts of data efficiently and with scale. Today, those efforts are largely completed, and we are now moving into machine learning and advanced analytics. This will enable machines to do more of the surveillance, which allows our surveillance analysts to avoid the work that is better suited for the machines.
To read the full article, please visit Forbes
7/30/2018
Mayank Prakash was a chief information officer multiple times over at private sector companies such as Avaya, Sage UK, and iSOFT. He also was a Managing Director at Morgan Stanley where he was responsible for Global Wealth and Investment Management Technology.
Four years ago, he pivoted dramatically, taking on the role of Chief Information and Digital Officer of the United Kingdom’s Department for Work and Pensions (DWP), which is the largest public sector organization in the country. He has found a motivated workforce up for the challenge of digital transformation in order to enhance the services that the DWP offers. Moreover, he has seen great advantages to being a more mature company inasmuch as there is ample data to run diverse, mature analytics on data sets that span several decades. He has also found that the significant challenges and opportunities that he and his team have been tasked with tackling have been a magnet for talent that is up to the challenge. He indicates that the greatest joy he has taken from this experience has been the opportunity to impact 22 million British citizens’ lives.
Peter High: Could you describe the United Kingdom’s Department for Work and Pensions?
Mayank Prakash: The Department for Work and Pensions is one of the largest organizations in Western Europe, and it is the largest public-sector organization in the United Kingdom. The best way for me to describe the DWP is from the perspective of the 22 million people who drive the change. Everybody in the UK has come across our services in their lives as we touch all citizens. We support children when their parents are separating to ensure that they have a better quality of life. We look after employed people who are of working age to make sure that they have fulfilling lives and that they are working. We look after disabled people to allow them to explore their potential. We look after retired people, who are typically living longer lives. We do all of this in an effort to produce better outcomes for them and for society.
High: Given the diverse array of people with whom you deal, ranging from millennials to older citizens of the United Kingdom, how do you think about the different personas, or the different ways in which your citizenry wishes to interact with the DWP?
Prakash: We work with diversity on every dimension, ranging from age and gender to geographical footprint and social background. Like any large organization with a diverse footprint, we do not employ a one-size-fits-all strategy; instead, we use active segmentation of our customers to make sure we target our services to get the best impact. Additionally, we do not look at these customers as cohorts. Instead, we ask ourselves what problem we are trying to solve. The purpose is to work with people of working age to ensure that they are, in fact, working. That purpose leads to the need to get more people into work, which leads us to why some people may not be working. This leads to active segmentation and better delivery of targeted services.
6/25/18
Lenovo’s Kim Stevenson, Senior Vice President and General Manager of the Data Center Solutions division, has had a variety of roles in information technology. At technology behemoths such as IBM, EDS, and HPE, she worked on internal operations but also gained her first exposure to customers who were technology executives. At Intel, she went from General Manager of IT Operations and Services to Chief Information Officer.
As CIO, she had a strong external customer orientation based on her prior experience. As a result, she quickly gained invitations to join boards. (She has served on the boards of Cloudera and Riverbed Technology, and she currently serves on the board of Boston Private, a wealth management and private banking company.) During her time as CIO, she was in the first class of Forbes CIO Innovation Award winners based on her team’s contribution of more than $1 billion in value to the company through analytics. She would eventually rise to Chief Operating Officer of the Client, IoT and System Architecture Group at Intel.
When she joined Lenovo in March of 2017, she did so with a remarkably rich set of experiences across the technology sector. As a result, she is unusually well connected and highly regarded in the IT community. Now that she serves CIOs as clients again, she sees three things that CEOs and boards expect of CIOs: reimagining customer experience, driving productivity inside the enterprise, and delivering new products and services. She and her team are poised to help CIOs deliver all three, as she notes (among other themes) herein.
To listen to an unabridged audio version of this interview, please click this link.
Peter High: You are now the Senior Vice President and General Manager of the Data Center Solutions division of Lenovo. Could you describe your role and your responsibilities?
Kim Stevenson: At Lenovo, we aligned our Data Center Solutions division with our important customer segments, which is how we run our business. Each of those customer segments then reports up into my organization.
The telco market is at a fundamental inflection point. We want to help drive a new, efficient infrastructure into the telco space. This plays into IoT, which will be all endpoints that are going to be connected in the world. And of course, there are data center implications for having multiple, new types of endpoints connecting into the network.
High: You were a former buyer of technology as the CIO of Intel. You rose beyond that role and became the Chief Operating Officer of the Client, IoT, and Systems Architecture Group. Now you are on the other side of the table as someone delivering to CIOs among others. How do you engage the customer set, and what was the transition like from one side of the table to the other?
Stevenson: Even before I joined Intel, I was with EDS and HP Enterprise Services running IT for customers and selling IT services to the CIO organization. When I moved to Intel to run internal IT, I felt like I was becoming my customer. Having that 360-degree view served me extraordinarily well. There were days when I thought to myself, “Why would anyone try to sell you this? It is just not practical.” There were also days that I felt I could understand more of what was possible from an innovation vector because I had seen many different types of accounts.
This is the next chapter, which is coming back to the business side. Now more than ever, the voice of the CIO in every company is becoming more strategic and more critical to the raw execution of the company. There is no business process in any company today that executes without some form of IT at its core. When I look at the role of the CIO today, I see three things that the board of directors and the CEO expect.
At Lenovo, I focus on helping the IT organization deliver on those three fundamental strategic priorities that exist in every company.
High: Can you expand on the translation of those general ideas to the way in which you are doing that in concert with the business?
4/2/18
By Peter High, published on Forbes
Barry Eggers is the rare venture capitalist in Silicon Valley who grew up in the region. He can recall a time when the area was covered with fruit orchards rather than start-ups. He spent time in the mergers and acquisitions department at Cisco, which had a remarkable track record during his tenure. After a brief stint in private equity, Eggers and a couple of colleagues founded Lightspeed Venture Partners in 2000. As he notes herein, the firm's approach has been to identify key themes around which it can focus its investments.
One area of interest for Eggers is Industry 4.0. The term describes digital transformation in the manufacturing sector, and the moniker suggests that it is the fourth wave of change to manufacturing. It is especially interesting because it brings together a wide array of technology trends, such as data, analytics, human-machine interactions, and digital-physical conversion. Eggers describes this trend and its broader implications in this interview.
Peter High: Industry 4.0 is a topic that you focus on at your firm, Lightspeed Venture Partners. Could you define the term and provide insight into the technology trends that are embedded within it?
Barry Eggers: Broadly speaking, Industry 4.0 is the digitization of the manufacturing sector. The reason it is called Industry 4.0 is that many see it as the fourth industrial revolution: the first was steam, the second was electricity, the third was automation, and the fourth is now digitization. Some people also refer to it as cyber-physical systems in the manufacturing industry. We see this as a huge opportunity.
The second technology is analytics and intelligence. That is where breakthroughs like machine learning and artificial intelligence are taking place.
The third technology is human-machine interaction. Even with automation, humans are not going away. It has become easier for humans to interact with machines through touch interfaces, voice interfaces, and augmented and virtual reality. I think augmented reality will play a huge role here.
The last enabling technology is the digital-physical conversion. This is what allows you to create things quickly, such as multi-material 3D printing. The combination of these technologies is going to change the way manufacturing is done over the next 10-20 years.
High: You not only have deep connections in the start-up community, but you also spend a good deal of time with leaders – CEOs, CTOs, CIOs – of large organizations. How fast do you see the adoption of Industry 4.0 technologies happening, especially among larger organizations?
Eggers: An industrial revolution takes time. You have to look at the number of machines and the amount of equipment that has to be replaced. With the advent of the steam engine, absolutely everything had to be replaced. With the advent of electricity, very little actually had to be replaced because electricity simply substituted for steam. With automation, some estimates say that 80 to 90 percent of machines had to be replaced.
To read the full interview, please visit Forbes.
3/5/18
The consumer packaged goods (CPG) industry is a challenging one for a variety of reasons, one of which is its business-to-business-to-consumer nature. A CPG company must develop a relationship with a customer, a retail middleman that owns the relationship with the ultimate consumer of the product. Clorox is a $6 billion revenue CPG company, and its chief information officer, Manjit Singh, has helped the company work better with both customers and consumers. Singh has built direct relationships with end consumers through better consumer journey mapping exercises that use digital marketing, while facilitating direct shipping enabled by supply chain efficiencies.
At the same time, Singh and his team have gotten involved in new product innovation. An example is the Brita Infinity Pitcher, an Internet of Things product that measures filter usage and automatically orders a replacement from Amazon, so that the consumer knows it is time to change the filter when the replacement arrives. IT developed the Brita Infinity Pitcher in concert with the Research & Development department. Singh covers each of these topics, among others, in this far-ranging interview.
Peter High: You are the Chief Information Officer of The Clorox Company. Could you give an overview of the organization?
Manjit Singh: Clorox is a global consumer packaged goods (CPG) company. We operate in most countries around the world, selling a number of large brands, including Brita Water Filters, Glad Trash Bags, Fresh Step Cat Litter, and our ubiquitous Clorox bleach product. We are also in the personal care business with Burt's Bees. Most recently, we purchased a probiotics business called Renew Life, which is doing well.
High: You have focused on both the internal and external aspects of digital transformation. Could you give us an overview of each of those pathways?
Singh: For the last couple of years, we have been focused on improving our digital marketing capabilities. The IT organization, alongside the Marketing organization, has been busy building a platform and a capability to ensure that we understand our consumer journey. We need to ensure we can interact where and how the customer wants, whether that is on mobile, on the web, or in a store. We are pleased to be one of the digital leaders in the CPG business.
If we want to be at the right place at the right time with the right product for our customer, we must make sure that our supply chain can match those opportunities. Today, we are not geared to do that as well as we would like, because we are used to shipping large truckloads of product to our customers, which may or may not be the way the consumer wants to receive our product.
Though we would love everybody to have a truckload of Clorox products show up at their doorstep, they might just want a single item or a set of items. We need to make sure that our supply chain is ready to adapt to that changing world, which means it must be much more agile than it is today, much more capable of predicting where products need to be, and when they need to be at that location.
10/30/17
Andrew Moore’s career path at Carnegie Mellon has become emblematic of the way the University fosters its star talent. He became a tenured professor at Carnegie Mellon in 2000. In 2006, Moore joined Google, where he was responsible for building a new engineering office. As a vice president of engineering, Moore was responsible for Google Shopping, the company’s retail segment. Moore returned to Carnegie Mellon in 2014 as the Dean of the School of Computer Science. In that role, and given his experience, Moore is among the most influential people in the fields of computer science and artificial intelligence.
In the past, Moore has described the poaching problem that the School of Computer Science has, given its stable of extraordinary talent in fields that are in high demand, such as artificial intelligence, machine learning, and robotics. Of course, he recognizes himself in those professors and students who would choose to follow their passions into lucrative positions in the private sector. The school allows professors to leave and come back in many cases, and it is hiring at higher rates in anticipation of this trend continuing.
In this interview, Moore offers insights into the evolving field of artificial intelligence, the factors likely to determine which companies will win or lose in this space, and what makes Carnegie Mellon specifically, and Pittsburgh more generally, a hotbed for cutting-edge technology.
Peter High: Andrew, you are the Dean of the School of Computer Science at Carnegie Mellon University. Please describe your purview.
Andrew Moore: Carnegie Mellon’s School of Computer Science has a couple of hundred strong faculty members who are working on every aspect of computer technology. We also have a few thousand amazing students. My role as Dean is to make sure the whole organization gets to move forward. I see my role as helping to clear the way for these geniuses to get to do what they want to do.
High: You have said that being at CMU is like being at Hogwarts Academy: when you walk around the School of Computer Science, the College of Engineering, and the university at large, you see a great number of smart people working on a variety of things that will change the technology landscape, and ultimately our lives. What was the origin of Carnegie Mellon’s influence?
Moore: It all comes down to two visionaries, Allen Newell and Herbert Simon. They were two of the four people who, in 1956, took part in the Dartmouth Artificial Intelligence Conference, where they discussed what might be possible with computers in the future. These two gentlemen were in the business school at Carnegie Tech, which later became Carnegie Mellon University. There was, of course, not a computer science school in the 1960s. Newell and Simon used their passion and extreme intellect to speculate and bring together a team of people who looked at, not what computers would do in the next five to 10 years, but what it would mean to live in a world where there are thinking machines. They inspired so many other thinkers through that period that it snowballed over the decades. Today, we have 250 faculty members in the School of Computer Science. They work on everything from the lowest level details of how photons move and how you count them up, to the highest level details of what it means to have an emotional relationship with a talking machine. It was Newell’s and Simon’s initial interest that sparked this and shaped our computer science department.
On the commercial side, through our digital channels, we continue to advance our product suite to allow our commercial customers to have access to the technology portals that house all of the capabilities and resources of the bank that they use. This includes things like wires, accounts receivable transactions, etc. Many of our interactions with our commercial customers also occur through our digital channels. This will continue to evolve as our customers work to become more efficient, and we develop new ways to help them achieve that. This transformation has been underway for some time, but it has accelerated in the last two or three years.