Tech Book of the Month

May 2022 - Play Nice, But Win by Michael Dell and James Kaplan

This month we dive into the history of Dell Computer Corporation, one of the biggest PC and server companies in the world! Michael Dell gives a first-hand perspective on Dell's big successes and failures throughout the years, and on his intense battle with Carl Icahn over the biggest management buyout in history.

Tech Themes

  1. Be a Tinkerer. When he was in seventh grade, Michael Dell begged his parents to buy an Apple II computer (which cost ~$5,000 in today's dollars). Immediately after the computer arrived, he took the entire thing apart to see exactly how the system worked. After diving deep into each component, Dell started attending Apple user groups, and at one of them he met a young and tattered Steve Jobs. Dell began tutoring people on the Apple II's components and how they could get the most out of the machine. When IBM entered the market in 1980 with the 5150 computer, he did the same thing - took it apart and examined the components. He realized that almost everything IBM made came from other companies (not IBM) and that the total value of its components was well below the IBM price tag. From this simple insight, he had a business. He started fixing up computers for local business people in Austin; Dell's machines cost less and delivered more performance. The business got so big ($50k to $80k of revenue per month) that during his freshman year at UT Austin, Dell decided to drop out, much to his parents' dismay. On May 3rd, 1984, Dell incorporated his company and never returned to school.

  2. Lower Prices and Better Service - a Powerful Combination. Dell Computer Corporation was the original DTC business. Rather than selling through big-box retail stores, Dell fulfilled orders via mail request, and when the internet became prominent in the late 90s, Dell started taking orders online. After his insight that the cost of components was significantly lower than the selling price, he flew to the Far East to meet his suppliers, placing bigger and bigger orders and getting better and better prices. This strategy is the classic low-end disruption pattern that we learned about in Clayton Christensen's The Innovator's Dilemma – a lower-priced competitor that offers better service and customizability starts to crush the competition. Christensen noted that the internet itself was a sustaining innovation for Dell, but very disruptive to the market as a whole: "Usually, the technology simply is an enabler of the disruptive business model. For example, is the Internet a disruptive technology? You can't say that. If you bring it to Dell, it's a sustaining technology to what Dell's business model was in 1996. It made their processes work better; it helped them meet Dell's customers' needs at lower cost. But when you bring the very same Internet to Compaq, it is very disruptive [to the company's then dealer-only sales model]. So how do we treat that? We praise [CEO Michael] Dell, and we fire Eckhard Pfeiffer [Compaq's former CEO]. In reality, those two managers are probably equally competent." If competitors lowered prices, Dell could find better components and lower prices further still. Dell's strategy drove many departures from the personal PC market – IBM left, HP acquired Compaq in a deal that proved disastrous for HP, and many others never made it back.

  3. Layoffs, Crises, and Opportunities. Dell IPO'd in 1988 and joined the Fortune 500 in 1991 as it hit $800M in sales for the year. So you would think the company would be humming when it hit $2B in sales in 1993, right? Wrong. Everything was breaking. When a company scales that quickly, it doesn't have time to create processes and systems, and personnel issues began to surface more frequently. As Dell recalls, the head of sales had a drinking problem, and the head of HR had a stripper girlfriend on the payroll. The company was late to market with notebooks, and it had to institute a recall of notebooks that could catch fire in some instances. During that time, Dell hired Bain to do an internal report on how it should change its processes for its new scale – Kevin Rollins of the Bain team knew the business extremely well and thought incredibly strategically. After the Bain assignment, Rollins joined the company as vice-chairman, ultimately becoming CEO in 2004. One of his first recommendations was to cease the experiment of selling through department stores and stay DTC-focused. During the internet bubble, Dell faced another crisis – its stock had risen precipitously for many years, but once the bubble burst, it fell from $50 to $17 a share in a matter of months. The company missed its earnings estimates for five quarters in a row and had to do two layoffs – one of 1,700 people and another of 4,000. During this time, an internal poll showed that 50% of Dell team members would leave if another company paid them the same rate. Dell realized that the values statement he had written in 1988 was no longer resonating and needed updating – he refreshed it and focused the company on its role in the global IT economy. Dell understood that you should never waste a good crisis, and always looked for the opportunity for growth and improvement when things weren't going well.

Business Themes

  1. Carl Icahn and Dell. No one in business represents a corporate nemesis quite like Carl Icahn. Icahn was born in Rockaway, NY, and earned his tuition money at Princeton playing poker against the rich kids. An activist investor, he popularized the field with some big, bold battles against companies in the early 1980s. Icahn got his start in 1968 by purchasing a seat on the New York Stock Exchange; he completed his first major takeover attempt in 1978, and the rest is history. Icahn takes an intense stance against companies, typically around big mergers, acquisitions, or divestitures. He 1) buys up a lot of shares, like 5-10% of a company; 2) accuses the company, and usually the management, of incompetence or a lousy strategy; 3) argues for some action - a sale of a division, a change in management, a special dividend; 4) sues the company in a variety of ways around shareholder negligence; 5) sends letters to shareholders and the company detailing his findings/claims; 6) puts up a new slate of board members at the company; and 7) waits to profit or gets paid to go away (also called greenmail). Icahn used these exact tactics when he took on Michael Dell: he issued several scathing letters criticizing the company's poor performance, highlighting Michael Dell's obvious conflicts of interest as CEO, and demanding that the special committee evaluate the deal fairly. Icahn normally makes money when he gets involved - he is essentially a gnat that doesn't go away until he profits one way or another. After the fight, Icahn still walked away with tens of millions in profit, and his feud with Dell was just beginning.

  2. Take-Privates and Transformation. Michael Dell had thought a couple of times about taking the company private when he was approached by Egon Durban of Silver Lake Partners, a large tech private equity firm. Dell and Durban went on a walk in Hawaii and worked out what a transaction might look like. The issue with Dell at that time was that the PC market was under siege: people thought tablets were the future, and the PC market's declining volumes seemed to confirm it. Dell had spent $14B on an acquisition spree, acquiring a string of enterprise software companies, including Quest Software, SonicWall, Boomi, Secureworks, and more, as it redirected its strategy. But these businesses had yet to kick into gear, and most of Dell's revenue still came from PCs and servers. The stock price had fallen about 45% since Michael Dell had rejoined as CEO in 2007. Now seemed like a great time to go private - the company needed to transform, and fast, and enacting a transformation in the public markets is tough because Wall Street focuses on quarter-to-quarter metrics over long-term vision. Dell first considered the idea in June 2012 when talking with the then-largest shareholder, Southeastern Asset Management. After letting the idea percolate, he held discussions with Silver Lake and KKR. Silver Lake and Dell submitted a bid at $12.70, then $12.90, then $13.25, then $13.60, then $13.65. On February 4th, 2013, the special committee accepted Silver Lake's offer. On March 5th, Carl Icahn entered the fray, disclosing that he owned about $1B of shares. Icahn submitted a partial proposal suggesting the company pay a one-time special dividend; he would acquire a substantial part of the stock, and the company would remain public under different leadership.
On July 18th, the special committee delayed a vote on the acquisition because it became clear that Dell couldn't get enough of the "majority of the minority" votes needed to close the deal. A few weeks later, Silver Lake and Dell raised their bid to $13.75 (the committee's original asking price), and the committee agreed to remove the voting standard, allowing the Silver Lake/Dell combination to win the deal. After various lawsuits, Icahn gave up in September 2013, when it became clear he had no strategy for winning shareholders to his side. It was an absolute whirlwind of a deal process, and Dell escaped with his company.

  3. Big Deals. After Dell went private, Michael Dell and Egon Durban started scouring the world for enticing tech acquisitions. They closed a small $1.4B storage acquisition, which reaffirmed Michael Dell's interest in the storage market. After the deal, Dell reconsidered something that almost happened in 2008/09 – a merger with EMC. EMC was the premier enterprise storage company, with a dominant market share. On top of that, EMC owned VMware, a software company that had successfully virtualized the x86 architecture so servers could run multiple operating systems simultaneously. Throughout 2008 and 2009, Dell and EMC had deeply considered a merger – to the point that their boards held joint discussions about integration plans and deal price. The boards scrapped the deal during the financial crisis, and in the ensuing years, EMC grew and grew. By 2014 it was a $59B public company and the largest company in Massachusetts. In mid-2014, Dell started to reconsider the idea, pondering the strategic and competitive implications of the deal everywhere he went. Little did he know that he was already late to the party – it later came out that both HP and Cisco had looked at acquiring EMC in 2013. HP got down to the wire, with the deal championed by Meg Whitman as a way to move past the Autonomy debacle and boardroom infighting. HP had a handshake agreement to merge with EMC in a 1:1 deal, but at the last minute HP re-traded and demanded a more advantageous split (i.e., HP would own 55% of the combined company), and EMC said no. When EMC then turned to Dell, Whitman slammed the deal. While the only remaining suitor of size was Dell, there was still a question of how it could finance the deal, especially as a private company.
Dell's ultimate package was a pretty wild mix of considerations: Dell issued a tracking stock tied to its interest in VMware, took out some $40B in loans against its newly acquired VMware equity and the cash flow of Dell's underlying business, and Michael Dell and Silver Lake put in an additional $5B of equity capital. After Silver Lake and Dell determined the financing structure, Dell faced a grueling interrogation session in front of the EMC board as final approval for the deal. The deal was announced on October 12th, 2015, and closed a year later. By all measures, the deal was a success – the company has undergone a complete transformation, shedding some acquired assets, spinning off VMware, and going public again by acquiring its own tracking stock. Michael Dell took some huge risks - taking his company private and completing the biggest tech merger in history - and it seems to have paid off handsomely.
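The capital stack described above is easier to follow as back-of-the-envelope arithmetic. The sketch below uses only the figures named in the text; the tracking-stock proceeds and balance-sheet cash that covered the remainder of the purchase price are deliberately left out:

```python
# Rough Dell-EMC financing components mentioned above, in $B.
# (Only the figures the text names; the tracking stock and cash on
# hand covered the rest of the purchase price.)
debt = 40      # loans against VMware equity and Dell's cash flow
equity = 5     # additional equity from Michael Dell and Silver Lake

committed = debt + equity
print(f"Debt + sponsor equity: ${committed}B")
```

Even this partial tally shows why the deal was unprecedented for a private company: the debt alone was roughly eight times the sponsors' fresh equity.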

Dig Deeper

  • Michael Dell, Dell Technologies | Dell Technologies World 2022

  • Steve Jobs hammers Michael Dell (1997)

  • Michael Dell interview - 7/23/1991

  • Background of the Merger - the full SEC timeline of the EMC-Dell Merger

  • Carl Icahn's First Ever Interview | 1985

tags: Michael Dell, Dell, Carl Icahn, Apple, Steve Jobs, HP, Cisco, Meg Whitman, IBM, Austin, DTC, Clayton Christensen, Innovator's Dilemma, Compaq, Kevin Rollins, Bain, Internet History, Activist, Silver Lake, Quest Software, SonicWall, Secureworks, Egon Durban, KKR, Southeastern Asset Management, EMC, Joe Tucci, VMware
categories: Non-Fiction
 

July 2020 - Innovator's Dilemma by Clayton Christensen

This month we review the technology classic, the Innovator’s Dilemma, by Clayton Christensen. The book attempts to answer the age-old question: why do dominant companies eventually fail?

Tech Themes

  1. The Actual Definition of Disruptive Technology. Disruption is a term that is frequently thrown around in Silicon Valley circles. Every startup thinks its technology is disruptive, meaning it changes how the customer currently performs a task or service. The actual definition, discussed in detail throughout the book, is much more specific. Christensen re-emphasized the distinction in a 2015 Harvard Business Review article: "Specifically, as incumbents focus on improving their products and services for their most demanding (and usually most profitable) customers, they exceed the needs of some segments and ignore the needs of others. Entrants that prove disruptive begin by successfully targeting those overlooked segments, gaining a foothold by delivering more-suitable functionality—frequently at a lower price. Incumbents, chasing higher profitability in more-demanding segments, tend not to respond vigorously. Entrants then move upmarket, delivering the performance that incumbents' mainstream customers require, while preserving the advantages that drove their early success. When mainstream customers start adopting the entrants' offerings in volume, disruption has occurred." The book posits that there are generally two types of innovation: sustaining and disruptive. While disruptive innovation focuses on low-end or new, small-market entry, sustaining innovation merely continues markets along their already determined axes. For example, Christensen discusses the disk drive industry, mapping out the jumps that packed more memory and power into each subsequent product release. For each disruptive jump, there is a slew of sustaining jumps that improve product performance for existing customers but don't necessarily turn non-customers into customers. It is only when new use cases emerge – like rugged, portable disk usage in early PCs – that disruption occurs.
Understanding the specific definition can help companies and individuals better navigate muddled tech messaging; Uber, for example, is shown to be a sustaining technology because its market already existed, and the company didn't offer lower prices or a new business model. Understanding the intricacies of the definition can help incumbents spot disruptive competitors.

  2. Value Networks. Value networks are an underappreciated and somewhat confusing topic covered in The Innovator's Dilemma's early chapters. A value network is defined as "The context within which a firm identifies and responds to customers' needs, solves problems, procures input, reacts to competitors, and strives for profit." On the surface, a value network seems all-encompassing. In reality, it serves to simplify the lens through which an organization must make complex decisions every day. Shown as a nested product architecture, a value network attempts to show where a company's product interacts with other products. By distilling the product down to its most atomic components (literally, computer hardware), we can see all of the considerations that impact a business, and with that holistic view we can weigh the decisions and tradeoffs that face an organization every day. The takeaway is that organizations care about different levels of performance for different products. For example, when looking at cloud computing services at AWS, Azure, or GCP, we see Amazon EC2 instances, Azure VMs, and Google Cloud VMs with different operating systems, different purposes (general, compute, memory), and different sizes. General-purpose might be fine for basic enterprise applications, while gaming applications might need compute-optimized VMs, and real-time big data analytics may need memory-optimized ones. Though it gets somewhat forgotten throughout the book, this point means that an organization focused on producing only compute-intensive machines may not be the best fit for memory-intensive workloads, because its customers may have no use for them. In the book's example, some customers of bigger-memory providers looked at smaller-memory drives and said there was no need; in reality, there was massive demand for smaller-memory disks in the rugged, portable market.
When approaching disruptive innovation, it's essential to recognize your organization's current value network so that you don't target new technologies at those who don't need it.
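The cloud example above can be made concrete as a small lookup table from workload type to instance family. The family names below (m5, Fsv2, c2, etc.) are illustrative picks from each provider's lineup, not an authoritative or complete catalog:

```python
# Illustrative mapping of workload type -> VM family per cloud provider.
# Family names are example members of each vendor's categories only.
VM_FAMILIES = {
    "general": {"aws": "m5", "azure": "Dsv3", "gcp": "n2"},
    "compute": {"aws": "c5", "azure": "Fsv2", "gcp": "c2"},
    "memory":  {"aws": "r5", "azure": "Esv3", "gcp": "m2"},
}

def pick_family(workload: str, cloud: str) -> str:
    """Return the VM family a given workload would typically target."""
    return VM_FAMILIES[workload][cloud]

# e.g. real-time big data analytics lands on a memory-optimized family
print(pick_family("memory", "aws"))
```

A vendor whose value network only ever demanded the "compute" row would have no obvious customer for the "memory" column, which is exactly the blind spot the book describes.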

  3. Product Commoditization. Christensen spends a lot of time describing the dynamics of the disk drive industry, where companies continually supplied increasingly smaller drives with better performance. His description of commoditization is very interesting: "A product becomes a commodity within a specific market segment when the repeated changes in the basis of competition, completely play themselves out, that is, when market needs on each attribute or dimension of performance have been fully satisfied by more than one available product." At this point, products begin competing primarily on price. In the disk drive industry, companies first competed on capacity, then on size, then on reliability, and finally on price. This price war is reminiscent of the current state of the Continuous Integration / Continuous Deployment (CI/CD) market, a subsegment of DevOps software. Companies in the space, including GitHub, CircleCI, GitLab, and others, are now competing primarily on price to win new business. Each of the cloud providers has similar technologies native to its public cloud offering (AWS CodePipeline and CloudFormation, GitHub Actions, Google Cloud Build), and they can give these away for free because of their scale. The building block of CI/CD software is git, the open-source version control system created by Linux creator Linus Torvalds. With all the providers leveraging a massive open-source project, there is little room for true differentiation. Christensen even says: "It may, in fact, be the case that the product offerings of competitors in a market continue to be differentiated from each other. But differentiation loses its meaning when the features and functionality have exceeded what the market demands." Only time will tell whether these companies can pivot into burgeoning, highly differentiated technologies.
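Since git is named as the shared building block, it's worth seeing the one primitive everything else sits on: content addressing. Git stores every file ("blob") under the SHA-1 of a short header plus the file's bytes, which is part of why every vendor's product interoperates. A minimal Python sketch of what `git hash-object` computes:

```python
import hashlib

def git_blob_hash(content: bytes) -> str:
    """Compute the ID git assigns to a file's contents (a 'blob').

    Git hashes a header ("blob <size>\\0") followed by the raw bytes;
    the resulting SHA-1 names the object in git's content-addressed store.
    """
    header = b"blob %d\x00" % len(content)
    return hashlib.sha1(header + content).hexdigest()

# Matches `printf 'hello\n' | git hash-object --stdin`
print(git_blob_hash(b"hello\n"))  # ce013625030ba8dba906f756967f9e9ca394464a
```

With the storage model standardized and open, vendors can only differentiate on the layers above it, which is exactly the commoditization dynamic Christensen describes.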

Business Themes

  1. Resources-Processes-Values (RPV) Framework. The RPV framework is a powerful lens for understanding the challenges that large businesses face. Companies have resources (people, assets, technology, product designs, brands, information, cash, relationships with customers, etc.) that can be transformed into products and services of greater value. The ways organizations go about converting these resources are the organization's processes. These processes can be formal (documented sales strategies, for example) or informal (culture and habitual routines). Processes are a big reason organizations struggle to deal with emerging technologies: because culture and habit are ingrained in the organization, the same process used to launch products into a mature, slow-growing market may be applied to a fast-growing, dynamic sector. Christensen puts it best: "This means the very mechanisms through which organizations create value are intrinsically inimical to change." Lastly, companies have values, or "the standards by which employees make prioritization decisions." When there is a mismatch between the resources, processes, and values of an organization and the product or market that the organization is chasing, it's rare that the business can compete successfully in the disruptive market. To see this misalignment in action, Christensen describes a meeting with a CEO who had identified the disruptive change happening in the disk drive market and had gotten a product to market to meet the growing demand. In response to a publication showing the fast growth of the market, the CEO lamented to Christensen: "I know that's what they think, but they're wrong. There isn't a market. We've had that drive in our catalog for 18 months. Everyone knows we've got it, but nobody wants it." The issue was not the product or market demand, but the organization's values.
As Christensen continues, "But among the employees, there was nothing about an $80 million, low-end market that solved the growth and profit problems of a multi-billion dollar company – especially when capable competitors were doing all they could to steal away the customers providing those billions. And way at the other end of the company there was nothing about supplying prototype quantities of 1.8-inch drives to an automaker that solved the problem of meeting the 1994 quotas of salespeople whose contacts and expertise were based so solidly in the computer industry." The CEO cared about the product, but his team did not. The RPV framework helps in evaluating large companies and the challenges they face in launching new products.
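One way to internalize the misalignment Christensen describes is as a three-part checklist: a new market only "fits" when resources, processes, and values all align. The toy sketch below is my own illustration of the framework, not something from the book:

```python
from dataclasses import dataclass

@dataclass
class Opportunity:
    have_resources: bool  # people, cash, and technology to serve the market
    processes_fit: bool   # existing routines work at the new market's pace
    values_fit: bool      # the market clears the org's margin/size bar

def rpv_verdict(o: Opportunity) -> str:
    """All three must align; values are the usual silent killer."""
    if o.have_resources and o.processes_fit and o.values_fit:
        return "pursue in-house"
    if not o.values_fit:
        return "spin out an independent organization"
    return "acquire or rebuild processes"

# The 1.8-inch drive case: the CEO had the product and the engineers,
# but an $80M market failed the multi-billion-dollar company's values test.
print(rpv_verdict(Opportunity(True, True, False)))
```

In the disk-drive anecdote the verdict is "spin out an independent organization" — which is essentially the third of Christensen's three remedies discussed next.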

  2. How to Manage Through Technological Change. Christensen points out three primary ways of managing through disruptive technology change: 1. "Acquire a different organization whose processes and values are a close match with the new task." 2. "Try to change the processes and values of the current organization." 3. "Separate out an independent organization and develop within it the new processes and values that are required to solve the new problem." Acquisitions are a way to get out ahead of disruptive change. There are many examples, but two recent ones come to mind: Microsoft's acquisition of GitHub and Facebook's acquisition of Instagram. Microsoft paid a whopping $7.5B for GitHub in 2018, when GitHub was rumored to be at roughly $200M in revenue (a 37.5x revenue multiple!). GitHub was undoubtedly a mature business with a great product, but it didn't have a ton of enterprise adoption. Diane Greene at Google Cloud tried to get Sundar Pichai to outbid Microsoft, but he said no. GitHub has changed Azure's position within the market and continued Microsoft's anti-Amazon strategy of pushing open-source technology. In contrast to the GitHub acquisition, Instagram had only 13 employees when it was acquired for $1B. Zuckerberg saw the threat the social network represented to Facebook, and today the acquisition is regularly touted as one of the best ever. Instagram was developing a social network based solely on photographs, right at the time every person suddenly had an excellent smartphone camera in their pocket; the acquisition occurred right as the market was ballooning, and Facebook capitalized on that growth. The second way of managing technological change is by changing cultural norms. This is rarely successful, because you are fighting against all of the processes and values deeply embedded in the organization.
Indra Nooyi cited a desire to move faster on culture as one of her biggest regrets as a young executive: "I’d say I was a little too respectful of the heritage and culture [of PepsiCo]. You’ve got to make a break with the past. I was more patient than I should’ve been. When you know you have to make a change, at some point you have to say enough is enough. The people who have been in the company for 20-30 years pull you down. If I had to do it all over again, I might have hastened the pace of change even more." Lastly, Christensen prescribes creating an independent organization matched to the resources, processes, and values that the new market requires. Three great examples, each a different flavor of the spin-out/spin-in approach, come to mind. First, Cisco developed a spin-in practice whereby it would take members of its organization and fund a new company around them to develop a new product. The spin-ins worked for a time but caused major cultural issues. Second, as we've discussed, one of the key reasons AWS was born was that Chris Pinkham was in South Africa, thousands of miles away from Amazon corporate in Seattle; this distance, and that team's focus, allowed it to come up with a major advance in computing. Lastly, Mastercard started Mastercard Labs a few years ago. CEO Ajay Banga told the team: "I need two commercial products in three years." He doesn't tell his CFO the Labs' budget, and he is the only person from his executive team who interacts with the business. This separation of resources, processes, and values allows these smaller organizations to be more nimble in finding emerging technology products and markets.

  3. Discovering Emerging Markets.

    The resources-processes-values framework can also show us why established firms fail to address emerging markets. Established companies rely on formal budgeting and forecasting processes whereby resources are allocated based on market estimates and revenue forecasts. Christensen highlights several important factors for tackling emerging markets, including a focus on ideas, failure, and learning. Underpinning all of these ideas is the impossibility of predicting the scale and growth rate of disruptive technologies: "Experts' forecasts will always be wrong. It is simply impossible to predict with any useful degree of precision how disruptive products will be used or how large their markets will be." Because of this challenge, relying too heavily on such estimates to underpin financial projections can cause businesses to view initial market development as a failure, or as not worthy of the company's time. When HP launched a new 1.3-inch disk drive, which could be embedded in PDAs, the company mandated that its revenues scale to $150M within three years, in line with market estimates. That market never materialized, and the initiative was abandoned as a failed investment. Christensen argues that because disruptive technologies are threats, planning has to come after action, and thus strategic and financial planning must be discovery-based rather than execution-based. Companies should focus on learning their customers' needs and the right business model to attack the problem, rather than on executing their initial vision. As he puts it: "Research has shown, in fact, that the vast majority of successful new business ventures abandoned their original business strategies when they began implementing their initial plans and learned what would and would not work." One big fan of Christensen's work is Jeff Bezos, and it's easy to see why, given Amazon's focus on releasing new products in this discovery-driven manner.
The pace of product releases is simply staggering (almost one per day). Bezos even talked about this exact issue in his 2016 shareholder letter: "The senior team at Amazon is determined to keep our decision-making velocity high. Speed matters in business – plus a high-velocity decision making environment is more fun too. We don't know all the answers, but here are some thoughts. First, never use a one-size-fits-all decision-making process. Many decisions are reversible, two-way doors. Those decisions can use a light-weight process. For those, so what if you're wrong? I wrote about this in more detail in last year's letter. Second, most decisions should probably be made with somewhere around 70% of the information you wish you had. If you wait for 90%, in most cases, you're probably being slow." Amazon is one of the first large organizations to truly embrace this decision-making style, and the results clearly speak for themselves.

Dig Deeper

  • What Jeff Bezos Tells His Executives To Read

  • Github Cuts Subscription Price by More Than Half

  • Ajay Banga Opening Address at MasterCard Innovation Forum 2014

  • Clayton Christensen Describing Disruptive Innovation

  • Why Cisco’s Spin-Ins Never Caught On

tags: Amazon, Google Cloud, Microsoft, Azure, Github, Gitlab, CircleCI, Pepsi, Jeff Bezos, Indra Nooyi, Mastercard, Ajay Banga, HP, Uber, RPV, Facebook, Instagram, Cisco, batch2
categories: Non-Fiction
 

December 2019 - The Moon is a Harsh Mistress by Robert A. Heinlein

This futuristic, anti-establishment thriller is one of Elon Musk’s favorite books. While Heinlein’s novel can drag on with little action, The Moon is a Harsh Mistress presents an interesting war story and predicts several technological revolutions.

Tech Themes

  1. Mike, the Self-Aware Computer, and IBM. Mycroft Holmes, Heinlein’s self-aware, artificially intelligent computer, is a friendly, funny, and focused companion to Manny, Wyoh, and Prof throughout the novel. Mike’s massive hardware construction is analogous to the way companies are approaching Artificial Intelligence today. Mike’s AI is most closely related to Artificial General Intelligence, which imagines a machine that can go beyond the standard Turing Test, with further abilities to plan, learn, communicate in natural language, and act on objects. The 1960s were filled with predictions of futuristic robots and machines. These ideas were popularized not only in books like The Moon is a Harsh Mistress but also in films like 2001: A Space Odyssey, where the intelligent computer, HAL 9000, attempts to overthrow the crew. In 1965, Herbert Simon, a Nobel Prize winner, exclaimed: “machines will be capable, within twenty years, of doing any work a man can do.” As surprising as it may seem today, the dominant technology company of the 1960s was IBM, known for its System/360 model. Heinlein even mentions Thomas Watson and IBM at Mike’s introduction: “Mike was not official name; I had nicknamed him for Mycroft Holmes, in a story written by Dr. Watson before he founded IBM. This story character would just sit and think--and that's what Mike did. Mike was a fair dinkum thinkum, sharpest computer you'll ever meet.” Mike’s construction is similar to that of present-day IBM Watson, the computer that was able to win Jeopardy! but has struggled to gain traction in the market. IBM and Heinlein approached computer development in a similar way: Heinlein foresaw a massive computer with tons of hardware linked into it: “They kept hooking hardware into him--decision-action boxes to let him boss other computers, bank on bank of additional memories, more banks of associational neural nets, another tubful of twelve-digit random numbers, a greatly augmented temporary memory.
Human brain has around ten-to-the-tenth neurons. By third year Mike had better than one and a half times that number of neuristors.” This is the classic IBM approach – leverage all of the hardware possible and create a massive database of queryable information. This works well for information retrieval tasks like Jeopardy!, but stumbles precariously on new information and a lack of data, which is why IBM has struggled with Watson applications to date.
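Heinlein's sizing claim is simple arithmetic to check: ten-to-the-tenth human neurons versus "better than one and a half times that number" of neuristors:

```python
human_neurons = 10 ** 10                  # "ten-to-the-tenth neurons"
mike_neuristors = 1.5 * human_neurons     # "one and a half times that number"
print(f"{mike_neuristors:.1e} neuristors")
```

Modern estimates put the human brain closer to 8.6 x 10^10 neurons, so Heinlein's figure was an order-of-magnitude guess, but the scale-up-the-hardware instinct is the same one IBM followed.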

  2. Artificial General Intelligence. Mike is clearly equipped with artificial general intelligence (AGI); he has the ability to securely communicate in plain language, retrieve any of the world’s information, see via cameras, and hear via microphones. As discussed above, Heinlein’s construction of Mike is clearly hardware-focused, which makes sense considering the book was published in the sixties, before software was considered important. In contrast, today AGI is primarily addressed from an algorithmic, software angle. One of the leading research institutions (excluding the massive tech companies) is OpenAI, an organization whose mission is: “To ensure that artificial general intelligence (AGI)—by which we mean highly autonomous systems that outperform humans at most economically valuable work—benefits all of humanity.” OpenAI was started by several people, including Elon Musk and Sam Altman, president of Y Combinator, a famous startup incubator based in Silicon Valley. OpenAI recently raised $1 billion from Microsoft to pursue its research and is likely making the most progress when it comes to AGI. The organization has released numerous modules that allow developers to explore the wide-ranging capabilities of AI, from music creation to color modulation. But software alone is not going to be enough to achieve full AGI. OpenAI has acknowledged that the largest machine learning training runs have relied on increasingly more hardware: “Of course, the use of massive compute sometimes just exposes the shortcomings of our current algorithms.” As we discussed before, companies are building their own hardware for this purpose, and the degradation of Moore’s Law poses a serious threat to achieving full Artificial General Intelligence.

  3. Deep Learning, Adam Selene, and Deep Fakes. Heinlein successfully predicted a machine’s ability to create novel images. As the group plans to take the rebellion public, Mike is able to create a depiction of Adam Selene that can appear on television and be the face of the revolution: “We waited in silence. Then screen showed neutral gray with a hint of scan lines. Went black again, then a faint light filled middle and congealed into cloudy areas light and dark, ellipsoid. Not a face, but suggestion of face that one sees in cloud patterns covering Terra. It cleared a little and reminded me of pictures alleged to be ectoplasm. A ghost of a face. Suddenly firmed and we saw "Adam Selene." Was a still picture of a mature man. No background, just a face as if trimmed out of a print. Yet was, to me, "Adam Selene." Could not be anybody else.” Image generation and manipulation have long been hot topics among AI researchers. The research frequently leverages a technique called deep learning, which builds on classical artificial neural networks. A landmark 2012 paper from University of Toronto student Ilya Sutskever, who went on to co-found OpenAI, applied deep learning to the problem of image classification with incredible success. Deep learning and computer vision have been inseparable ever since. One line of research focuses on a video-based face-superimposition technique called deep fakes, which became popular earlier this year. As shown here, these videos essentially merge existing images and footage with a changing facial structure, which is remarkable and scary at the same time. Deep fakes are gaining so much attention that even the government is focused on learning more about them. Heinlein was early to the game, imagining a computer could create a novel image. I can only imagine how he’d feel about deep fakes.
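To make "deep learning" concrete: at its core it means stacking layers of simple units and training them with gradient descent via backpropagation. The toy sketch below (plain NumPy, not the convolutional network from the 2012 ImageNet paper, and all figures are illustrative) trains a tiny two-layer network on XOR, the classic task a single linear layer cannot solve:

```python
import numpy as np

rng = np.random.default_rng(0)

# XOR truth table: inputs and targets.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# A 2 -> 8 -> 1 network: one hidden layer is already "deeper"
# than a single linear classifier, which is the whole point.
W1 = rng.normal(scale=1.0, size=(2, 8))
b1 = np.zeros((1, 8))
W2 = rng.normal(scale=1.0, size=(8, 1))
b2 = np.zeros((1, 1))

lr = 0.5
for _ in range(20000):
    # Forward pass through both layers.
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)
    # Backward pass: squared-error gradient pushed through each layer.
    d_out = (out - y) * out * (1 - out)
    d_h = (d_out @ W2.T) * h * (1 - h)
    W2 -= lr * (h.T @ d_out)
    b2 -= lr * d_out.sum(axis=0, keepdims=True)
    W1 -= lr * (X.T @ d_h)
    b1 -= lr * d_h.sum(axis=0, keepdims=True)

print((out > 0.5).astype(int).ravel())
```

Modern image classifiers are the same recipe at vastly larger scale: many more layers, convolutional structure, and millions of labeled images instead of four rows.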

Business Themes

  1. Video Conferencing. Manny and the rest of the members of the revolution communicate through encrypted phone conversations and video conferences. While this was certainly ahead of its time, video conferencing was first imagined in the late 1800s. Despite a clear demand for the technology, it arguably took until the late 2000s to reach a point where mass video communication was easily accessible for businesses (Zoom Video) and individuals (FaceTime, Skype, etc.). The industry has constantly evolved, and there are platforms today that offer both secure chat and video, such as Microsoft Teams and Cisco Webex. The entire industry is a lesson in execution. The idea was dreamed up long ago, but it took hundreds of years and multiple product iterations to get to a de-facto standard in the market. Microsoft purchased Skype in 2011 for $8.5B, the same year that Eric Yuan founded Zoom. This wasn’t Microsoft’s first inroad into video either: in 2003, Microsoft bought PlaceWare and was supposed to overtake the market. But it didn’t, and Webex continued to be a major industry player before being acquired by Cisco. Over time Skype’s popularity has waned, and now Microsoft Teams has a fully functioning video platform separate from Skype – something that Webex did years ago. Markets are constantly evolving, and it’s important to see what has worked well. Skype and Zoom both succeeded by appealing to free users: Skype initially focused on free consumers, and Zoom focused on free users within businesses. Webex has always been enterprise-focused, but it had to be, because bandwidth costs were too high to support a free video platform. Teams will go to market as a next-generation alternative to, or augmentation of, Outlook; it will be interesting to see what happens going forward.

  2. Privacy and Secure Communication. As part of the revolution’s communication, a secure, isolated message system is created whereby not only are conversations fully encrypted and undetectable by authorities, but individuals are also unable to speak with more than two others in their revolution tree. Today, there are significant tensions around secure communication – people want it, yet they are also wary of it. Facebook has declared that it will implement end-to-end encryption despite warnings from the government not to do so. Other mobile applications like Telegram and Signal promote secure messaging and are frequently used by reporters for anonymous tips. While encryption is beneficial for those messaging, it does raise questions about who has access to what information. Should a company have access to secure messages? Should the government? Apple has long stood firm on privacy, but has had its own missteps. This is a difficult question, and any solution must be well thought out, taking into account the unintended consequences of sweeping regulation in any direction.
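The core mechanic of any such system is that only holders of a shared secret can read the message. A minimal illustration (a one-time pad using only the Python standard library; the message is hypothetical, and this is a teaching sketch, not production cryptography) shows the encrypt/decrypt symmetry:

```python
import secrets

def xor_bytes(data: bytes, pad: bytes) -> bytes:
    # XOR each message byte with the corresponding key byte.
    return bytes(a ^ b for a, b in zip(data, pad))

message = b"meet at the old dome at dawn"   # hypothetical plaintext
# One-time pad: a random key as long as the message, used exactly once,
# is information-theoretically secure. Distributing that key safely is
# the hard problem real protocols (e.g. Signal's Double Ratchet) solve.
key = secrets.token_bytes(len(message))

ciphertext = xor_bytes(message, key)   # unreadable without the key
recovered = xor_bytes(ciphertext, key) # XOR again with the key to decrypt
assert recovered == message
```

The policy debate in the paragraph above is precisely about that key: end-to-end encryption means only the two endpoints hold it, so neither the platform nor the government can read the ciphertext in transit.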

  3. Conglomerates. LuNoHo Co is the conglomerate that the revolution utilizes to build a massive catapult and embezzle funds. While Mike’s microtransaction financial fraud is interesting (“But bear in mind that an auditor must assume that machines are honest.”), the design of LuNoHo Co. – described as part bank, part engineering firm, and part oil and gas exploitation firm – interestingly reflects the conventional business wisdom of the times. In the 1960s, coming out of World War II, conglomerates began to take hold across many nations. The 1960s were a period of low interest rates, which allowed firms to perform leveraged buyouts of other companies (using low-interest loans), sometimes in completely unrelated industries. Activision was once part of Vivendi, a former waste management, energy, construction, water, and property conglomerate. The rationale for these moves was often that a much bigger organization could centralize general costs like accounting, finance, legal, and other functions that touched every aspect of the business. However, when interest rates rose in the late 70s and early 80s, many conglomerates’ profits fell, and the synergies promised at the outset of these deals turned out to be more difficult to realize than initially assumed. Conglomerates remain incredibly popular in Asia, often supported by the government. In 2013, McKinsey estimated: “Over the past decade, conglomerates in South Korea accounted for about 80 percent of the largest 50 companies by revenues. In India, the figure is a whopping 90 percent. Meanwhile, China’s conglomerates (excluding state-owned enterprises) represented about 40 percent of its largest 50 companies in 2010, up from less than 20 percent a decade before.” Softbank, the famous Japanese conglomerate and creator of the Vision Fund, was originally a shrink-wrap software distributor but is now part venture capital firm and part telecommunications provider.
We’ve discussed the current state of Chinese internet conglomerates Alibaba and Tencent, which each own several different business lines. Over the coming years, as internet access in Asia grows more pervasive and the potential for economic downturn increases, it will be interesting to see if these conglomerates break apart and focus on their core businesses.

Dig Deeper

  • The rise and fall of Toshiba

  • Using Artificial Intelligence to Create Talking Images

  • MIT Lecture on Image Classification via Deep Learning

  • 2019 Trends in the Video Conferencing Industry

  • The Moon is a Harsh Mistress may be a movie

tags: Facebook, IBM, Zoom, Artificial Intelligence, AI, AGI, Watson, OpenAI, Y Combinator, Microsoft, Moore's Law, Deep Fakes, Deep Learning, Elon Musk, Skype, WebEx, Cisco, Apple, Activision, Conglomerate, Softbank, Alibaba, Tencent, Vision Fund, China, Asia, batch2
categories: Fiction
 

February 2019 - Cloud: Seven Clear Business Models by Timothy Chou

While this book is relatively old by internet standards, it illuminates the early transition to SaaS (Software as a Service) from traditional software license and maintenance models. Timothy Chou, current Head of IoT at the Alchemist Accelerator, former Head of On Demand Applications at Oracle, and a lecturer at Stanford, details seven different business models for selling software and the pros and cons of each.

Tech Themes

  1. The rise of SaaS. Software-as-a-Service (SaaS) is an application that can be accessed through a web browser and is managed and hosted by a third party (likely a public cloud - Google, Microsoft, or AWS). Let’s flash back to the ’90s, a time when software was sold in shrink-wrapped boxes as perpetual licenses. What this meant was you owned whatever version of the software you purchased, in perpetuity. Most of the time you would pay a maintenance cost (normally 20% of the overall license value per year) to receive basic upkeep services and get minor bugs fixed. However, when the new version 2.0 came out, you would have to pay another big license fee, re-install the software, and go through the hassle of upgrading all existing systems. On the back of increased internet adoption, SaaS allowed companies to deliver a standard product, over the internet, typically at a lower price point. This meant smaller companies like Salesforce (at the time) could compete with giants like Siebel Systems (acquired by Oracle for $5.85B in 2005), because companies could now purchase the software in an on-demand, per-user fashion without going to the store.

  2. How cloud empowers SaaS. As an extension, standardization of product means you can aptly define the necessary computing resources - thereby also standardizing your costs. At the same time that SaaS was gaining momentum, the three mega public cloud players emerged, starting with Amazon (in 2006), then Google, and eventually Microsoft. This allowed companies to host software in the cloud and not on their own servers (infrastructure that was hard to manage internally). So instead of racking (pun intended) up costs with an internal infrastructure team managing complex hardware, you could offload your workloads to the cloud. Infrastructure as a Service (IaaS) was born. Because SaaS is delivered over the internet at lower prices, the cloud became an integral part of scaling SaaS businesses. As the number of users on your SaaS platform grew, you simply purchased more computing capacity in the cloud to handle those additional users. Instead of spending large amounts of money on complex infrastructure costs and decisions, a company could now focus entirely on its product and go-to-market strategy, enabling it to reach scale much more quickly.

  3. The titans of enterprise software. Software has absolutely changed in the last 20 years and will likely continue to evolve as more specialized products and services become available. That being said, the perennial software acquirers will continue to be perennial software acquirers. At the beginning of his book, Chou highlights fifteen companies that had gone public since 1999: Concur (IPO: 1999, acquired by SAP for $8.3B in 2014), Webex (IPO: 2002, acquired by Cisco for $3.2B in 2007), Kintera (IPO: 2003, acquired by Blackbaud for $46M in 2008), Salesforce.com (IPO: 2004), RightNow Technologies (IPO: 2004, acquired by Oracle for $1.5B in 2011), WebSideStory (IPO: 2004, acquired by Omniture in 2008 for $394M), Kenexa (IPO: 2005, acquired by IBM for $1.3B in 2012), Taleo (IPO: 2005, acquired by Oracle for $1.9B in 2012), DealerTrack (IPO: 2005, acquired by Cox Automotive in 2015 for $4.0B), Vocus (IPO: 2005, acquired by GTCR in 2014 for $446M), Omniture (IPO: 2006, acquired by Adobe for $1.8B in 2009), Constant Contact (IPO: 2007, acquired by Endurance International for $1B in 2015), SuccessFactors (IPO: 2007, acquired by SAP for $3.4B in 2011), NetSuite (IPO: 2007, acquired by Oracle for $9.3B in 2016), and OpenTable (IPO: 2009, acquired by Priceline for $2.6B in 2014). Oracle, IBM, Cisco, and SAP have been some of the most active serial acquirers in tech history, and this trend is only continuing. Interestingly enough, Salesforce.com is now in a similar position. What this shows is that if you can come to dominate a horizontal application - CRM (Salesforce), ERP (SAP/Oracle), or infrastructure (Google/Amazon/Microsoft) - you can build a massive moat that allows you to become the serial acquirer in that space. You then have first and highest dibs on every target in your industry because you can underwrite an acquisition to the highest strategic multiple.
Look for these acquirers to continue to make big deals that further lock in their market-dominant positions, especially when their core businesses slow.

Business Themes

Here we see the “Cash Gap” in the subscription model - customer acquisition expenses are incurred upfront but are recouped over time.


  1. The misaligned incentives of the traditional license/maintenance model. Software was traditionally sold as perpetual licenses, where a user could access that version of the software forever. Because users were paying to use something forever, the typical price point was very high for any given enterprise software license. This meant that large software upgrade decisions were made at the most senior levels of management and were large investments from a dollars and time perspective. On top of that initial license came the 20% support costs, paid annually to receive patch updates. At the software vendor, this structure created interesting incentives. First, product updates were usually focused on break-fix work rather than new, “game-changing” upgrades, because supporting multiple, separate versions of the software (especially pre-IaaS) was incredibly costly. This slowed the pace of innovation at those large software providers (turning them into serial acquirers). Second, the sales team became focused on selling customers on new releases directly after they signed the initial deal. This happened because once you made that initial purchase, you owned that version forever; what better way to get more money out of you than to introduce a new feature and re-sell you the whole system? Salespeople were also incredibly focused on closing deals in a given quarter, because any single deal could make or break not only their quarterly sales quota but also the Company’s revenue targets. If one big deal slipped from Q4 to Q1 of the following year, a Company might have to report lower growth numbers to the stock market, driving the stock price down. Third, once you made the initial purchase, the software vendor would direct all problems and product inquiries to customer support, which was typically overburdened by requests. 
Additionally, the maintenance/support costs were built into the initial contract, so you might end up contractually obligated to pay for support for a product that you don’t like and cannot change. The Company’s view was: “You’ve already purchased the software, so why should I waste time ensuring you have a great experience with it - unless you are looking to buy the next version, I’m going to spend my time selling to new leads.” These incentives limited product changes and upgrades, focused salespeople entirely on new leads, and hurt customer experience, all to the benefit of the Company over the user.
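The incentive problem is easy to see with back-of-the-envelope numbers (hypothetical prices, not from the book): in the license model most of the vendor's revenue lands on day one, so the vendor's attention lands there too, while the subscription model spreads it across the relationship. A quick Python sketch:

```python
# Hypothetical prices: a perpetual license with 20% annual maintenance
# versus an annual SaaS subscription, compared over several years.
license_fee = 100_000.0      # one-time perpetual license ($)
maintenance_rate = 0.20      # annual maintenance, as % of license value
saas_annual_fee = 30_000.0   # annual subscription ($)

def license_cost(years: int) -> float:
    """Cumulative cost: upfront license plus annual maintenance."""
    return license_fee + maintenance_rate * license_fee * years

def saas_cost(years: int) -> float:
    """Cumulative cost: pay-as-you-go subscription."""
    return saas_annual_fee * years

for years in (1, 3, 5):
    print(f"year {years}: license ${license_cost(years):,.0f} "
          f"vs SaaS ${saas_cost(years):,.0f}")
```

With these illustrative figures the license vendor books $120k in year one versus the SaaS vendor's $30k, which is exactly why the license salesperson moves on to new leads while the SaaS vendor has to keep the customer happy to collect the rest.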

  2. What are CAC and LTV? CAC, or customer acquisition cost, is key to understand for any type of software business. As HubSpot investor and distinguished SaaS writer David Skok notes, to measure it you take “the entire cost of sales and marketing over a given period, including salaries and other headcount related expenses, and divide it by the number of customers that you acquired in that period.” Once the software sales model shifted from license/maintenance to SaaS, companies started to receive monthly recurring payments instead of hard-to-predict, big new license sales. Enterprise software contracts are typically year-long, which means that once a customer signs, the Company will know exactly how much revenue it should plan to receive over the coming year. Furthermore, with recurring subscriptions, as long as the customer was happy, the Company could be reasonably assured that the customer would renew. This idea led to the concept of the Lifetime Value of a customer, or LTV. LTV is the total amount of revenue a customer will pay the Company before it churns (cancels the subscription). The logic follows that if you can acquire a customer (CAC) for less than the lifetime value of that customer (LTV), over time you will make money on each individual customer. Typically, investors view a 3:1 LTV-to-CAC ratio as the benchmark for a healthy SaaS company.
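The definitions above reduce to simple arithmetic. The sketch below uses entirely hypothetical figures (not from the book) and a common simplification, expected customer lifetime = 1 / monthly churn rate:

```python
# Hypothetical inputs for one period.
sales_marketing_spend = 500_000.0   # total S&M cost in the period ($)
new_customers = 100                 # customers acquired in that period

# CAC: total sales & marketing spend divided by customers acquired.
cac = sales_marketing_spend / new_customers

monthly_revenue_per_customer = 500.0   # subscription price ($/month)
monthly_churn_rate = 0.02              # 2% of customers cancel each month

# With constant churn, expected lifetime is 1 / churn (in months),
# so a simple LTV is monthly revenue times that expected lifetime.
ltv = monthly_revenue_per_customer / monthly_churn_rate

ratio = ltv / cac
print(f"CAC=${cac:,.0f}  LTV=${ltv:,.0f}  LTV:CAC={ratio:.1f}")
```

Here the company spends $5,000 to acquire a customer worth $25,000 over its lifetime, a 5:1 ratio, comfortably above the 3:1 benchmark; a fuller model would also multiply LTV by gross margin rather than counting raw revenue.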

Dig Deeper

  • Bill Gates 1995 memo on the state of early internet competition: The Internet Tidal Wave

  • Andy Jassy on how Amazon Web Services got started

  • Why CAC Can Be a Startup Killer

  • How CAC is different for different types of software

  • Basic SaaS Economics by David Skok

tags: Cloud Computing, SaaS, License, Maintenance, Business Models, software, Salesforce, SAP, Oracle, Cisco, IaaS, batch2
categories: Non-Fiction
 
