Tech Book of the Month

October 2021 - Unapologetically Ambitious by Shellye Archambeau

This month we hear the story of famous technology CEO Shellye Archambeau, former leader of GRC software provider MetricStream. Archambeau packs her memoir full of amazing stories and helpful career advice; the book is a must-read for any ambitious leader looking to break into Silicon Valley’s top ranks.

Tech Themes

  1. The Art of the Pivot. When Archambeau joined Zaplet in 2003 as its new CEO, she had a frank conversation with the chairman of the board, Vinod Khosla. She asked him one question: “You have a great reputation for supporting your companies, but you also have a reputation of being strong-willed and sometimes dominating. I just need to know before I answer [whether I will take the job], are you hiring me to implement your strategy, or are you hiring me to be the CEO?” Vinod responded: “I would be hiring you to be the CEO, to run the company, fully responsible and accountable.” With that answer, Archambeau accepted the job and achieved her life-long goal of becoming a CEO before age forty. She had just inherited a struggling former Silicon Valley darling that had raised over $100M but had failed to translate that money into meaningful sales. Zaplet’s highly configurable technology was a vital asset, but the company had not locked on to a real problem. Struggling to set a direction for the company, Archambeau spoke with board member Roger McNamee, who suggested pivoting into compliance software. In early 2004, Zaplet merged with compliance software provider MetricStream (taking its name), with Archambeau at the helm of the combined company. She wasn’t out of the woods yet. The 2008-09 financial crisis pushed MetricStream to the brink. With less than $2M in the bank, Archambeau ditched her salary, executed a layoff, and rallied her executive team through the crisis. As banks recapitalized, they sought new compliance and risk management platforms to avoid future issues, and MetricStream was well-positioned to serve this new set of highly engaged customers. Archambeau’s first and only CEO role lasted 14 years, as she led MetricStream to $100M in revenue and 2,000+ employees.

  2. Taking Calculated Risks. Although Archambeau architected a successful turnaround, her career was not without challenges. After years of working her way up at IBM, Archambeau strategically sought out a challenging international assignment, an essential staple of IBM’s CEOs. While working in Tokyo as VP and GM for Public Sector in Asia Pacific, Archambeau was not selected for a meeting with Lou Gerstner, IBM’s CEO. She put it bluntly: “I was ranked highly in terms of my performance - close to the top of the yearly ranking, not just in Japan, but globally. Yet I was pretty sure I wasn’t earning the salary many of my colleagues were getting.” It was then that Archambeau realized she might need to leave IBM to achieve her goal of becoming CEO. She left IBM and became President of Blockbuster.com just as Blockbuster was beginning to compete with Netflix. Blockbuster was staunch in its dismissal of Netflix, refusing to buy the company when it had the chance for a measly $50M. Archambeau was unhappy with management’s flippant attitude toward a legitimate threat and left Blockbuster’s Dallas HQ after only 9 months. After this difficult experience, Archambeau sought out work in Silicon Valley, moving to the nation’s tech hub without her family. She became Head of Sales and Marketing for Northpoint Communications. The company was fighting a losing battle in DSL, and after a merger with Verizon fell through, it went bankrupt. Archambeau then became CMO of Loudcloud, Ben Horowitz’s early cloud company covered in our March 2020 book, The Hard Thing About Hard Things. But things were already blowing up at Loudcloud, and after a year, Archambeau was looking for another role following the sale of Loudcloud’s services business to EDS. At 40 years old, Archambeau had completed international assignments, managed companies across technology, internet, and telecom, and seen several mergers and bankruptcies. That experience laid the bedrock for her attitude: “After the dot-com bubble burst, I would need to double down and take greater risks, but - and this probably won’t surprise you - I had planned for this…It’s 2002, I’m almost forty, I’ve learned a great deal from Northpoint and Loudcloud, and I’m feeling ready for my chance to be a CEO.” Archambeau was always ready for the next challenge, unafraid of the risks posed, prepared to make her mark on the tech industry.

  3. Find the Current. Trends drive the tech industry, and finding and riding those trends can be hugely important to building a career. Archambeau saw the growing role of technology as an intern at IBM in the 1980s and knew the industry would thrive over time. As the internet and telecom took hold, she jumped into new and emerging businesses, unafraid of roadblocks. As she puts it: “Ultimately, when it comes to reaching your goals, the real skill lies in spotting the strongest current - in an organization, in an industry, even in the larger economy - and then positioning yourself so it propels you forward. Sail past the opportunities that lead you into the weeds and take the opportunities that will move you toward your goals.”

Business Themes

  1. The Power of Networking. One of Archambeau’s not-so-secret strategies for career success was networking. She is a people person and radiates energy in every conversation. Beyond this natural disposition, Archambeau took a very concerted and intentional approach to building her network, and it shows. She crosses paths with Silicon Valley legends like Bill Campbell and Ben Horowitz throughout the book. Beyond one-to-one mentorship relationships, Archambeau joined several organizations to grow her network, including Watermark, the Committee of 200, the ITSM Forum, the Silicon Valley Leadership Group, and more. These groups offered a robust foundation and became a strong community, empowering and inspiring her to lead!

  2. Support and Tradeoffs. As a college sophomore, Archambeau knew she wanted to be the breadwinner of the family. When she met her soon-to-be husband Scotty, a 38-year-old former NFL athlete, she was direct with him: “I would really like to be able to have someone stay home with the kids, especially when they are in school. But the thing is…I just don’t want it to be me.” Scotty considered it patiently: “You know, Archambeau, I’ve had a lot of experiences in my life. I’ve had three different careers and you know I like working. But, I think I could see myself doing that, for you.” That was the icing on the cake. The two married and had two children while Archambeau worked up the ranks to become CEO. Scotty took care of the kids, Kethlyn and Kheaton, when Archambeau moved to Silicon Valley for work. She understood the tough tradeoff she was making and acknowledged that her relationship with her daughter felt more strained during Kethlyn’s teenage years. It raises the question: how comfortable are you with the tradeoffs you are making today? Moving to a new city to pursue a career that may strain family dynamics is never an easy decision. Family was always important to Archambeau, but it became front and center when Scotty was diagnosed with blood cancer in 2010. Although she was still CEO of MetricStream, things changed: “I had accumulated vacation days, I was putting off trips and experiences for ‘when the time was right’…We’re going to do things that we would have waited to do. We’re going to do them now.” Family and friends became a priority - they always were!

  3. Earning Respect. As a Black woman in technology, Archambeau had to overcome the odds repeatedly. She recounted: “As a young African American woman, I was accustomed to earning respect. Whenever I got a promotion or a new job, I walked into it understanding that people likely would assume I was not quite qualified or not exactly ready. I presumed I needed to establish relationships and credibility, to develop a reputation, to prove myself.” While it is incredibly sad that Archambeau had to deal with this questioning, she learned how to use it to her advantage. As her family moved around the country, Archambeau faced repeated challenges: being denied entry to advanced classes in school, getting bullied and beaten walking home from school, and starting high school with leg braces in a new city. Through these difficulties, she developed a simple methodology for getting through tough times: “Accept the circumstances, fake it ‘til you make it, control what you can, and trust that things will get better.” Archambeau took that mentality with her and earned the respect of all of IBM Japan when she presented her introduction slides entirely in Japanese to build trust with her new co-workers; it was the first time a foreign executive had done so. Archambeau’s ability to boldly take action in the face of so many obstacles is impressive.

Dig Deeper

  • Knowing Your Power | Shellye Archambeau | TEDxSonomaCounty

  • Spelman College Courageous Conversations - Shellye Archambeau

  • Shellye Archambeau: Becoming a CEO (A) - A Harvard Business School Case

  • MetricStream Raises $50M to Take on the GRC Market

tags: Metricstream, Zaplet, Shellye Archambeau, Vinod Khosla, Ben Horowitz, Loudcloud, Bill Campbell, GRC, Japan, Lou Gerstner, IBM, Blockbuster, Netflix, Silicon Valley, Silver Lake, Roger McNamee, Northpoint Communications, Verizon
categories: Non-Fiction
 

September 2021 - Super Mario: How Nintendo Conquered America by Jeff Ryan

This month we dive into the history of Nintendo and Super Mario, the lovable, super-smashing, tennis-playing, go-karting partier. Jeff Ryan’s book explores the history of Nintendo and the evolution of the video game industry into the console competition we have today.

Tech Themes

  1. Constraint Breeds Creativity. Sometimes nothing drums up creativity like having your back against the wall. This was the case with Nintendo. In 1980, Nintendo’s CEO Hiroshi Yamauchi sent his son-in-law, Minoru Arakawa, to Manhattan to launch Nintendo of America. The idea was to launch Nintendo into the large and growing US market for arcade cabinet games. Nintendo had developed a Space Invaders knock-off called Radar Scope to take the market by storm. However, it sold incredibly poorly, and months after moving to the US, Arakawa found himself with 2,000 large, unsold arcade cabinets and a disappointed father-in-law. Not wanting the pre-made cabinets to go to waste, Yamauchi scoured the company for interesting game ideas and found one from a young designer named Shigeru Miyamoto. Miyamoto drew inspiration from Popeye and King Kong to come up with Donkey Kong, a revolutionary “platform”-style game in which a character named Jumpman tries to save a damsel in distress, Pauline, from a giant evil gorilla. After coming up with this crazy concept game, Nintendo still had to re-work the original Radar Scope circuit boards. The boards were shipped from Nintendo’s Japanese headquarters to Manhattan, where Arakawa and his wife carefully removed the Radar Scope game and installed the new Donkey Kong game. Nintendo’s sales network convinced two bars in Seattle to pilot the game, and it took off like crazy; people played 120 times per day, yielding $30 of profit to Nintendo every day. Jumpman would later become Mario, Donkey Kong would go on to become a staple character in Nintendo’s video gaming world, and all because of an epic failure and a distressed company.

  2. Cabinet, Console, and Competition. Staying relevant through each technology evolution is hard. Nintendo successfully moved from video game cabinets to the NES, the Super Nintendo, the Game Boy, the N64, the GameCube, the Wii, and now the Switch. At each stop, Nintendo tried hard to leverage all of the resources available in the hardware of the day. By purposefully maxing out its new hardware capabilities, Nintendo was able to build innovation into its games. As an example, Nintendo leveraged a special aspect of code in the NES to build Mario’s initial music theme; though Mario is a silent character, the music created a new atmosphere for gamers. Later on, Nintendo would launch the N64 Rumble Pak, which provided haptic feedback through the controller based on gameplay and became a staple concept for all consoles on the market. However, it wasn’t always fun and games. Nintendo’s missteps are single-handedly responsible for the creation of Sony’s PlayStation. In 1988, a Sony engineer began secretly developing a chip to help make CD-ROM games compatible with Nintendo’s console. Nintendo was interested in broadening its capabilities and signed a contract with Sony to produce an add-on device for the Super Nintendo Entertainment System (SNES). Although the two companies had signed a deal, it was clear that Nintendo would have to give up substantial control of the creative rights and hardware to Sony with the add-on. Yamauchi could not give Sony that much control, and in a historic change of direction at the 1991 CES, he went behind Sony’s back to partner with Sony’s rival, Philips. However, Philips was not a strong development partner, and the SNES CD-ROM add-on was plagued with delays. Sony continued developing a gaming system on its own, and Nintendo shifted priorities to its next console, the N64. Sony’s CD-ROM gaming system had a significant advantage over the N64’s cartridge-based system in that it offered developers easier, more consistent, open standards. Sony went to Square, one of Nintendo’s top game makers, and lured them over to produce its famous Final Fantasy series for the upcoming launch of the PlayStation in 1994. The PlayStation seized significant market share from Nintendo and established Sony in the gaming space. Nintendo’s decision to opt for control and proprietary formats in the N64 and GameCube helped avoid counterfeit games but left the market open to Sony’s PlayStation and the consumers who wanted an all-in-one device (games, CDs, DVDs).

  3. Play the Long Game. Miyamoto had the idea for a three-dimensional Mario that would take advantage of all of the improvements in graphics rendering by the early 90s. While the idea gestated, Miyamoto tried to think through how game mechanics could work in 3D. After serious thought and some development time in the early 1990s, Miyamoto shelved the idea because he felt they would need a bigger controller with more buttons to fully realize the vision of a 3D Super Mario. After Nintendo and Miyamoto began development on Super Mario 64 in September 1994, they ran into delays caused by contrasting opinions on camera views and game layout. On top of this, Miyamoto had grander designs than Nintendo had time for, and several courses had to be scrapped to get to a working version. The game missed the 1995 holiday season, delaying the launch of the Nintendo 64 until 1996. However, because Nintendo had created such strong, single-player, free-roaming game mechanics, some of the unused levels could be put into The Legend of Zelda: Ocarina of Time, which debuted in 1998. Sometimes it takes time for the world and technology to catch up to your ambitions.

Business Themes

  1. An Intense Family Business. Nintendo was started in 1889 by Fusajiro Yamauchi to produce flower cards (hanafuda), a type of Japanese playing card. Despite significant trouble during the Russo-Japanese War (1904-05) and World War II, the company survived long enough for third-generation Hiroshi Yamauchi to take the reins in 1950. Over the next 20 years, Nintendo would ride the wave of post-war popularity to a 1963 IPO on the Osaka and Kyoto stock exchanges. However, in the late 1960s, appetite for cards decreased, and Yamauchi went looking for a new market to support the company’s growth. In 1969, Gunpei Yokoi joined the company and set it off on a new trajectory developing simple electric toys. In the 1970s and 1980s, the company repositioned itself as a handheld, console, and cabinet video game producer. Since then, Nintendo has gone on to produce millions of games and systems. There is something amazing to be said about a business that keeps finding the next wave of growth in its S-curve and somehow stays alive through multiple wars, products, and competitors.

  2. Counter-Positioning. Nintendo is famous for its numerous licensing deals, promoting its characters on everything to build brand awareness and associations amongst consumers (Super Mario Mac & Cheese, anyone?). Nintendo leveraged its history of selling toys to children to create a strong brand of reputable characters rivaled only by the likes of Disney today. Because Nintendo focused on a family-friendly, younger customer base (no blood in games on the original Nintendo), it left some customers in the market unfulfilled. Enter SEGA and Sonic the Hedgehog. SEGA started out in the 1940s as a simple amusement game provider for military bases. The company launched its first video game in 1973, its first console in 1982, and created Sonic in 1991. Sonic was everything Mario was not - he was purpose-built to be a character for teenagers. As School of Game Design points out: “Just as the 19th century expressionists use shape and line to evoke emotional responses, character designers today use the shape of a character’s body to communicate the personality of a character to us. Mario is circular, he has a button nose, a pot belly, and his hands, feet, and head, are all round. Sonic’s design on the other hand is all jaggy triangles, he has spiky hair, pointy cat ears, ski goggle eyes, and torpedo shoes…Right out of the gate the personalities clash. Sonic has the image of a mischievous bad boy, while Mario is playful, and aloof.” This is a classic example of counter-positioning: directly occupying a competitive position in the industry that is the exact opposite of the incumbent firm’s. Sonic was the anti-Mario and helped SEGA launch its Genesis platform.

  3. The Video Game Recession & Supply Chain Bullwhip. While Super Mario and Donkey Kong helped launch a massive interest in video and arcade games, there were periods in the 1980s when people thought video games were just a fad. In 1983, the video game industry experienced a massive recession driven in part by a common supply-chain issue called the bullwhip effect. The bullwhip effect occurs when a change in demand has an amplified effect as it travels up a supply chain, from customer to retailer to wholesaler to distributor to manufacturer. The effect causes massive forecasting errors and inventory build-up due to over-extrapolation of demand. In the late 1970s and early 1980s, video games were all the rage, driven by hits like Pong and Space Invaders. This attracted a flood of competition from Coleco, Mattel, and Philips. Everyone forecast that market saturation was years away and that consumers would be itching for video game and cabinet systems for the next few years. As a result, many video game companies over-ordered from their cartridge and console manufacturers. Once the video game companies had too much inventory on hand, they started discounting to try to sell more, but the market could only absorb so much. Unable to sell its excess systems, Atari famously buried some of its inventory at a landfill site in New Mexico. The bullwhip effect can cause compounding losses for companies, because they buy inventory at full (or sometimes above-full) price, sell at steep discounts due to market saturation, and often have to pay to house or destroy the extra inventory. It is a crippling issue that companies like Peloton are facing today.
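The amplification dynamic described above can be sketched in a few lines of Python. This is a toy model with assumed parameters (demand levels, inventory targets, forecast smoothing), not data from the book: each tier smooths the orders it receives into a demand forecast, then orders enough from the tier above to cover that forecast plus refill a target inventory level. A single step-up in consumer demand then produces progressively larger order spikes as it travels upstream.

```python
# Toy bullwhip-effect simulation (illustrative only; all parameters are assumptions).
# Four tiers: retailer -> wholesaler -> distributor -> manufacturer.
# Each tier forecasts demand from incoming orders and places an order of
# forecast + (target inventory - current inventory) with the tier above it.

def simulate(periods=40, tiers=4, target=20.0, alpha=0.5):
    forecast = [4.0] * tiers                 # smoothed demand estimate per tier
    inventory = [target + 4.0] * tiers       # start each tier in steady state
    orders = [[] for _ in range(tiers)]      # orders each tier places upstream

    for t in range(periods):
        demand = 4.0 if t < 5 else 8.0       # consumer demand steps up once
        for i in range(tiers):
            incoming = demand if i == 0 else orders[i - 1][-1]
            inventory[i] -= incoming                       # ship downstream
            forecast[i] += alpha * (incoming - forecast[i])
            order = max(0.0, forecast[i] + (target - inventory[i]))
            orders[i].append(order)
            inventory[i] += order                          # instant resupply

    return orders

if __name__ == "__main__":
    peaks = [max(tier) for tier in simulate()]
    print("peak order per tier:", peaks)     # spikes grow up the chain
```

Even though consumer demand only doubles (and then stays flat), each tier's peak order exceeds the one below it; the manufacturer sees by far the largest spike, then a crash back down once downstream inventory catches up - the same over-ordering dynamic that left Atari burying cartridges in New Mexico.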

Dig Deeper

  • There will Never Ever be another Melee player like Hungrybox - Documentary exploring Professional Super Smash Brothers Athletes

  • Super Mario Bros 30th Anniversary Special Interview with Shigeru Miyamoto and Takashi Tezuka

  • CRASH: The Year Video Games Died

  • The History of the Gameboy

  • The 10 Biggest Mistakes in Nintendo History

tags: Nintendo, Super Mario, Mario, Luigi, Hiroshi Yamauchi, Shigeru Miyamoto, Donkey Kong, Video Games, Jumpman, Wii, Switch, Gamecube, N64, NES, SNES, Zelda, Playstation, Phillips, CDs, DVDs, Disney, SEGA, Sonic, Genesis, Bullwhip Effect, Mattel, Coleco, Pong, Space Invaders, Minoru Arakawa, Gameboy
categories: Non-Fiction
 

August 2021 - Hit Refresh by Satya Nadella, with Greg Shaw and Jill Tracie Nichols

This month we look at how Satya Nadella reignited Microsoft’s fire and attacked new spaces with a growth mindset. The book is loaded with excellent management philosophy and complex Microsoft history.

Tech Themes

  1. Bing: The Other Search Engine. After starting at Microsoft as an engineer and rising through the ranks to lead Microsoft Dynamics (its CRM product), Nadella was handpicked to lead the re-launch of a brand new search engine, Microsoft Bing. Bing was one of Microsoft’s first “born-in-the-cloud” businesses, and Nadella quickly recognized four core areas of focus: distributed systems, consumer product design, understanding the economics of two-sided marketplaces, and AI. Microsoft had a troubled history with search engines and wanted to go big quickly, submitting an offer to buy Yahoo for $45B in February of 2008. Microsoft was rebuffed, and thus Nadella found himself launching Search Checkpoint #1 in September of 2008 ahead of a June 2009 Bing launch. What are the odds that Microsoft’s future CEO would have early cloud, distributed systems, and advanced AI leadership experience? It was an almost prescient combination!

  2. Red Dog to Azure. Microsoft started working on the cloud two years after Amazon launched AWS. In 2008, veteran software architects Ray Ozzie and Dave Cutler created a secret team inside Microsoft known as Red Dog, which was focused on building a cloud infrastructure product. Red Dog was stationed under Microsoft’s Server and Tools business unit (STB), alongside products such as Windows Server and Microsoft’s powerful RDBMS, SQL Server. In 2010, Microsoft CEO Steve Ballmer asked Nadella to lead the STB business unit and set the vision for its then single-digit-millions cloud infrastructure business. It was a precarious situation: “The server and tools business was at the peak of its commercial success and yet it was missing the future. The organization was deeply divided over the importance of the cloud business. There was constant tension between diverging forces.” How did Nadella resolve this tension? It was simple - he made choices and rallied his team around those decisions. He focused the team on hybrid cloud, data, and ML capabilities, where Microsoft could take advantage of its on-premise, large-enterprise heritage while providing an on-ramp for customers eager to make the shift to the cloud. Microsoft has since surged to an estimated 20% worldwide market share, making Azure one of the biggest and fastest-growing cloud products in the world!

  3. Re-Mixed Reality. Microsoft’s gaming portfolio is impressive: Xbox, Mojang (aka Minecraft), and ZeniMax Media (maker of Fallout, Wolfenstein, and DOOM). Microsoft also makes the HoloLens, a mixed reality headset that competes with Facebook’s Oculus. Many believe future computing generations will take place in virtual, augmented, or mixed reality. Nadella doesn’t mince words - he believes that the future will not be in virtual reality (as Facebook is betting) but rather in mixed reality, a combination of augmented reality (AR) and virtual reality in which the user gets an augmented experience while still maintaining some semblance of the outside world. Nadella lays out the benefits: “HoloLens provides access to mixed reality in which the users can navigate both their current location - interact with people in the same room - and a remote environment while also manipulating holograms and other digital objects.” Virtual reality blocks out the outside world, but that can be an overwhelming experience and impractical, particularly for enterprise users of AR/VR/MR technologies. One of the big users of the HoloLens is the US Army, which recently signed a rumored $22B deal with Microsoft. It is still early days, but the future needs a new medium of computing, and it might just be mixed reality!

Business Themes

  1. Leading with Empathy. Satya Nadella’s life changed with the birth of his son. “The arrival of our son, Zain, in August 1996 had been a watershed moment in Anu’s and my life together. His suffering from asphyxia in utero had changed our lives in ways we had not anticipated. We came to understand life as something that cannot always be solved in the manner we want. Instead, we had to learn to cope. When Zain came home from the intensive care unit, Anu internalized this understanding immediately. There were multiple therapies to be administered to him every day, not to mention quite a few surgeries he needed that called for strenuous follow-up care after nerve-racking ICU stays…My son’s condition requires that I draw daily upon the very same passion for ideas and empathy that I learned from my parents.” Nadella reiterates the importance of empathy throughout the book, and rightly so: empathy is viewed as the most important leadership skill, according to recent research. How does one increase empathy? It’s actually quite simple - talk to people! Satya understands this: “It is impossible to be an empathetic leader sitting in an office behind a computer screen all day. An empathetic leader needs to be out in the world, meeting people where they live, and seeing how the technology we create affects their daily activities.” Leadership requires empathy - hopefully, we see more of it from big technology soon!

  2. Frenemies. One of the first things that Satya Nadella did after taking over the CEO role from Steve Ballmer in 2014 was reach out to Tim Cook. Apple and Microsoft had always had a love-hate relationship. In 1997, Microsoft saved Apple shortly after Steve Jobs returned by investing $150M in the company so that Apple could stave off potential bankruptcy. However, in 2014, Nadella called on Apple: “I decided we needed to get Office everywhere, including iOS and Android…I wanted unambiguously to declare, both internally and externally, that the strategy would be to center our innovation agenda around users’ needs and not simply their device.” Microsoft had tried to become a phone company with Windows Mobile in 2000, tried again with Windows Phone in 2010, and tried even harder at Windows Phone in 2013 with a $7.2B acquisition of Nokia’s mobile phone unit. Although Nadella voted ‘No’ on the deal before becoming CEO, he was forced to manage the company through a total write-off of the acquisition and the elimination of eighteen thousand jobs. So how could Nadella catch up to the mobile wave? “For me, partnerships - particularly with competitors - have to be about strengthening a company’s core businesses, which ultimately centers on creating additional value for the customer…We have to face reality. When we have a great product like Bing, Office, or Cortana but someone else has created a strong market position with their service or device, we can’t just sit on the sidelines. We have to find smart ways to partner so that our products can become available on each other’s popular platforms.” Nobody knows platforms like Microsoft; Bill Gates wrote the definition of a platform: “A platform is when the economic value of everybody that uses it, exceeds the value of the company that creates it.” Nadella got over his predecessor’s worry and hatred of the competition, bringing Microsoft’s software to other platforms and strengthening both companies’ positions.

  3. Regulation and Technology. Nadella devotes an entire chapter to the idea of trust in the digital age. Using three case studies - North Korea’s attack on Sony’s servers, Edward Snowden’s leaked documents (which were held on Microsoft’s servers), and the FBI’s lawsuit against Apple to unlock an iPhone that might contain criminal information - Nadella calls for increased(!) regulation, particularly around digital technology. Satya uses a simple equation for trust: “Empathy + Shared values + Safety and Reliability = Trust over time.” Don’t you love it when a company the government sued over antitrust practices calls on the government to develop better laws! You’d love it even more if you saw how they used the same tactics to launch Microsoft Teams! Regulation in technology has been a hot topic recently, and Nadella is right to call on the government to create new laws for our digital world: “We do not believe that courts should seek to resolve issues of twenty-first-century technology relying on law that was written in the era of the adding machine.” He goes further to suggest potential remedies, including an efficient system for government access to corporate data, stronger privacy protections, globalized digital evidence sharing, and transparency of corporate and government data. I imagine the trend will be toward more regulation, especially with the passage of recent data laws like GDPR or CCPA, but I’m not sure we will see any real sweeping changes.

Dig Deeper

  • “Culture Eats Strategy for Breakfast” - How Satya Nadella Rebooted Microsoft

  • Satya Nadella Interview at Stanford Business School (2019)

  • Microsoft is Rolling out a New Framework to its Leaders - Business Insider

  • Satya Nadella email to employees on first day as CEO

  • HoloLens Mixed Reality Demonstration

tags: Microsoft, Satya Nadella, Apple, Tim Cook, Bing, Yahoo, Xbox, Minecraft, Facebook, Army, Mixed Reality, AR, VR, HoloLens, Oculus, Steve Jobs, Bill Gates, iOS, Android, Office, Sony, North Korea, FBI, Snowden, Empathy, Regulation, Privacy
categories: Non-Fiction
 

July 2021 - Genentech: The Beginnings of Biotech by Sally Smith Hughes

This month we dive into the birth of the biotech industry and learn about Genentech, a biotech company built on the back of novel recombinant DNA research in the 1970s. The book covers most of the discovery and pre-IPO story of the company, weaving in commentary about political, social, and fundraising challenges the company faced.

Tech Themes

  1. Education & Profits. The biotech industry creates an interesting symbiotic relationship between universities and businesses. Genentech was founded by an out-of-work venture capitalist named Bob Swanson and an exuberant scientific genius named Herb Boyer. In 1973, Boyer and a colleague, Stan Cohen, had conceived of the idea of using restriction enzymes to cleave DNA fragments, allowing the scientists to insert and express almost any gene in bacteria. In 1977-78, Boyer, Riggs, and Itakura showed that the recombinant DNA process could create somatostatin and insulin. Because of the unbelievable economic potential of their findings, Stanford (where Cohen worked) and UCSF (where Boyer worked) decided to file a patent on the recombinant DNA procedure. The patent process sparked a massive debate about the commercialized use of their procedure, with several scientists, like National Academy of Science Chairman Paul Berg, calling for an investigation and formal rules. As Hughes notes, “The 1970s was notably inhospitable to professors forming consuming relationships with business, let alone taking the almost unheard-of step of founding a company without giving up a professorship.” This challenge of balance incentives: helping society, contributing all biological research back to the world for free, and personal financial and celebrity gain is hard. Many of the world’s leading researchers are motivated not only by deep investigative science but also by the notoriety of being published in the world’s leading journals. Today, several of the world’s leading AI researchers face a similar dilemma. In 2012, Geoff Hinton, a former Unversity of Toronto professor, auctioned off his AI algorithm and job between Google, Baidu, and Microsoft for a one-time £30M payout. Databricks, a big data company, recently raised money at a $38B valuation - their CEO, Ali Ghodsi, conceived of the idea for Databricks as a Ph.D. student at UC Berkeley, where he remains an adjunct professor. 
The twisted and complicated world of academia and corporations continues!

  2. IP. One of the big challenges of Genentech’s unique academic heritage was a massive intellectual property battle that would last for years. In 1976, Bob Swanson set out to negotiate an exclusive license to the Boyer-Cohen patent from Stanford and UCSF. He was rebuffed by administrators trying to avoid the politically heated topic of recombinant DNA research. Things were made even more complicated in 1978. On New Year’s Eve at midnight, soon-to-be Genentech employees Peter Seeburg and Axel Ullrich broke into their former UCSF lab to take research specimens related to contract research work they were performing for Genentech. In 1999, after years of patent disputes, Genentech finally settled the resulting patent infringement case for $200M, one of the largest biotech settlements ever. With such enormous sums of money at stake, the question of who owns an invention and how that invention is used is hotly debated and contested - pharmaceutical companies have seen larger and larger misuse settlements.

  3. Regulation & Action. An often forgotten aspect of commercial industry change is regulation, perhaps because it is complicated and slow to develop, but the effects can be enormous. In 1983, in reaction to chronic under-investment in drugs serving small patient populations (“rare diseases”), the Department of Health and Human Services and the FDA helped enact the Orphan Drug Act of 1983. “That law, the Orphan Drug Act, provided financial incentives to attract industry’s interest through a seven-year period of market exclusivity for a drug approved to treat an orphan disease, even if it were not under patent, and tax credits of up to 50 percent for research and development expenses. In addition, FDA was authorized to designate drugs and biologics for orphan status (the first step to getting orphan development incentives), provide grants for clinical testing of orphan products, and offer assistance in how to frame protocols for investigations.” A further revision to the Act in 2002 specified a rare disease as one affecting fewer than 200,000 people. Coupled with these generous incentives was the ability to price drugs in response to the exclusivity received for performing the research that led to the drug’s discovery. Such exclusivity has led to much higher prices for rare disease drugs, causing anger among patients (and insurance groups) who need to pay for these effective but high-priced drugs. Some economists have even studied the idea of “fairness” in orphan drug pricing - considering whether a rare disease drug that cures 90% of patients with the disease should be priced significantly higher than those that cure a smaller percentage of the population. These incentives have produced a massive influx of investment into the space, with 838 total orphan drug indications and 564 distinct drugs created to help patients with rare diseases.

Business Themes

  1. Partnerships. The biotech industry thrives off of partnerships. This is primarily due to the enormous cost of bringing a drug to market, with a recent paper pinning R&D costs alone at greater than $1B. Beyond the cost of FDA Phase 1, 2, and 3 trials - $4M, $13M, and $20M at the median - companies often have to deal with many failures and re-directions along the way. On top of that, companies have to manufacture, sell, and market the drug to patient populations and physicians. Genentech was one of the first companies to establish partnerships with major pharmaceutical companies. Genentech considered many different partnerships for different parts of its drug pipeline (something that is still done today). In August of 1978, Genentech partnered with Kabi, a Swedish pharmaceutical manufacturer, to produce human growth hormone using the Genentech approach. The deal included a $1M upfront payment for exclusive foreign marketing rights. Three weeks later, Genentech partnered with Eli Lilly to start making human insulin using the recombinant DNA approach - the deal was a twenty-year R&D contract with an upfront fee of $500,000 for exclusive worldwide rights to insulin; Genentech received 6% royalties and City of Hope (a research institution) received 2% of product sales. In January of 1980, Genentech signed a deal with Hoffman-La Roche to collaborate on leukocyte and fibroblast interferon - a chemical that was believed to be a potential cancer panacea. All of these deal structures were novel back then but are commonplace today - marketing, R&D, and royalty partnerships are now the norm in the biotech and pharmaceutical industries.

  2. The Perils and Beauty of R&D. Pharmaceutical and biotech companies face a very difficult challenge in bringing a drug to market. Beyond the costs detailed above, the success rate is so low that companies often need to have multiple scientific projects going on at once. The book details this challenge: “By the second quarter of 1979, the company had four new projects underway, all but one sponsored by a major corporation: Hoffman-La Roche on interferon; Monsanto on animal growth hormone; Institut Merieux on hepatitis B vaccine; and a Genentech fund project on the hormone thymosin.” This was all in addition to its Kabi and Eli Lilly deals! This brings up the idea of S curves, whereby adoption of a product reaches a peak and new products must pick up to continue the growth of the organization. This is common in all businesses and markets but especially difficult to predict in biotech and pharma, where drug development takes years, patents come and go, and new drug success is probabilistically low. This is the double-sided challenge of big pharma, where companies debate internal R&D spending versus external M&A to drive new growth vectors on a company’s S-Curve. It’s something that Genentech is still trying to figure out today.

  3. A Silicon Valley Story. While the center of the biotech industry today is arguably Cambridge, MA, Genentech was an original Silicon Valley high-risk/high-reward bet. Genentech was funded by the historically great Kleiner Perkins - a Silicon Valley VC co-founded by Eugene Kleiner, a veteran of Fairchild Semiconductor and one of the “traitorous eight.” Kleiner was joined by Tom Perkins, who worked at Hewlett-Packard in the 1960s and brought HP into the minicomputer business. As some of the earliest venture capitalists, with great knowledge of the Silicon Valley semiconductor and technological innovation boom, the firm hit big winners with Compaq, EA, Amazon, Sun Microsystems, and many others. A lot of these investments were speculative at the time, and the team understood that more risk at the earlier stages meant more reward down the line. As Perkins put it: “Kleiner & Perkins realizes that an investment in Genentech is highly speculative, but we are in the business of making highly speculative investments.” After weeks of meeting with Swanson and a key meeting with Herb Boyer, Perkins took the plunge, leading a $100,000 seed investment in Genentech in May of 1976. Perkins commented: “I concluded that the experiment might not work, but at least they knew how to do the experiment.” Despite the work of raising billions of dollars for Genentech’s continually growing product and partnership pipeline, Perkins commented years later on his involvement: “I can’t remember at what point it dawned on me that Genentech would probably be the most important deal of my life, in many terms - the returns, the social benefits, the excitement, the technical prowess, and the fun.” Perkins stayed on the board for 20 years, and Kleiner Perkins led several investments in the company over the years. Genentech was eventually acquired by Hoffman-La Roche (now called Roche), which bought 60% of the company for $2B in 1990 and the rest for $47B in 2009.
Genentech was the first big biotech win and helped establish Silicon Valley’s cachet in the process!

Dig Deeper

  • An Overview of Genetic Engineering (the tech underpinning Genentech)

  • The History of Insulin - 100 Years of Innovation by Dr. Daniel Drucker

  • How Drug Prices Work by the Wall Street Journal

  • How to Value Biotech Stocks by the Biotechnology Innovation Organization

  • Wonderful Life: An Interview with Herb Boyer

tags: Biotech, Genentech, Eli Lilly, Orphan Drug Act, Bob Swanson, Paul Berg, National Academy of Science, Stan Cohen, Herb Boyer, Stanford, UCSF, Geoff Hinton, Databricks, Ali Ghodsi, UC Berkeley, Pharma, FDA, Rare Disease
categories: Non-Fiction
 

June 2021 - Letters to the Nomad Partnership 2001-2013 (Nick Sleep's and Qais Zakaria's Investor Letters)

This month we review a unique source of information - mysterious fund manager Nick Sleep’s investment letters. Sleep had an extremely successful run, identifying several very interesting companies and the characteristics that made them great investments. He was early to uncover Amazon, Costco, and others - riding their stocks into the stratosphere over the last 20 years. These letters cover the internet bubble, the 08/09 crisis, and all types of interesting businesses across the world.

The full letters can be found here


Tech Themes

  1. Scale Benefits Shared. Nick Sleep’s favored business model is what he calls Scale Benefits Shared. The idea is straightforward and appears across industries - Geico, Amazon, and Costco all have this business model. It’s simple: companies start with low prices and spend only on the most important things. Over time, as the company scales (more insured drivers, more online orders, more stores), it passes the benefits of scale back to the customer in the form of even lower prices. The consumer then buys more from the low-cost provider. This has a devastating effect on competition - it forces rivals out of the industry, because matching a company that shares its scale benefits requires becoming just as hyper-efficient. “In the case of Costco scale efficiency gains are passed back to the consumer in order to drive further revenue growth. That way customers at one of the first Costco stores (outside Seattle) benefit from the firm’s expansion (into say Ohio) as they also gain from the decline in supplier prices. This keeps the old stores growing too. The point is that having shared the cost savings, the customer reciprocates, with the result that revenues per foot of retailing space at Costco exceed that at the next highest rival (WalMart’s Sam’s Club) by about fifty percent.” Jeff Bezos was also very focused on this; his 2006 annual letter highlighted as much: “Our judgment is that relentlessly returning efficiency improvements and scale economies to customers in the form of lower prices creates a virtuous cycle that leads over the long-term to a much larger dollar amount of free cash flow, and thereby to a much more valuable Amazon.com. We have made similar judgments around Free Super Saver Shipping and Amazon Prime, both of which are expensive in the short term and – we believe – important and valuable in the long term.” So what companies today are returning scale efficiencies to customers?
One recent example is Snowflake - a very expensive solution, but one at least posturing correctly in favor of this model: a recent earnings call highlighted that the company had figured out a better way to store data, resulting in a storage price decrease for customers. Fivetran’s recent cloud data warehouse comparison showed Snowflake was both cheaper and faster than competitors Redshift and BigQuery - a good spot to be in! Another example might be Cloudflare - they are lower cost than any other CDN in the market and have millions of free customers. Improvements made to the core security+CDN engine, threat graph, and POP locations result in better performance for all of their free users, which leads to more free users and more threats, vulnerabilities, and location/network demands - a very virtuous cycle!

  2. The Miracle of Compound Growth & Its Obviousness. While appreciated in some circles, compounding is revered by Warren Buffett and Nick Sleep - it’s a miracle worth celebrating every day. Sleep takes this idea one step further, after discussing how the average holding period of stocks has fallen significantly over the past few decades: “The fund management industry has it that owning shares for a long time is futile as the future is unknowable and what is known is discounted. We respectfully disagree. Indeed, the evidence may suggest that investors rarely appropriately value truly great companies.” This is quite a natural phenomenon as well - when Google IPO’d in 2004 at a whopping $23bn valuation, were investors really valuing the company appropriately? Were Visa (which raised ~$18Bn, the largest US IPO in history at the time) and Mastercard ($5.3Bn valuation) being valued appropriately? Even big companies like Apple, valued at $600Bn in 2016, were arguably not valued appropriately. Hindsight is obvious, but the durability of compounding in great businesses is truly a marvel to behold. That’s why Sleep and Zakaria wound down the partnership in 2014, opting to return LP money and own only Berkshire, Costco, and Amazon for the next decade (so far that’s been a great decision!). While frequently cited as a key investing principle, compounding in technology, experiences, art, and life is rarely discussed, maybe because it is too obvious. Examples of compounding (re-investing interest/dividends and waiting) abound: Moore’s Law, Picasso’s art training, Satya Nadella’s experience running Bing and Azure before becoming CEO, and the Beatles playing clubs for years before breaking onto the scene. Compounding is a universal law that applies to so much!

  3. Information Overload. Sleep makes a very important but subtle point toward the end of his letters about the importance of reflective thinking:

    BBC Interviewer: “David Attenborough, you visited the North and South Poles, you witnessed all of life in-between from the canopies of the tropical rainforest to giant earthworms in Australia, it must be true, must it not, and it is a quite staggering thought, that you have seen more of the world than anybody else who has ever lived?”

    David Attenborough: “Well…I suppose so…but then on the other hand it is fairly salutary to remember that perhaps the greatest naturalist that ever lived and had more effect on our thinking than anybody, Charles Darwin, only spent four years travelling and the rest of the time thinking.”

    Sleep: “Oh! David Attenborough’s modesty is delightful but notice also, if you will, the model of behaviour he observed in Charles Darwin: study intensely, go away, and really think.”

    There is no doubt that the information age has ushered in a new normal for daily data flow and news. New information is constant, and people can be up to date on everything, all the time. While there are benefits to an always-on world, the pace of information flow can be overwhelming and cause companies and individuals to lose sight of important strategic decisions. Bill Gates famously took a “think week” each year, locking himself in a cabin with no internet connection to read through hundreds of proposals from Microsoft employees. A Harvard study showed that reflection can even improve job performance. Sometimes the constant data flow can be a distraction from what might be a very obvious decision given a set of circumstances. Remember to take some time to think!
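The arithmetic behind the compound-growth theme above is worth seeing on paper. A minimal sketch, with purely illustrative numbers (not figures from the letters):

```python
def future_value(principal, annual_rate, years):
    """Compound growth: reinvest all returns each year."""
    return principal * (1 + annual_rate) ** years

# Illustrative: $10,000 compounding at 10% vs. 20% for a 20-year hold.
print(round(future_value(10_000, 0.10, 20)))  # 67275  (~6.7x)
print(round(future_value(10_000, 0.20, 20)))  # 383376 (~38x)
```

The punchline: over a 20-year hold, doubling the annual rate multiplies the outcome roughly six-fold, not two-fold. That asymmetry is exactly why Sleep argues the market rarely values truly great compounders appropriately.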


Business Themes

  1. Psychological Mistakes. Sleep touches on several different psychological problems and challenges within investing and business, including the role of social proof in decision making. Social proof occurs when individuals look to others to determine how to behave in a given situation. A classic example comes from an experiment by psychologist Stanley Milgram, in which he had groups of people stare up at the sky on a crowded street corner in New York City. When five people stood looking up (as opposed to a single person), many more passersby also stopped to look up, driven by the group behavior. This principle shows up all the time in business and is a major driver of financial bubbles. People see others making successful investments at high valuations, and that drives them to do the same. It can also drive product and strategic decisions - companies launching dot-com names in the 1990s to drive their stock price up, companies launching corporate venture arms in rising markets, companies today deciding they need a down-market “product-led growth” engine. As famed investor Stan Druckenmiller notes, it’s hard to sit idly by while others (who may be less informed) crush certain types of investments: “I bought $6 billion worth of tech stocks, and in six weeks I had lost $3 billion in that one play. You asked me what I learned. I didn’t learn anything. I already knew that I wasn’t supposed to do that. I was just an emotional basket case and I couldn’t help myself. So maybe I learned not to do it again, but I already knew that.”

  2. Incentives, Psychology, and Ownership Mindset. Incentives are incredibly powerful in business, and it’s surprisingly difficult to get people to do the right thing. Sleep spends a lot of time on incentives and the so-called principal-agent conflict. Oftentimes the principal (owner, boss, purchaser, etc.) employs an agent (employee, contractor, service) to accomplish something. However, the goals and priorities of the principal may not align with those of the agent. As an example, when your car breaks down and you take it to a local mechanic, you (the principal) want someone to fix the car as well and as cheaply as possible. The agent (the mechanic), however, may be incentivized to run up the biggest bill possible to drive business for their garage. Here we see the potential for misaligned incentives. After 5 years of really strong investment results, Sleep and Zakaria noticed a misaligned incentive of their own: “Which brings me to the subject of the existing performance fee. Eagle-eyed investors will not have failed to notice the near 200 basis point difference between gross and net performance this year, reflecting the performance fee earned. We are in this position because performance for all investors is in excess of 6% per annum compounded. But given historic performance, that may be the case for a very long time. Indeed, we are so far ahead of the hurdle that if the Partnership now earned pass-book rates of return, say 5% per annum, we would continue to “earn” 20% performance fees (1% of assets) for thirty years, that is, until the hurdle caught up with actual results. During those thirty years, which would see me through to retirement, we would have added no value over the money market rates you can earn yourself, but we would still have been paid a “performance fee”. 
We are only in this position because we have done so well, and one could argue that contractually we have earned the right by dint of performance, but just look at the conflicts!” They could have invested in treasury bonds and collected a performance fee for years to come, but they knew that was unfair to limited partners. So the duo created a resetting fee structure that allowed LPs to claw back performance fees if Nomad did not exceed the 6% hurdle rate in a given year. This kept the pair focused on driving continued strong results through the life of the partnership.

  3. Discovery & Pace. Nick Sleep and Qais Zakaria looked for interesting companies in interesting situations. Their pace is simply astounding: “When Zak and I trawled through the detritus of the stock market these last eighteen months (around a thousand annual reports read and three hundred companies interviewed)…” Sleep and Zakaria put up numbers: 55 annual reports per month (~2 per day) and 17 companies interviewed per month (a meeting every other day)! That is so much reading. It’s partly unsurprising that after a while they started to find things in annual reports that piqued their interest. Not only did they find retrospectively obvious gems like Amazon and Costco, they also looked all around the world for mispricings and interesting opportunities. One of their successful international investments took place in Zimbabwe, where they noticed significant mispricing involving the Harare Stock Exchange, which opened in 1896 but only started allowing foreign investment in 1993. While Nomad certainly made its name on the Scale Benefits Shared investment model, Zimbabwe gave Sleep and Zakaria a chance to deploy their second model: “We have little more than a handful of distinct investment models, which overlap to some extent, and Zimcem is a good example of a second model namely, ‘deep discount to replacement cost with latent pricing power.’” Zimcem was the country’s second-largest cement producer, which traded at a massive discount to replacement cost due to terrible business conditions (inflation growing faster than the price of cement). Not only did Sleep find a weird, mispriced asset, he also employed a unique way of acquiring shares to further increase his margin of safety. “The official exchange rate at the time of writing is Z$9,100 to the U$1. The unofficial, street rate is around Z$17,000 to the U$1. 
In other words, the Central Bank values its own currency at over twice the price set by the public with the effect that money entering the country via the Central Bank buys approximately half as much as at the street rate. Fortunately, there is an alternative to the Central Bank for foreign investors, which is to purchase Old Mutual shares in Johannesburg, re-register the same shares in Harare and then sell the shares in Harare. This we have done.” By doing this, Nomad was able to purchase shares at a discounted exchange rate (they would also face the exchange rate on sale, so the margin of safety was not entirely increased). Weird, off-the-beaten-path investments and companies can offer rich rewards to those who are patient. This was the approach Warren Buffett employed early in his career, until he started focusing on “wonderful businesses” at Charlie Munger’s recommendation.
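The performance-fee conflict in the second theme above is, at bottom, a race between two compounding lines: the fund’s actual record and the 6% hurdle. A quick sketch with assumed numbers (the one-third head start is hypothetical, chosen only to reproduce the letter’s “thirty years” figure):

```python
import math

# Assumed: the fund's cumulative record sits about one third ahead of the
# cumulative 6% p.a. hurdle after the strong early years.
head_start = 4 / 3          # fund index divided by hurdle index (hypothetical)
fund_rate = 0.05            # fund now earns pass-book rates of ~5% p.a.
hurdle_rate = 0.06          # fees accrue while cumulative return beats 6% p.a.

# The hurdle catches up when head_start * ((1 + fund_rate) / (1 + hurdle_rate)) ** n = 1,
# i.e. n = ln(head_start) / ln((1 + hurdle_rate) / (1 + fund_rate)).
years = math.log(head_start) / math.log((1 + hurdle_rate) / (1 + fund_rate))
print(round(years))  # 30 - three decades of fees for money-market-like results
```

Sleep and Zakaria’s resetting fee structure addressed exactly this: fees earned in years when the hurdle was not beaten could be clawed back, so a historical head start could no longer be milked indefinitely.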

Dig Deeper

  • Overview of Several Scale Economies Shared Businesses

  • Investor Masterclass Learnings from Nick Sleep

  • Warren Buffett & Berkshire’s Compounding

  • Jim Sinegal (Costco Founder / CEO) - Provost Lecture Series Spring 2017

  • Robert Cialdini - Mastering the Seven Principles of Influence and Persuasion

tags: Costco, Warren Buffett, Berkshire Hathaway, Geico, Jim Sinegal, Cloudflare, Snowflake, Visa, Mastercard, Google, Fivetran, Walmart, Apple, Azure, Bing, Satya Nadella, Beatles, Picasso, Moore's Law, David Attenborough, Nick Sleep, Qais Zakaria, Charles Darwin, Bill Gates, Microsoft, Stanley Druckenmiller, Charlie Munger, Zimbabwe, Harare
categories: Non-Fiction
 

May 2021 - Crossing the Chasm by Geoffrey Moore

This month we take a look at a classic high-tech growth marketing book. Originally published in 1991, Crossing the Chasm became a beloved book within the tech industry although its glory seems to have faded over the years. While the book is often overly prescriptive in its suggestions, it provides several useful frameworks to address growth challenges primarily early on in a company’s history.

Tech Themes

  1. Technology Adoption Life Cycle. The core framework of the book discusses the evolution of new technology adoption. It is an interesting micro-view of the broader phenomenon described in Carlota Perez’s Technological Revolutions. In Moore’s chasm-crossing world, there are five personas that dominate adoption: innovators, early adopters, early majority, late majority, and laggards. Innovators are technologists, happy to accept more challenging user experiences to push the boundaries of their capabilities and knowledge. Early adopters are intuitive buyers that enjoy trying new technologies but want a slightly better experience. The early majority are “wait and see” folks that want others to battle-test the technology before trying it out, but don’t typically wait too long before buying. The late majority want significant reference material and usage before buying a product. Laggards simply don’t want anything to do with new technology. It is interesting to think of this adoption pattern in concert with the big technology migrations of the past twenty years: mainframes to on-premise servers to cloud computing, home phones to cell phones to iPhone/Android, radio to CDs to downloadable music to Spotify, and cash to check to credit/debit to mobile payments. Each of these massive migration patterns feels very aligned with this adoption model. Everyone knows someone ready to apply the latest tech, and someone who doesn’t want anything to do with it (Warren Buffett!).

  2. Crossing the Chasm. If we accept the above as a general way products are adopted by society (obviously it’s much more of a mishmash in reality), we can posit that the most important step is from the early adopters to the early majority - the spot where the bell curve really opens up. This is what Geoffrey Moore calls crossing the chasm. This idea is highly reminiscent of Clay Christensen’s “not good enough” disruption pattern and Gartner’s technology hype cycle. The examples Moore uses (in 1991) are also striking: neural networking software and desktop video conferencing. Moore lamented: “With each of these exciting, functional technologies it has been possible to establish a working system and to get innovators to adopt it. But it has not as yet been possible to carry that success over to the early adopters.” Both of these technologies have clearly crossed into the mainstream, with Google’s TensorFlow machine learning library and video conferencing tools like Zoom that make it super easy to speak with anyone over video instantly. So what was the great unlock that made these commercially viable and successfully adopted products? Since 1990 there have been major changes in several important underlying technologies - computer storage and data processing capabilities are almost limitless with cloud computing, network bandwidth has grown exponentially while costs have dropped, and software has greatly improved the ability to build great user experiences for customers. This is a version of not-good-enough technologies benefiting substantially from changes in underlying inputs. The systems you could deploy in 1990 were simply not comparable to what you can deploy today. The real question is: are there different types of adoption curves for different technologies, and do they really follow a normal distribution as Moore suggests?

  3. Making Markets & Product Alternatives. Moore positions the book as if you were a marketing executive at a high-tech company and offers several exercises to help you identify a target market, customer, and use case. Chapter six, “Define the Battle,” covers the best way to position a product within a target market. In early markets, competition comes from non-consumption, and the company has to offer a “whole product” that enables the user to actually derive benefit from the product. Thus, Moore recommends targeting innovators and early adopters - technologist visionaries able to see the benefit of the product. This also mirrors Clayton Christensen’s commoditization/de-commoditization framework, where new-market products must offer all of the core components of a system combined into one solution; over time, the axis of commoditization shifts toward the underlying components as companies differentiate by using faster and better sub-components. Positioning in these early-market scenarios should focus on the contrast between your product and legacy ways of performing the task (use our software instead of pen and paper, as an example). In mainstream markets, companies should position their products within the established buying criteria developed by pragmatist buyers. A market alternative serves as the incumbent, well-known provider, and a product alternative is a fellow upstart competitor that you are clearly beating. What’s odd here is that you are constantly referring to your competitors as alternatives to your product, which seems counter-intuitive, but obviously enterprise buyers have alternatives they are considering, and you need to make the case that your solution is the best. Choosing a market alternative lets you tap a budget previously used for a similar solution, and the product alternative can help differentiate your technology relative to other upstarts. 
Moore’s simple positioning formula has helped hundreds of companies establish their go-to-market message: “For (target customers—beachhead segment only) • Who are dissatisfied with (the current market alternative) • Our product is a (new product category) • That provides (key problem-solving capability). • Unlike (the product alternative), • We have assembled (key whole product features for your specific application).”
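The bell curve behind the Technology Adoption Life Cycle (theme 1 above) is usually drawn with Everett Rogers’ standard-deviation cutoffs, which Moore borrows. Under that convention the familiar segment sizes fall straight out of the normal distribution; a quick sketch:

```python
from math import erf, sqrt

def norm_cdf(z):
    """Standard normal cumulative distribution function."""
    return 0.5 * (1 + erf(z / sqrt(2)))

# Segment boundaries, in standard deviations from the mean adoption time.
segments = {
    "innovators":     norm_cdf(-2),                 # earlier than -2 sigma
    "early adopters": norm_cdf(-1) - norm_cdf(-2),  # -2 to -1 sigma
    "early majority": norm_cdf(0)  - norm_cdf(-1),  # -1 sigma to the mean
    "late majority":  norm_cdf(1)  - norm_cdf(0),   # mean to +1 sigma
    "laggards":       1 - norm_cdf(1),              # beyond +1 sigma
}
for name, share in segments.items():
    print(f"{name}: {share:.1%}")
# prints roughly 2.3%, 13.6%, 34.1%, 34.1%, 15.9%
```

Whether real adoption actually follows a normal curve is exactly the open question raised above; these cutoffs are a modeling convention, not an empirical law.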

Business Themes

  1. What happened to these examples? Moore offers a number of examples of crossing the chasm, but what actually happened to these companies after the book was written? Clarify Software was bought in October 1999 by Nortel for $2.1B (a 16x revenue multiple) and then divested by Nortel to Amdocs in October 2001 for $200M - an epic disaster of capital allocation. Documentum was acquired by EMC in 2003 for $1.7B in stock and was later sold to OpenText in 2017 for $1.6B. 3Com’s Palm Pilot was a mess of acquisitions and divestitures: Palm was acquired by U.S. Robotics, which was acquired by 3Com in 1997, and Palm was then spun out in a 2000 IPO that subsequently saw a 94% drop. Palm stopped making PDA devices in 2008, and in 2010 HP acquired Palm for $1.2B in cash. Smartcard maker Gemplus merged with competitor Axalto in a 1.8Bn euro deal in 2005, creating Gemalto, which was later acquired by Thales in 2019 for $8.4Bn. So my three questions are: Did these companies really cross the chasm, or were they just readily available success stories of their time? Do you need to be the company that leads the chasm crossing, or can someone else do it to your benefit? And what is the next step in the chasm journey after it’s crossed - why did so many of these companies fail after a time?

  2. Whole Products. Moore leans into an idea called the Whole Product Concept which was popularized by Theodore Levitt’s 1983 book The Marketing Imagination and Bill Davidow’s (of early VC Mohr Davidow) 1986 book Marketing High Technology. Moore explains the idea: “The concept is very straightforward: There is a gap between the marketing promise made to the customer—the compelling value proposition—and the ability of the shipped product to fulfill that promise. For that gap to be overcome, the product must be augmented by a variety of services and ancillary products to become the whole product.” There are four different perceptions of the product: “1. Generic product: This is what is shipped in the box and what is covered by the purchasing contract. 2.Expected product: This is the product that the consumer thought she was buying when she bought the generic product. It is the minimum configuration of products and services necessary to have any chance of achieving the buying objective. For example, people who are buying personal computers for the first time expect to get a monitor with their purchase-how else could you use the computer?—but in fact, in most cases, it is not part of the generic product. 3.Augmented product: This is the product fleshed out to provide the maximum chance of achieving the buying objective. In the case of a personal computer, this would include a variety of products, such as software, a hard disk drive, and a printer, as well as a variety of services, such as a customer hotline, advanced training, and readily accessible service centers. 4. Potential product: This represents the product’s room for growth as more and more ancillary products come on the market and as customer-specific enhancements to the system are made. 
These are the product features that may be expected or added over time to drive adoption.” Moore makes a subtle point: after a while, investments in generic, out-of-the-box product functionality drive less and less purchase behavior, in tandem with broader market adoption. Customers want to be wooed by the latest technology, and as products become similar, they care less about what’s in the product today and more about what’s coming. Moore emphasizes Whole Product Planning, where you map how those additional features get into the product over time - but Moore was also operating in an era when product decisions and development processes ran on two-year-plus timelines, not in today’s DevOps era, where product updates are sometimes pushed daily. In the bottoms-up/DevOps era, it’s become clear that finding your niche users, driving strong adoption from them, and integrating their feature ideas as soon as possible can yield a big success.

  3. Distribution Channels. Moore walks through each of the potential ways a company can distribute its solutions: direct sales, two-tier retail, one-tier retail, internet retail, two-tier value-added reselling, national roll-ups, original equipment manufacturers (OEMs), and system integrators. As Moore puts it, “The number-one corporate objective, when crossing the chasm, is to secure a channel into the mainstream market with which the pragmatist customer will be comfortable.” These distribution types are clearly relics of technology distribution in the early 1990s. Direct sales produced some of the best and biggest technology companies of yesterday, including IBM, Oracle, CA Technologies, SAP, and HP. What’s so fascinating about this framework is that you just need one channel to reach the pragmatist customer, and in the last ten years that channel has become the internet for many technology products. Moore even recognizes that direct sales produced poor customer alignment: “First, wherever vendors have been able to achieve lock-in with customers through proprietary technology, there has been the temptation to exploit the relationship through unfairly expensive maintenance agreements [Oracle did this big time] topped by charging for some new releases as if they were new products. This was one of the main forces behind the open systems rebellion that undermined so many vendors’ account control—which, in turn, decreased predictability of revenues, putting the system further in jeopardy.” So what is the strategy used by the popular open-source, bottoms-up go-to-market motions at companies like Github, Hashicorp, Redis, Confluent, and others? It’s straightforward: the internet and simple APIs (normally on Github) provide the fastest channel to reach the developer end market while they are coding. 
When you look at open-source scaling, it can take years and years to cross the chasm because most early open-source adopters are technology innovators. Eventually, though, these solutions permeate massive enterprises and make the jump. With these new internet-driven go-to-market motions, we’ve seen large companies grow primarily through inbound marketing rather than direct outbound sales. The companies named above, as well as Shopify, Twilio, Monday.com, and others, have done a great job growing to massive scale on the backs of their products (product-led growth) instead of a salesforce. What’s important to realize is that distribution is an abstract term, and no single motion or strategy is right for every company. The next distribution channel will surprise everyone!

Dig Deeper

  • How the sales team behind Monday is changing the way workplaces collaborate

  • An Overview of the Technology Adoption Lifecycle

  • A Brief History of the Cloud at NDC Conference

  • Frank Slootman (Snowflake) and Geoffrey Moore Discuss Disruptive Innovations and the Future of Tech

  • Growth, Sales, and a New Era of B2B by Martin Casado (GP at Andreessen Horowitz)

  • Strata 2014: Geoffrey Moore, "Crossing the Chasm: What's New, What's Not"

tags: Crossing the Chasm, Github, Hashicorp, Redis, Monday.com, Confluent, Open Source, Snowflake, Shopify, Twilio, Geoffrey Moore, Gartner, TensorFlow, Google, Clayton Christensen, Zoom, Nortel, Amdocs, OpenText, EMC, HP, CA, IBM, Oracle, SAP, Gemalto, DevOps
categories: Non-Fiction
 

April 2021 - Innovator's Solution by Clayton Christensen and Michael Raynor

This month we take another look at disruptive innovation in the counterpart to Clayton Christensen’s Innovator’s Dilemma, our July 2020 book. The book crystallizes the types of disruptive innovation and provides frameworks for how incumbents can introduce or combat these innovations. It was a pleasure to read and will serve as a great reference for the future.

Tech Themes

  1. Integration and Outsourcing. Today, technology companies rely on a variety of software tools and open-source components to build their products. When you stitch all of these components together, you get the full product architecture. A great example is Gitlab, an SMB DevOps provider: it uses Postgres as a relational database, Redis for caching, NGINX for request routing, Sentry for monitoring and error tracking, and so on. Each of these subsystems interacts with the others to form the powerful Gitlab product, and the interaction points are called interfaces. The key product development question for companies is: “Which things do I build internally and which do I outsource?” A simple answer offered by many MBA students is “Outsource everything that is not part of your core competence.” As Clayton Christensen points out, “The problem with core-competence/not-your-core-competence categorization is that what might seem to be a non-core activity today might become an absolutely critical competence to have mastered in a proprietary way in the future, and vice versa.” A great example that we’ve discussed before is IBM’s decision to go with Microsoft DOS for its operating system and Intel for its microprocessor. At the time, IBM thought it was making a strategic decision to outsource things that were not within its core competence, but it inadvertently gave almost all of the industry profits from personal computing to Intel and Microsoft. Competitors copied the modular approach, and the whole industry slugged it out on price. Whether to outsource really depends on what might be important in the future. But that is difficult to predict, so the question of integration vs. outsourcing comes down to the state of the product and market itself: is this product “not good enough” yet? If the answer is yes, then a proprietary, integrated architecture is likely needed just to make the actual product work for customers. 
Over time, as competitors enter the market and the fully integrated platform becomes more commoditized, the individual subsystems become increasingly important competitive drivers. So the decision to outsource or build internally must be made based on the state of the product and the market it’s attacking.
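The Gitlab architecture described above can be sketched as a toy component map, where each outsourced subsystem sits behind an explicit interface. The component names come from the text; the data structures and function are illustrative, not Gitlab's actual design:

```python
# Toy model of a product architecture: a core product composed of
# subsystems connected through interfaces. Component roles are from the
# Gitlab example in the text; the structure itself is a sketch.

SUBSYSTEMS = {
    "postgres": "relational database",
    "redis": "caching",
    "nginx": "request routing",
    "sentry": "monitoring and error tracking",
}

# Interfaces: which subsystems the core application talks to directly.
INTERFACES = {
    "gitlab-core": ["postgres", "redis", "nginx", "sentry"],
}

def outsourcing_candidates(interfaces, core="gitlab-core"):
    """Anything behind a clean interface is a candidate to outsource.
    The hard question (per Christensen) is which of these becomes a
    critical, proprietary competence later."""
    return sorted(interfaces[core])

print(outsourcing_candidates(INTERFACES))  # ['nginx', 'postgres', 'redis', 'sentry']
```

The point of modeling it this way: the list of outsourcing candidates is easy to compute, but nothing in the architecture tells you which subsystem will matter competitively in five years.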

  2. Commoditization within Stacks. The above point leads to the counter-intuitive way companies fall into the commoditization trap. This happens through overshooting, where companies create products that are too good (who knew that doing your job really well could cause customers to leave!). Christensen describes this through the lens of a salesperson: “‘Why can’t they see that our product is better than the competition? They’re treating it like a commodity!’ This is evidence of overshooting…there is a performance surplus. Customers are happy to accept improved products, but unwilling to pay a premium price to get them.” At this point, the things demanded by customers flip - they become willing to pay premium prices for innovations along a new trajectory of performance, most likely speed, convenience, and customization. “The pressure of competing along this new trajectory of improvement forces a gradual evolution in product architectures, away from the interdependent, proprietary architectures that had the advantage in the not-good-enough era toward modular designs in the era of performance surplus. In a modular world, you can prosper by outsourcing or by supplying just one element.” This cycle from integration to modularization and back is super fascinating. As an example of modularization, take Confluent, the company commercializing the open-source streaming project Apache Kafka. Confluent offers a real-time communications service that allows companies to stream data (as events) rather than batching large data transfers. Its product is often a sub-system underpinning real-time applications, like providing data to traders at Citigroup. Clearly, the basis of competition in trading has pivoted over the years as more and more banks offer the service. 
Companies are prioritizing a new axis, speed, to differentiate among competing services, and when speed is the basis of competition, you use Confluent and Kafka to beat out the competition. Now fast forward five years and assume all banks use Kafka and Confluent for their traders; the modular sub-system is thus commoditized. What happens? I’d posit that the axis would shift again, perhaps toward convenience or customization, where traders want specific info displayed on a mobile phone or tablet. The fundamental idea is that “Disruption and commoditization can be seen as two sides of the same coin. That’s because the process of commoditization initiates a reciprocal process of de-commoditization [somewhere else in the stack].”
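The events-versus-batches distinction behind Kafka-style streaming can be shown with a toy sketch. This is a conceptual contrast only, not the actual Kafka API:

```python
# Toy contrast between batch transfer and event streaming.
# With batching, a consumer waits until a full batch accumulates;
# with streaming, each event is delivered as soon as it occurs.

def batch_transfer(events, batch_size):
    """Deliver events in fixed-size batches (plus a final partial batch)."""
    batches, buffer = [], []
    for e in events:
        buffer.append(e)
        if len(buffer) == batch_size:
            batches.append(list(buffer))
            buffer.clear()
    if buffer:
        batches.append(list(buffer))
    return batches

def stream(events, handler):
    """Deliver each event to the consumer immediately."""
    for e in events:
        handler(e)

trades = ["buy AAPL", "sell MSFT", "buy GOOG"]

received = []
stream(trades, received.append)   # consumer sees each trade as it happens
print(received)
print(batch_transfer(trades, 2))  # consumer waits for groups of trades
```

For a trader, the latency difference between the two delivery models is exactly the "speed" axis of competition described above.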

  3. The Disruptive Becomes the Disruptor. Disruption is a relative term. As we’ve discussed previously, disruption is often mischaracterized as startups entering markets and challenging incumbents. Disruption is really a focused and contextual concept whereby products that are “not good enough” by market standards enter a market with a simpler, more convenient, or less expensive product. These products and markets are often dismissed by incumbents or even ceded by market leaders as those leaders continue to move up-market to chase even bigger customers. It’s fascinating to watch the disruptive become the disrupted. A great example is department stores - initially, Macy’s offered a massive selection that couldn’t be found in any single store, and customers loved it. They did this by turning inventory three times per year at 40% gross margins, for a 120% return on capital invested in inventory. In the 1960s, Walmart and Kmart attacked the full-service department stores by offering a similar selection at much cheaper prices. They did this by setting up a value system whereby they could make 23% gross margins but turn inventories five times per year, enabling them to earn roughly the industry's golden ~120% return on capital invested in inventory. Full-service department stores decided not to compete against these lower gross margin products and shifted more space to beauty and cosmetics, which offered even higher gross margins (55%) than the 40% they were used to. This meant they could increase their return on capital invested in inventory and their profits while avoiding a competitive threat. This process continued, with discount stores eventually pushing Macy’s out of most categories until Macy’s had nowhere to go. All of a sudden, the initially disruptive department stores had become disrupted. We see this in technology markets as well. I’m not 100% sure this qualifies, but think about Salesforce and Oracle. 
Marc Benioff had spent a number of years at Oracle and left to start Salesforce, which pioneered subscription cloud software sold on a per-seat model. This made it a much cheaper option than traditional Oracle/Siebel CRM software. Salesforce was initially adopted by smaller customers that didn’t need the feature-rich platform offered by Oracle. Oracle dismissed Salesforce as competition even as Oracle CEO Larry Ellison seeded Salesforce and sat on its board. Today, Salesforce is a $200B company and briefly passed Oracle in market cap a few months ago. But now Salesforce has raised its prices and mostly targets large enterprise buyers to hit its ambitious growth targets. Down-market competitors like HubSpot have come into the market with cheaper solutions and more fully integrated marketing tools to help smaller businesses that aren’t ready for a fully featured Salesforce platform. Disruption is always contextual, and it never stops.
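The retail math in the Macy's/Walmart example can be checked directly. Return on capital invested in inventory is approximately gross margin times inventory turns per year (a simplification consistent with the figures quoted above; note the discounter model lands near, not exactly at, 120%):

```python
# Checking the department-store inventory arithmetic from the text.

def return_on_inventory_capital(gross_margin, turns_per_year):
    """Annual gross profit earned per dollar tied up in inventory."""
    return gross_margin * turns_per_year

# Full-service department store: fat margins, slow turns.
macys = return_on_inventory_capital(0.40, 3)
# Discounter: thin margins, fast turns.
walmart = return_on_inventory_capital(0.23, 5)

print(f"Macy's: {macys:.0%}, Walmart/Kmart: {walmart:.0%}")  # 120% vs 115%
```

Two structurally different operating models arrive at nearly the same return on inventory capital, which is exactly why the discounters could profitably attack from below.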

Business Themes

  1. Low-end-Market vs. New-Market Disruption. There are two established types of disruption: low-end-market (down-market) and new-market. Low-end-market disruption seeks to establish performance that is “not good enough” along traditional lines, and targets overserved customers in the low end of the mainstream market. It typically utilizes a new operating or financial approach with structurally different margins than up-market competitors. Amazon.com is a quintessential low-end-market disruptor compared to traditional bookstores, offering prices so low they angered book publishers while offering customers the unmatched convenience of purchasing books online. In contrast, Robinhood is a great example of a new-market disruption. Traditional discount brokerages like Charles Schwab and Fidelity had been around for a while (themselves disruptors of full-service models like Morgan Stanley Wealth Management). But Robinhood targeted a group of people who weren’t consuming in the market, namely teens and millennials, and it did so with an easy-to-use app with a much better user interface than Schwab’s or Fidelity’s. Robinhood also pioneered new pricing with zero-fee trading and made revenue via a new financial approach, payment for order flow (PFOF): large market makers, like Citadel, pay Robinhood to route its customers’ orders to them for execution, which helps them optimize customers’ buying and selling prices. When approaching big markets it’s important to ask: Is this targeted at a non-consumer today, or am I competing at a structurally lower margin with a new financial model and a “not quite good enough” product? This determines whether you are providing a low-end-market disruption or a new-market disruption.

  2. Jobs To Be Done. The jobs-to-be-done framework is one of the most important frameworks Clayton Christensen ever introduced. Marketers typically use advertising platforms like Facebook and Google to target specific demographics with their ads, and these segments are narrowly defined: “Males over 55, living in New York City, with household income above $100,000.” The issue with this categorization method is that while these attributes may be correlated with a product purchase, customers do not line up exactly with how marketers expect them to behave, nor purchase the products their attributes predict. There may be a correlation, but simply targeting certain demographics does not yield a great result. Marketers need to understand why the customer is adopting the product, and this is where the Jobs to Be Done framework comes in. As Christensen describes it, “Customers - people and companies - have ‘jobs’ that arise regularly and need to get done. When customers become aware of a job that they need to get done in their lives, they look around for a product or service that they can ‘hire’ to get the job done. Their thought processes originate with an awareness of needing to get something done, and then they set out to hire something or someone to do the job as effectively, conveniently, and inexpensively as possible.” Christensen zeroes in on the contextual adoption of products; it is the circumstance, not the demographics, that matters most. He describes ways to view competition and feature development through the Jobs to Be Done lens using Blackberry (later disrupted by the iPhone) as an example. While the immature smartphone market was seeing feature competition from Microsoft, Motorola, and Nokia, Blackberry and its parent company RIM came out with a simple-to-use device that allowed for short productivity bursts whenever time was available. 
This meant they leaned into features that competed not with other smartphone providers (like better cellular reception), but rather enabled those easy “productive” sessions: email, Wall Street Journal updates, and simple games. The Blackberry was later disrupted by the iPhone, which offered more interesting applications in an easier-to-use package. Interestingly, the first iPhone shipped without an app store (as a proprietary, interdependent product) and was viewed as not good enough for work purposes, allowing the Blackberry to co-exist; RIM’s management even dismissed the iPhone as a competitor initially. It wasn’t long until the iPhone caught up and eventually surpassed the Blackberry as the world’s leading mobile phone.

  3. Brand Strategies. Companies may choose to address customers in a number of different circumstances and address a number of Jobs to Be Done. It’s important that a company establish specific ways of communicating the circumstance to the customer. Branding is powerful, something that Warren Buffett, Terry Smith, and Clayton Christensen have all recognized as a durable growth driver. As Christensen puts it: “Brands are, at the beginning, hollow words into which marketers stuff meaning. If a brand’s meaning is positioned on a job to be done, then when the job arises in a customer’s life, he or she will remember the brand and hire the product. Customers pay significant premiums for brands that do a job well.” So what can a large corporate company do when faced with a disruptive challenger to its branding turf? It’s simple - add a word to the leading brand, targeted at the circumstance in which a customer might find themselves. Think about Marriott, one of the leading hotel chains. It offers a number of hotel brands: Courtyard by Marriott for business travel, Residence Inn by Marriott for a home away from home, the Ritz-Carlton for high-end luxurious stays, and Marriott Vacation Club for resort destinations. Each brand is targeted at a different Job to Be Done, and customers intuitively understand what the brands stand for based on experience or advertising. A great technology example is Amazon Web Services (AWS), the cloud computing division of Amazon.com. Amazon pioneered the cloud, and rather than launch under the Amazon.com brand, which might have confused its e-commerce customers, it created a completely new brand targeted at a different set of buyers and problems, one that maintained the quality and recognition Amazon had become known for. Another great retail example is the SNKRS app released by Nike. 
Nike understands that some customers are sneakerheads who want to know the latest about all Nike shoe drops, so it created a distinct, branded app called SNKRS that delivers news and updates on the latest, trendiest sneakers. These buyers might not be interested in logging into the main Nike app and might get frustrated sifting through all of the different types of apparel Nike offers just to find new shoes. The SNKRS app gives this set of consumers an easy way to find what they are looking for (convenience), which benefits Nike’s core business. Branding is powerful, and understanding the Job to Be Done helps focus the right brand on the right job.

Dig Deeper

  • Clayton Christensen’s Overview on Disruptive Innovation

  • Jobs to Be Done: 4 Real-World Examples

  • A Peek Inside Marriott’s Marketing Strategy & Why It Works So Well

  • The Rise and Fall of Blackberry

  • Payment for Order Flow Overview

  • How Commoditization Happens

tags: Clayton Christensen, AWS, Nike, Amazon, Marriott, Warren Buffett, Terry Smith, Blackberry, RIM, Microsoft, Motorola, iPhone, Facebook, Google, Robinhood, Citadel, Schwab, Fidelity, Morgan Stanley, Oracle, Salesforce, Walmart, Macy's, Kmart, Confluent, Kafka, Citigroup, Intel, Gitlab, Redis
categories: Non-Fiction
 

March 2021 - Payments Systems in the U.S. by Carol Coye Benson, Scott Loftesness, and Russ Jones

This month we dive into the fintech space for the first time! Glenbrook Partners is a famous payments consulting company. This classic book describes the history and current state of the many financial systems we use every day. While the book is a bit dated and reads like a textbook, it throws in some great real-world observations and provides a great foundation for any payments novice!

Tech Themes

  1. Mapping Open-Loop and Closed-Loop Networks. The major credit and debit card providers (Visa, Mastercard, American Express, China UnionPay, and Discover) all compete for the same spots in customer wallets but have unique and differing backgrounds and mechanics. The first major bank credit card on the scene was the BankAmericard, in the late 1950s. As it took off, Bank of America started licensing the technology all across the US and created National BankAmericard Inc. (NBI) to facilitate its card program. NBI merged with its international counterpart (IBANCO) to form Visa in the mid-1970s. Another group of California banks had created the Interbank Card Association (ICA) to compete with Visa, and in 1979 its card was renamed Mastercard. Both organizations remained owned by the banks until their IPOs in 2006 (Mastercard) and 2008 (Visa). Both companies are known as open-loop networks: they work with any bank and rely on banks to sign up customers and merchants. As the book points out, “This structure allows the two end parties to transact with each other without having direct relationships with each other’s banks.” This convenient feature of open-loop payment systems means they can scale incredibly quickly: any time a bank signs up a new customer or merchant, that customer or merchant immediately has access to the network of all other banks on the Mastercard/Visa network. In contrast, American Express and Discover operate largely closed-loop systems, enrolling each merchant and customer individually. Because of this onerous task of finding and signing up every single consumer and merchant, Amex and Discover cannot scale to nearly the size of Visa/Mastercard. However, there is no bank intermediation, and the networks get total access to all transaction data, making them a go-to solution for things like loyalty programs, where a merchant may want to leverage data to target specific brand benefits at a customer. 
Open-loop systems like Apple Pay (it’s tied to your bank account) and closed-loop systems like the Starbucks app (funds are pre-loaded and can only be redeemed at Starbucks) can be found everywhere. Even Snowflake, the data warehouse provider and subject of last month’s TBOTM, is a closed-loop payments network: customers buy Snowflake credits up-front, which can only be redeemed for Snowflake compute services. In contrast, AWS and other clouds are beginning to offer more open-loop-style networks, where AWS credits can be redeemed against non-AWS software. Side note - these credit systems and odd pricing structures deliberately mislead customers and obfuscate actual costs, allowing the cloud companies to better control gross margins and revenue growth. It’s fascinating to view the world through this open-loop/closed-loop dynamic.
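A toy model shows why open-loop networks scale so much faster: every bank that joins brings all of its customers and merchants into contact with everyone already on the network, while a closed-loop network must enroll each endpoint itself. All numbers below are invented for illustration:

```python
# Toy comparison of open-loop vs. closed-loop network reach, measured as
# the number of possible customer-merchant pairs. Figures are illustrative.

def open_loop_reach(banks, customers_per_bank, merchants_per_bank):
    """Every member bank's customers can transact with every member
    bank's merchants, so reach compounds as banks join."""
    customers = banks * customers_per_bank
    merchants = banks * merchants_per_bank
    return customers * merchants

def closed_loop_reach(enrolled_customers, enrolled_merchants):
    """The network only reaches endpoints it signed up one by one."""
    return enrolled_customers * enrolled_merchants

# 100 member banks, each bringing 10,000 customers and 1,000 merchants:
print(f"open loop:   {open_loop_reach(100, 10_000, 1_000):,} pairs")
# A closed loop that directly enrolled 50,000 customers and 5,000 merchants:
print(f"closed loop: {closed_loop_reach(50_000, 5_000):,} pairs")
```

Signing one more bank multiplies both sides of the open-loop market at once, which is the scaling advantage the text describes for Visa and Mastercard.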

  2. New Kids on the Block - What are Stripe, Adyen, and Marqeta? Stripe recently raised at a minuscule valuation of $95B, making it the highest-valued private startup (ever?!). Marqeta, its API/card-issuing counterpart, is prepping a 2021 IPO that may value it at $10B. Adyen, a Dutch public company, is worth close to $60B (Visa is worth $440B for comparison). Stripe and Marqeta are API-based payment service providers, which let businesses easily accept online payments and issue debit and credit cards for a variety of use cases. Adyen is a merchant account provider, meaning it actually maintains the merchant account used to run a company’s business - this often comes with enormous scale benefits and reduced costs, which is why large customers like Nike have opted for Adyen. The merchant account clearing process can take quite a while, which is why Stripe focuses on SMBs - a business can sign up as a Stripe customer and almost immediately begin accepting online payments on the internet. Stripe’s and Marqeta’s APIs allow a seamless integration into payment checkout flows. On top of this basic but now highly simplified use case, Stripe and Marqeta (and Adyen) allow companies to issue debit and credit cards for all sorts of use cases. This is creating an absolute BOOM in fintech, as companies try new and innovative ways of issuing credit/debit cards - such as expense management, banking-as-a-service, and buy-now-pay-later. Why is this such a big thing now, when Stripe, Adyen, and Marqeta were all created before 2011? In 2016, Visa launched its first developer APIs, which allowed companies like Stripe, Adyen, and Marqeta to become licensed Visa card issuers - now any merchant could issue its own branded Visa card. That is why Andreessen Horowitz’s fintech partner Angela Strange proclaimed: “Every company will be a fintech company” (though this is also clearly some VC marketing)! 
Mastercard followed suit in 2019, launching its open API, the Mastercard Innovation Engine. The big networks decided to support innovation - Visa is an investor in Stripe and Marqeta, AmEx is an investor in Stripe, and Mastercard is an investor in Marqeta. Surprisingly, no network providers are investors in Adyen. Fintech innovation has repeatedly seen the upstarts eclipse the incumbents (Visa and Mastercard are bigger than the banks, with much better business models) - will the same happen here?
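Conceptually, a card-issuing integration boils down to two API calls: create a cardholder, then issue a card against it. The sketch below is hypothetical - the function names and fields are invented for illustration and are not any provider's real API (see Marqeta's or Stripe Issuing's documentation for actual endpoints):

```python
# Hypothetical sketch of an API-based card-issuing flow, in the spirit of
# providers like Marqeta or Stripe Issuing. All names and fields here are
# invented; a real integration would call the provider's hosted API.

import uuid

def create_cardholder(name, email):
    """Register the person or business the card will belong to."""
    return {"id": str(uuid.uuid4()), "name": name, "email": email}

def issue_virtual_card(cardholder, spending_limit_cents):
    """Issue a virtual card tied to the cardholder, with a spend control."""
    return {
        "id": str(uuid.uuid4()),
        "cardholder_id": cardholder["id"],
        "type": "virtual",
        "spending_limit_cents": spending_limit_cents,
        "status": "active",
    }

# e.g. an expense-management product issuing a $500 card per employee:
holder = create_cardholder("Ada Lovelace", "ada@example.com")
card = issue_virtual_card(holder, 50_000)
print(card["status"], card["spending_limit_cents"])
```

The reason this model enables expense management, banking-as-a-service, and buy-now-pay-later is that the issuing step is just an API call: any product can mint cards with programmatic spend controls.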

  3. Building a High Availability System. Do Mastercard and Visa have the highest availability needs of any system? Obviously, people are angry when Slack or Google Cloud goes down, but think about how many people are affected when Visa or Mastercard goes down. In 2018, a UK hardware failure prompted a five-hour outage at Visa: “Disgruntled customers at supermarkets, petrol stations and abroad vented their frustrations on social media when there was little information from the financial services firm. Bank transactions were also hit.” High availability is a measure of system uptime: “Availability is often expressed as a percentage indicating how much uptime is expected from a particular system or component in a given period of time, where a value of 100% would indicate that the system never fails. For instance, a system that guarantees 99% of availability in a period of one year can have up to 3.65 days of downtime (1%).” According to Statista, Visa handles ~185B transactions per year (a cool ~6,000 per second), while UnionPay comes in second with 131B and Mastercard third with 108B. For the twelve months ended June 30, 2020, Visa processed $8.7T in payments volume, which means the average transaction was ~$47. At 6,000 transactions per second, Visa loses roughly $282,000 in payment volume for every second it’s down. Mastercard and Visa have historically been very cagey about disclosing data center operations (the only article I could find is from 2013), though they control their own operations much like other technology giants. “One of the keys to the [Visa] network's performance, Quinlan says, is capacity. And Visa has lots of it. Its two data centers--which are mirror images of each other and can operate interchangeably--are configured to process as many as 30,000 simultaneous transactions, or nearly three times as much as they've ever been asked to handle. 
Inside the pods, 376 servers, 277 switches, 85 routers, and 42 firewalls--all connected by 3,000 miles of cable--hum around the clock, enabling transactions around the globe in near real-time and keeping Visa's business running.” The data infrastructure challenges that payments systems are subjected to are massive and yet they all seem to perform very well. I’d love to learn more about how they do it!
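The availability and volume figures quoted above can be sanity-checked with a few lines of arithmetic (the small differences from the text come from its rounding to 6,000 tps and $47):

```python
# Checking the Visa availability and volume arithmetic from the text.

SECONDS_PER_YEAR = 365 * 24 * 3600

def downtime_days(availability):
    """Days of allowed downtime per year at a given availability level."""
    return (1 - availability) * 365

annual_transactions = 185e9   # Visa transactions per year, per Statista
annual_volume = 8.7e12        # payments volume, twelve months ended 6/30/2020

tps = annual_transactions / SECONDS_PER_YEAR
avg_transaction = annual_volume / annual_transactions
volume_lost_per_second = annual_volume / SECONDS_PER_YEAR

print(f"99% availability allows {downtime_days(0.99):.2f} days down/yr")
print(f"~{tps:,.0f} transactions/second")
print(f"~${avg_transaction:.0f} average transaction")
print(f"~${volume_lost_per_second:,.0f} of volume lost per second of downtime")
```

The exact figures come out to roughly 5,900 transactions per second and about $276K of volume per second, consistent with the text's rounded $282K.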

Business Themes

  1. What is interchange and why does it exist? BigCommerce has a great, simple definition of interchange: “Interchange fees are transaction fees that the merchant's bank account must pay whenever a customer uses a credit/debit card to make a purchase from their store. The fees are paid to the card-issuing bank to cover handling costs, fraud and bad debt costs and the risk involved in approving the payment.” What is crazy about interchange is that it is not the banks but the networks (Mastercard, Visa, China UnionPay) that set interchange rates. On top of that, the networks set the rates but receive no revenue from interchange itself. As the book points out: “Since the card network’s issuing customers are the recipients of interchange fees, the level of interchange that a network sets is an important element in the network’s competitive position. A higher level of interchange on one network’s card products naturally makes that network’s card products more attractive to card issuers.” The incentives here are wild: the card issuers (banks) want higher interchange because they receive the interchange from the merchant’s bank in a transaction, and the card networks want more card-issuing customers, so offering higher interchange rates better positions them in competitive battles. The merchant is left worse off by higher interchange rates, as the merchant’s bank almost always passes the fee on to the merchant itself ($100 received via credit card turns out to be only $97 by the time it reaches the merchant’s bank account because of fees). Visa and Mastercard have different interchange rates for every type of transaction and acceptance method, making it a complicated nightmare to actually understand their fees. The networks and their issuers may claim that increased interchange fees allow banks to invest more in fraud protection, risk management, and handling costs, but there is no way to verify this claim. 
This has caused a crazy war between merchants, the card networks, and the card issuers.
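The $100-to-$97 example above is just the merchant's fees subtracted from the gross amount. A 3% all-in rate is assumed here for illustration; actual rates vary by card type, network, and acceptance method:

```python
# Worked example of the "merchant receives $97 of $100" arithmetic.
# The 3% percentage rate is illustrative, not a real network schedule.

def merchant_net(amount, fee_pct, flat_fee=0.0):
    """Return (amount the merchant keeps, fees deducted)."""
    fees = amount * fee_pct + flat_fee
    return amount - fees, fees

net, fees = merchant_net(100.00, 0.03)
print(f"Merchant keeps ${net:.2f} after ${fees:.2f} in fees")
```

Because interchange is set per transaction type and acceptance method, a real merchant statement applies dozens of such schedules at once, which is the "complicated nightmare" the text refers to.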

  2. Why is Jamie Dimon so pissed about fintechs? In a recent interview, Jamie Dimon, CEO of JP Morgan Chase, called fintechs “examples of unfair competition.” Dimon is angry about the famous (or infamous) Durbin Amendment, a last-minute addition to the landmark Dodd-Frank Wall Street Reform and Consumer Protection Act of 2010. The Durbin Amendment attempted to cap the interchange amount banks could charge and to tier the interchange rates based on the assets of the bank. In theory, capping the rates would mean merchants paid less in fees and would pass these lower fees on to consumers through lower prices, thus spurring demand. The tiering would mean banks with >$10B in assets would make less in interchange fees, leveling the playing field for smaller banks and credit unions. “The regulated [bank with >$10B in assets] debit fee is 0.05% + $0.21, while the unregulated is 1.60% + $0.05. Before the Durbin Amendment the fee was 1.190% + $0.10.” While this did lower debit card interchange, a few unintended consequences resulted: 1. Regulators expected that banks would make substantially less revenue, but they failed to recognize that banks might increase other fees to offset this lost revenue stream: “Banks have cut back on offering rewards for their debit cards. Banks have also started charging more for their checking accounts or they require a larger monthly balance.” In addition, many smaller banks couldn’t recoup the lost revenue, contributing to failures and consolidation. 2. Because a flat fee was introduced regardless of transaction size, smaller merchants were charged more in interchange than under the prior system (which was pro-rated based on dollar amount). “One problem with the Durbin Amendment is that it didn’t take small transactions into account,” said Ellen Cunningham, processing expert at CardFellow.com. 
“On a small transaction, 22 cents is a bigger bite than on a larger transaction. Convenience stores, coffee shops and others with smaller sales benefited from the original system, with a lower per-transaction fee even if it came with a higher percentage.” These small retailers ended up raising prices in some instances to combat the additional fees, giving the law the opposite of its intended effect of lowering costs to consumers. Dimon is angry that this law has allowed fintech companies, by partnering with exempt smaller banks, to earn higher interchange on debit card transactions. As shown above, smaller banks earn substantially more in interchange fees. These smaller banks are moving quickly to partner with fintechs, which now power hundreds of millions of dollars in account balances, and Dimon believes they are not paying enough attention to anti-money-laundering and fraud practices. In addition, fintechs make money in suspect ways: Chime makes 21% of its revenue through high out-of-network ATM fees, and cash advance companies like Dave, Branch, and Earnin’ offer what amount to payday loans to customers.
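The small-transaction effect Cunningham describes is easy to verify with a quick calculation. The rates below come from the fee schedules quoted above; the $2-vs-$100 merchant examples are illustrative:

```python
def interchange_fee(amount, rate, fixed):
    """Interchange fee = percentage of the transaction plus a fixed per-swipe fee."""
    return amount * rate + fixed

# Fee schedules from the quote above, as (rate, fixed $) pairs
PRE_DURBIN = (0.0119, 0.10)   # 1.19% + $0.10
REGULATED = (0.0005, 0.21)    # 0.05% + $0.21 (banks with > $10B in assets)

for amount in (2.00, 100.00):  # a small coffee-shop sale vs. a larger purchase
    before = interchange_fee(amount, *PRE_DURBIN)
    after = interchange_fee(amount, *REGULATED)
    print(f"${amount:,.2f} sale: pre-Durbin ${before:.2f} ({before / amount:.1%}), "
          f"regulated ${after:.2f} ({after / amount:.1%})")
```

On a $2 sale the regulated fee (~$0.21) eats over 10% of the transaction, versus about 6% before the amendment; on a $100 sale the fee falls from ~$1.29 to ~$0.26. Small tickets got more expensive even as large ones got cheaper.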

  3. Mastercard and Visa: A history of regulation. Visa and Mastercard have been the subject of many regulatory battles over the years. The US Justice Department announced in March that it would be investigating Visa over online debit-card practices. In 1996, Visa and Mastercard were sued by merchants and settled for $3B. In 1998, the Department of Justice brought and ultimately won a case against Visa and Mastercard for not allowing issuing banks to work with other card networks like AmEx and Discover. In 2009, Mastercard and Visa were sued by the European Union and forced to reduce debit card swipe fees by 0.2%. In 2012, Mastercard and Visa were sued for price-fixing fees and were forced to pay $6.25B in a settlement. The networks have been sued by the US, Europe, Australia, New Zealand, ATM operators, Intuit, Starbucks, Amazon, Walmart, and many more. Each time they have been forced to modify fees and practices to ensure competition. However, this has also reinforced their dominance as the biggest payment networks, which is why no competitors have been established since the creation of the networks in the 1970s. Also, leave it to the banks to establish a revenue source so good that it is almost entirely undefeatable by legislation. When, if ever, will Visa and Mastercard not be dominant payments companies?

Dig Deeper

  • American Banker: Big banks, Big Tech face-off over swipe fees

  • Stripe Sessions 2019 | The future of payments

  • China's growth cements UnionPay as world's largest card scheme

  • “The Day the Credit Card Was Born” by Joe Nocera (Washington Post)

  • Mine Safety Disclosure’s 2019 Visa Investment Case

  • FineMeValue’s Payments Overview

tags: Visa, Mastercard, American Express, Discover, Bank of America, Stripe, Marqeta, Adyen, Apple, Open-loop, Closed-loop, Snowflake, AWS, Nike, BNPL, Andreessen Horowitz, Angela Strange, Slack, Google Cloud, UnionPay, BigCommerce, Jamie Dimon, Dodd-Frank, Durbin Amendment, JP Morgan Chase, Debit Cards, Credit Cards, Chime, Branch, Earnin', US Department of Justice, Intuit, Starbucks, Amazon, Walmart
categories: Non-Fiction
 

February 2021 - Rise of the Data Cloud by Frank Slootman and Steve Hamm

This month we read a new book by the CEO of Snowflake and author of our November 2020 book, Tape Sucks. The book covers Snowflake’s founding, products, strategy, industry-specific solutions, and partnerships. Although the content is somewhat interesting, it reads more like a marketing book than a genuinely useful guide to cloud data warehousing. Nonetheless, it’s a solid quick read on the state of the data infrastructure ecosystem.

Tech Themes

  1. The Data Warehouse. A data warehouse is a type of database that is optimized for analytics. These optimizations mainly revolve around complex query performance, the ability to handle multiple data types, the ability to integrate data from different applications, and the ability to run fast queries across large data sets. In contrast to a normal database (like Postgres), a data warehouse is purpose-built for efficient retrieval of large data sets rather than the high-performance read/write transactions of a typical relational database. The industry began in the late 1970s and early 1980s, driven by work done by the “Father of Data Warehousing” Bill Inmon and early competitor Ralph Kimball, a former Xerox PARC designer. Kimball launched Red Brick Systems in 1986, and Inmon launched Prism Solutions in 1991, with its leading product the Prism Warehouse Manager. Prism went public in 1995 and was acquired by Ardent Software in 1998 for $42M, while Red Brick was acquired by Informix for ~$35M in 1998. In the background, a company called Teradata, formed in the late 1970s by researchers at Cal and employees from Citibank, was going through its own journey to the data warehouse. Teradata would IPO in 1987 and get acquired by NCR in 1991; NCR itself would get acquired by AT&T in 1991; NCR would then spin out of AT&T in 1997, and Teradata would spin out of NCR through an IPO in 2007. What a whirlwind of corporate acquisitions! Around that time, other new data warehouses were popping up on the scene, including Netezza (launched in 1999) and Vertica (2005). Netezza, Vertica, and Teradata were great solutions, but they ran highly efficient data warehouses on physical, on-premise hardware. The issue was that as data grew, it became really difficult to add more hardware boxes and to manage queries optimally across the disparate hardware. 
Snowflake wanted to leverage the unlimited storage and computing power of the cloud to allow for infinitely scalable data warehouses. This was an absolute game-changer as early customer Accordant Media described, “In the first five minutes, I was sold. Cloud-based. Storage separate from compute. Virtual warehouses that can go up and down. I said, ‘That’s what we want!’”

  2. Storage + Compute. Snowflake was launched in 2012 by Benoit Dageville (Oracle), Thierry Cruanes (Oracle) and Marcin Żukowski (Vectorwise). Mike Speiser and Sutter Hill Ventures provided the initial capital to fund the formation of the company. After numerous whiteboarding sessions, the technical founders decided to try something crazy: separating data storage from compute (processing power). This allowed Snowflake’s product to scale storage (i.e., add more boxes) and put tons of computing power behind very complex queries. What may have been limited by Vertica hardware was now possible with Snowflake. At this point, the cloud had only been around for about five years, and unlike today, there were only a few services offered by the main providers. The team took a huge risk to 1) bet on the long-term success of the public cloud providers and 2) try something that had never successfully been accomplished before. When they got it to work, it felt like magic. “One of the early customers was using a $20 million system to do behavioral analysis of online advertising results. Typically, one big analytics job would take about thirty days to complete. When they tried the same job on an early version of Snowflake’s data warehouse, it took just six minutes. After Mike learned about this, he said to himself: ‘Holy shit, we need to hire a lot of sales people. This product will sell itself.’” This idea was so crazy that not even Amazon (where Snowflake runs) thought of unbundling storage and compute when it built its cloud-native data warehouse, Redshift, in 2013. Funny enough, Amazon also sought to attract people away from Oracle, hence the name Red-Shift. It would take Amazon almost seven years to re-design its data warehouse to separate storage and compute in Redshift RA3, which launched in 2019. 
On top of these functional benefits, there is a massive gap between the cost of storage and the cost of compute, and separating the two made Snowflake a significantly more cost-competitive solution than traditional hardware systems.
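A rough sketch of why that gap matters. All unit prices and sizing assumptions below are hypothetical placeholders (not actual cloud or Snowflake pricing); the point is only the shape of the comparison:

```python
# Hypothetical unit prices for illustration only - not real cloud list prices.
STORAGE_PER_TB_MONTH = 25.0     # object storage is cheap and always-on
NODE_PER_HOUR = 2.0             # compute is expensive but can be switched off
HOURS_PER_MONTH = 720

data_tb = 100                   # data kept resident all month
active_node_hours = 4 * 8 * 30  # a 4-node warehouse running 8 hours/day

# Decoupled (Snowflake-style): pay for storage continuously,
# but pay for compute only while queries are running.
decoupled = data_tb * STORAGE_PER_TB_MONTH + active_node_hours * NODE_PER_HOUR

# Coupled appliance-style: the nodes hold the data, so every node must
# run 24/7 just to keep the data online (assume 10 TB per node).
nodes_needed = data_tb // 10
coupled = nodes_needed * HOURS_PER_MONTH * NODE_PER_HOUR

print(f"decoupled: ${decoupled:,.0f}/mo  vs  coupled: ${coupled:,.0f}/mo")
```

Under these made-up numbers the decoupled design costs a fraction of the always-on appliance, because idle compute simply isn't billed; scaling the dataset only scales the cheap line item.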

  3. The Battle for Data Pipelines. A typical data pipeline (shown below) consists of pulling data from many sources, performing ETL/ELT (extract-transform-load, or extract-load-transform), centralizing it in a data warehouse or data lake, and connecting that data to visualization tools like Tableau or Looker. All parts of this data stack are facing intense competition. On the ETL/ELT side, you have companies like Fivetran and Matillion, and on the data warehouse/data lake side you have Snowflake and Databricks. Fivetran focuses on the extract and load portion, providing a data integration tool that connects to all of your operational systems (Salesforce, Zendesk, Workday, etc.) and pulls them together in Snowflake for comprehensive analysis. Matillion is similar, except it connects to your systems, imports raw data into Snowflake, and then transforms it (checking for NULLs, ensuring matching records, removing blanks) in your Snowflake data warehouse. Matillion thus focuses on the load and transform steps, while Fivetran focuses on the extract and load portions and leverages dbt (data build tool) to do transformations. The data warehouse vs. data lake debate is a complex and highly technical discussion, but it mainly comes down to Databricks vs. Snowflake. Databricks is primarily a machine learning platform that allows you to run Apache Spark (an open-source ML framework) at scale. Databricks’s main product, Delta Lake, allows you to store all data types, structured and unstructured, for real-time and complex analytical processes. As Datagrom points out here, the platforms come down to three differences: data structure, data ownership, and use case versatility. Snowflake requires structured or semi-structured data prior to running a query, while Databricks does not. 
Similarly, while Snowflake decouples data storage from compute, it does not decouple data ownership: Snowflake maintains all of your data, whereas you can run Databricks on top of any data source you have, whether on-premise or in the cloud. Lastly, Databricks acts more as a processing layer (able to work in code like Python as well as SQL), while Snowflake acts as a query and storage layer (mainly driven by SQL). Snowflake performs best with business intelligence querying, while Databricks performs best with data science and machine learning. Both platforms can be used by the same organizations, and I expect both to be massive companies (Databricks recently raised at a $28B valuation!). All of these tools are blending together and competing against each other: Databricks just launched a new Lakehouse (data lake + data warehouse - I know the name is hilarious) and Snowflake is leaning heavily into its data lake. We will see who wins!
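The extract-load-transform flow described above can be sketched in a few lines. This is a toy stand-in, with sqlite3 playing the warehouse and the “extracted” rows hard-coded; real tools like Fivetran, Matillion, and dbt run these steps against live systems, so nothing here reflects their actual APIs:

```python
import sqlite3

def extract():
    # Extract: pull raw records from an operational system (fake CRM rows here)
    return [
        {"account": "Acme", "arr": "1200", "region": "east"},
        {"account": "Globex", "arr": None, "region": "west"},  # a dirty row
    ]

def load(conn, rows):
    # Load: land the raw data in the warehouse untransformed
    conn.execute("CREATE TABLE raw_accounts (account TEXT, arr TEXT, region TEXT)")
    conn.executemany("INSERT INTO raw_accounts VALUES (?, ?, ?)",
                     [(r["account"], r["arr"], r["region"]) for r in rows])

def transform(conn):
    # Transform: clean NULLs and cast types inside the warehouse (the "T" in ELT)
    conn.execute("""
        CREATE TABLE accounts AS
        SELECT account, CAST(COALESCE(arr, '0') AS INTEGER) AS arr, region
        FROM raw_accounts
    """)

conn = sqlite3.connect(":memory:")
load(conn, extract())
transform(conn)
print(conn.execute("SELECT account, arr FROM accounts ORDER BY account").fetchall())
```

Note where the transform runs: inside the warehouse, after loading raw data. That ordering is exactly what distinguishes ELT (Fivetran + dbt, Matillion) from classic ETL, where data is cleaned before it ever reaches the warehouse.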

An interesting data platform battle is brewing that will play out over the next 5-10 years: The Data Warehouse vs the Data Lakehouse, and the race to create the data cloud

Who's the biggest threat to @snowflake? I think it's @databricks, not AWS Redshift https://t.co/R2b77XPXB7

— Jamin Ball (@jaminball) January 26, 2021

Business Themes

  1. Marketing Customers. This book, at its core, is a marketing document. Sure, it gives a nice story of how the company was built, the insights of its founding team, and some obstacles they overcame. But the majority of the book is just an “imagine what you could do with data” exploration across a variety of industries and use cases. It’s not good or bad, but it’s an interesting way of marketing, that’s for sure. It’s annoying that the authors spent so little time on the technology and actual company building. Our May 2019 book, The Everything Store, about Jeff Bezos and Amazon, was perfect because it covered all of the decision making and challenging moments required to build a long-term company. This book just talks about customer and partner use cases over and over. Slootman’s section is only about 20 pages, and five of them cover case studies from Square, Walmart, Capital One, Fair, and Blackboard. I suspect the thinness may be due to the controversial ousting of long-time CEO Bob Muglia in favor of Frank Slootman, co-author of this book. As this Forbes article noted: “Just one problem: No one told Muglia until the day the company announced the coup. Speaking publicly about his departure for the first time, Muglia tells Forbes that it took him months to get over the shock.” One day we will hear the actual unfiltered story of Snowflake, and it will make for an interesting comparison to this book.

  2. Timing & Building. We often forget how important timing is in startups. Being the right investor or company at the right time can do a lot to drive unbelievable returns. Consider Don Valentine at Sequoia in the early 1970s. We know that venture capital fund performance persists, in part due to the incredible branding that firms like Sequoia have built up over years and years (obviously reinforced by top-notch talents like Mike Moritz and Doug Leone). Don was a great investor and took significant risks on unproven individuals like Steve Jobs (Apple), Nolan Bushnell (Atari), and Trip Hawkins (EA). But he also had unfettered access to the birth of an entirely new ecosystem and knowledge of how that ecosystem would change business, built up from his years at Fairchild Semiconductor. Don was a unique person who capitalized on that incredible knowledge base, effectively creating the VC industry. Sequoia is a top firm because he was in the right place at the right time with the right knowledge. Now let’s cover some companies that weren’t: Cloudera, Hortonworks, and MapR. In 2005, Yahoo engineers Doug Cutting and Mike Cafarella, inspired by the Google File System paper, created Hadoop, a distributed storage and processing framework for handling data at a scale never seen before. Hadoop spawned companies like Cloudera, Hortonworks, and MapR that were built to commercialize the open-source project. All of them came out of the gate fast with big funding: Cloudera raised $1B at a $4B valuation prior to its 2017 IPO, Hortonworks raised $260M at a $1B valuation prior to its 2014 IPO, and MapR raised $300M before it was acquired by HPE in 2019. The companies all had one problem in common, however: they were on-premise products built before the cloud gained traction. That meant it required significant internal expertise and resources to run Cloudera, Hortonworks, and MapR software. 
In 2018, Cloudera and Hortonworks merged (at a $5B valuation) because competitive pressure from the cloud was eroding both of their businesses. MapR was quietly acquired for less than it raised. Today Cloudera trades at a $5B valuation, meaning no shareholder return since the merger, and the business has only recently become slightly profitable at its current low growth rate. This cautionary case study shows how important timing is and how difficult it is to build a lasting company in the data infrastructure world. As the new analytics stack is built with Fivetran, Matillion, dbt, Snowflake, and Databricks, it will be interesting to see which companies exist 10 years from now. It’s probable that some new technology will come along and hurt every company in the stack, but for now the coast is clear - the scariest time for any of these companies.

  3. Burn Baby Burn. Snowflake burns A LOT of money. In the nine months ended October 31, 2020, Snowflake burned $343M, including $169M in its third quarter alone. Why would Snowflake burn so much money? Because it is growing efficiently! What does efficient growth mean? As we discussed with the last Frank Slootman book, sales and marketing efficiency is a key hallmark of the quality of growth a company is experiencing. According to their filings, Snowflake added ~$230M of revenue and spent $325M in sales and marketing. This is actually not terribly efficient: it implies a dollar invested in sales and marketing yielded $0.70 of incremental revenue. While you would like this number to be closer to 1x (i.e., $1 in S&M yields $1 in revenue, hence a repeatable go-to-market motion), it is not terrible. ServiceNow (Slootman’s old company) actually operates less efficiently: for every dollar it invests in sales and marketing, it generates only $0.55 of subscription revenue. Crowdstrike, on the other hand, operates a partner-driven go-to-market, which enables it to generate more while spending less, creating $0.90 for every dollar invested in sales and marketing over the last nine months. However, there is a key thing that distinguishes the data warehouse from these other businesses, and Ben Thompson at Stratechery nails it here: “Think about this in the context of Snowflake’s business: the entire concept of a data warehouse is that it contains nearly all of a company’s data, which (1) it has to be sold to the highest levels of the company, because you will only get the full benefit if everyone in the company is contributing their data and (2) once the data is in the data warehouse it will be exceptionally difficult and expensive to move it somewhere else. Both of these suggest that Snowflake should spend more on sales and marketing, not less. Selling to the executive suite is inherently more expensive than a bottoms-up approach. 
Data warehouses have inherently large lifetime values given the fact that the data, once imported, isn’t going anywhere.” I hope Snowflake burns more money in the future and builds a sustainable long-term business.
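The efficiency math above is simple enough to check directly. The figures are the approximate ones cited in this section ($M for Snowflake; ServiceNow and Crowdstrike are expressed as per-dollar ratios, per the text):

```python
def sm_efficiency(incremental_revenue, sm_spend):
    """Incremental revenue generated per dollar of sales & marketing spend."""
    return incremental_revenue / sm_spend

# Approximate figures cited above ($M for Snowflake; ratio-only for the others)
companies = {
    "Snowflake": (230, 325),
    "ServiceNow": (0.55, 1.00),
    "Crowdstrike": (0.90, 1.00),
}

for name, (added_revenue, spend) in companies.items():
    print(f"{name}: ${sm_efficiency(added_revenue, spend):.2f} "
          "of incremental revenue per $1 of S&M")
```

Snowflake’s ratio works out to roughly $0.71, in line with the ~$0.70 cited above; the ordering (Crowdstrike > Snowflake > ServiceNow) is what the paragraph argues.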

Dig Deeper

  • Early YouTube Videos Describing Snowflake’s Architecture and Re-inventing the Data Warehouse

  • NCR’s spinoff of Teradata in 2007

  • Fraser Harris of Fivetran and Tristan Handy of dbt speak at the Modern Data Stack Conference

  • Don Valentine, Sequoia Capital: "Target Big Markets" - A discussion at Stanford

  • The Mike Speiser Incubation Playbook (an essay by Kevin Kwok)

tags: Snowflake, Data Warehouse, Oracle, Vertica, Netezza, IBM, Databricks, Apache Spark, Open Source, Fivetran, Matillion, dbt, Data Lake, Sequoia, ServiceNow, Crowdstrike, Cloudera, Hortonworks, MapR, BigQuery, Frank Slootman, Teradata, Xerox, Informix, NCR, AT&T, Benoit Dageville, Mike Speiser, Sutter Hill Ventures, Redshift, Amazon, ETL, Hadoop, SQL
categories: Non-Fiction
 

January 2021 - Technological Revolutions and Financial Capital: The Dynamics of Bubbles and Golden Ages by Carlota Perez

This month we read Carlota Perez’s understudied book covering the history of technological breakthroughs and revolutions. The book marries the roles of financial capital and technological breakthroughs seamlessly in an easy-to-digest narrative style.

Tech Themes

  1. The 5 Technology Revolutions. Perez identifies five major technological revolutions: The Industrial Revolution (1771-1829), The Age of Steam and Railways (1829-1873), The Age of Steel, Electricity and Heavy Engineering (1875-1918), The Age of Oil, the Automobile and Mass Production (1908-1974), and The Age of Information and Telecommunications (1971-Today). Looking back at these individual revolutions, one recognizes how powerful it is to view the world and technology in these incredibly long waves. Many of these periods lasted for over fifty years while their geographic dispersion and economic effects fully came to fruition. These new technologies fundamentally alter society, and when it becomes clear that the revolution is happening, many people jump on the bandwagon. As Perez puts it, “The great clusters of talent come forth after the revolution is visible and because it is visible.” Each revolution produces a myriad of changes in society. The industrial revolution popularized factory production, railways created national markets, electricity created the power to build steel buildings, oil and cars created mass markets and assembly lines, and the microprocessor and internet created amazing companies like Amazon and Airbnb.

  2. The Phases of Technology Revolution. After a decently long gestation period during which the old revolution permeates across the world, the new revolution normally starts with a big bang: some discovery or breakthrough (like the transistor or the steam engine) that fundamentally pushes society into a new wave of innovation. Coupled with these big bangs is re-purposed infrastructure from prior eras. As an example, telegraph and phone wires were strung along the initial railways, which offered significant stretches of uninterrupted space to build on. Another example is electricity: initially, homes were wired to serve lightbulbs, and it was only many years later that great home appliances came into use. This initial period of application discovery is called the Irruption phase. The increasing interest in forming businesses then causes a Frenzy period, like the Railway Mania or the Dot-com Boom, where everyone thinks they can get rich quick by starting a business around the new revolution. As the first 20-30 years of a revolution play out, a strong divide grows between those who were part of the revolution and those who were not; there is an economic, social, and regulatory mismatch between the old guard and the new revolution. After an uprising (like the populism we have seen recently) and a bubble collapse (check your crystal ball), regulatory changes typically foster a harmonious future for the technology. Following these changes, we enter the Synergy phase, where technology can fully flourish thanks to accommodating and clear regulation. This Synergy phase propagates outward across all countries until even the lagging adopters have started the adoption process. At this point the cycle enters Maturity, waiting for the next big advance to start the whole process over again.

  3. Where are we in the cycle today? We tweeted at Carlota Perez to answer this question AND SHE RESPONDED! My question to Perez was: with the recent wave of massive, transformational innovation like the public cloud providers and the iPhone, are we still in the Age of Information? These technological waves often last 50-60 years, and yet we’ve arguably been in the same age for quite a while. This wave started in 1971, exactly 50 years ago, with Intel and the creation of the microprocessor. Are we in the Frenzy phase, with record amounts of investment capital, enormous demand for early-stage companies, and new financial innovations like Affirm’s debt securitizations? Or have we not gotten to the Frenzy phase yet? Is the public cloud or the iPhone the start of a new big bang, giving us overlapping revolutions for the first time ever? Obviously, identifying the truly breakthrough moments in technology history is much easier after the fact, so maybe it is too early to know what really is a seminal moment. Perez’s answer, though only a few words, fully addresses the question. Perez suggests we are still in the installation phase (Irruption and Frenzy) of the new technology, and that makes a lot of sense. Sure, internet usage is incredibly high in the US (96%), but not in other large countries. China (the world’s largest country by population) has only 63% of its population using the internet, and India (the world’s second-largest) has only 55%. Ethiopia, with a population of over 100M people, has only 18% using the internet. There is still a lot of runway left for the internet to bloom! In addition, only recently have people been equipped with a powerful computing device that fits in their pocket, and low-priced phones are now making their way to all parts of the world, led by firms like Chinese giant Transsion. 
On top of the fact that this revolution is not fully installed, there is the rise of populism, a political movement that seeks to mobilize ordinary people who feel disregarded by the elite. Populism has reared its ugly head across many nations: the US (Donald Trump), the UK (Brexit), Brazil (Bolsonaro), and many other countries. The rise of populism is fueled by the growing dichotomy between the elites who have benefitted socially and monetarily from the revolution and those who have not. In the 1890s, anti-railroad sentiment drove the creation of the Populist Party. More recently, people have become angry at tech giants (Facebook, Google, Amazon, Apple, Twitter) for unfair labor practices, psychological manipulation, and monopolistic tendencies. The recent movie The Social Dilemma, which suggests a more humane and regulation-focused approach to social media, speaks to the need for regulation of these massive companies. It is also incredibly ironic to watch a movie about how social media manipulates its users while streaming a movie recommended to me on Netflix, a company that has popularized incessant binge-watching through UX manipulation not dissimilar to Facebook’s and Google’s tactics. I expect these companies to get regulated soon, and I hope that once that happens, we enter into the Synergy phase, with growth and value accruing to all people.

Yes, I do. I will find the time to reply to you properly. But just quickly, I think installation was prolonged by QE & casino finance; we are at the turning point (the successful rise of populism is a sign) and maybe post-Covid we'll go into synergy.

— Carlota Perez (@CarlotaPrzPerez) January 17, 2021

Business Themes

  1. The role of Financial Capital in Revolutions. As new technology revolutions play out, financial capital appears right alongside technology developments, ready to mold the revolution into the phases suggested by Perez. In the Irruption phase, as new technology is taking hold, financial capital that had been on the sidelines waiting out the Maturity phase of the previous revolution plows into new company formation and ideas. The financial sector tries to adopt the new technology as soon as possible (we are already seeing this with quantum computing) so it can then tout the benefits to everyone it talks to, setting the stage for increasing financing opportunities. Eventually, demand for financing company creation goes crazy, and you enter a Frenzy phase. During this phase, there is a discrepancy between the value of financial capital and production capital, the money used by companies to create actual products and services. Financial capital believes in unrealistic returns on investment, funding projects that don’t make any sense. Perez notes: “In relation to the canal Mania of the 1790s, disorder and lack of coordination prevailed in investment decisions. Canals were built ‘with different widths and depths and much inefficient routing.’ According to Dan Roberts at the Financial Times, in 2001 it was estimated that only 1 to 2 percent of the fiber optic cable buried under Europe and the United States had so far been turned on.” These Frenzy phases create bubbles and further ingrain regulatory mismatch and political divide. Could we be in one now, with deals getting priced at 125x revenue for tiny companies? After the institutional reckoning, the technology revolution enters the Synergy phase, where production capital earns really strong returns on investment: the path of technology is somewhat known, and real gains are to be made by continuing investment (especially at more reasonable asset prices). 
Production capital continues to go to good use until the technology revolution fully plays itself out, entering into the Maturity phase.

  2. Casino Finance and Prolonging Bubbles. One point Perez makes in her tweet is that this current bubble has been prolonged by QE and casino finance. Quantitative easing is a monetary policy in which the Federal Reserve (the US central bank) buys government bonds issued by the Treasury Department to inject money into the financial ecosystem. The Federal Reserve can also purchase bank loans and other assets, offering more liquidity to the financial system. This process is used to create low interest rates, which push individuals and corporations to invest their money because the rate of interest on savings accounts is extremely low. Following the financial crisis, and more recently COVID-19, the Federal Reserve lowered interest rates and started quantitative easing to help the hurting economy. In Perez’s view, these actions have prolonged the Irruption and Frenzy phases because they force more money into investment opportunities. On top of quantitative easing, governments have allowed so-called casino capitalism, letting free-market ideals shape governmental policies (like Reagan’s economic plan). Unfettered free markets are in theory economically efficient but can give rise to bad actors, like Enron’s manipulation of California’s energy markets after deregulation. With continual quantitative easing and deregulation, speculative markets, like the market for collateralized debt obligations before the financial crisis, are allowed to grow. This creates a risk-taking environment that can only end in a frenzy and bubble.

  3. Synergy Phase and Productive Capital Allocation. Capital allocation has been called the most important part of being a great investor and business leader. Think about being the CEO of Coca-Cola for a second: you have thousands of competing projects vying for budget. How do you determine which ones get the most money? In the investing world, capital allocation is measured by conviction. As George Soros’s famous quote goes: “It's not whether you're right or wrong, but how much money you make when you're right and how much you lose when you're wrong.” Clayton Christensen took the ideas of capital allocation and applied them to life investments, coming to the conclusion: “Investments in relationships with friends and family need to be made long, long before you’ll see any sign that they are paying off. If you defer investing your time and energy until you see that you need to, chances are it will already be too late.” Capital and time allocation are underappreciated concepts because they often seem abstract against the everyday humdrum of life. It is interesting to think about capital allocation within Perez’s long-term framework. The obvious approach would be to identify the stage (Irruption, Frenzy, Synergy, Maturity) and make the appropriate time/money decisions: deploy capital into the Irruption phase, pull money out at the height of the Frenzy, buy as many companies as possible at the crash/turning point, hold through most of the Synergy, and sell at Maturity to identify the next Irruption phase. Although that would be fruitful, identifying market bottoms and tops is a fool’s errand. However, according to Perez, the best returns on capital investment typically happen during the Synergy phase, where production capital (money employed by firms through investment in R&D) reigns supreme. During this time, the revolutionary applications of recently frenzied technology finally start to bear fruit. 
They are typically poised to succeed thanks to an accommodating regulatory and social environment. Unsurprisingly, after the diabolical grifting financiers of the Frenzy phase are exposed (see Worldcom, the Great Financial Crisis, and Theranos), social pressure on regulators typically forces an agreement to fix the loopholes that allowed these manipulators to take advantage of the system. After Enron, the Sarbanes-Oxley Act increased disclosure requirements and oversight of auditors. After the GFC, the Dodd-Frank Act mandated bank stress tests and introduced financial stability oversight. With the problems of the Frenzy phase “fixed” for the time being, the social attitude toward innovation turns positive once again, and the returns to production capital start to outweigh those to financial capital, which is now reined in under the new rules. Suffice it to say, we are probably in the Frenzy phase in the technology world, with abundant capital chasing a limited set of venture opportunities and driving massive valuation increases for early-stage companies. This will change eventually, and as Warren Buffett says: “It’s only when the tide goes out that you learn who’s been swimming naked.” When the bubble does burst, regulation of big technology companies will usher in the best returns period for investors and companies alike.

Dig Deeper

  • The Financial Instability Hypothesis: Capitalist Processes and the Behavior of the Economy

  • Bubbles, Golden Ages, and Tech Revolutions - a Podcast with Carlota Perez

  • Jeff Bezos: The electricity metaphor (2007)

  • Where Does Growth Come From? Clayton Christensen | Talks at Google

  • A Spectral Analysis of World GDP Dynamics: Kondratieff Waves, Kuznets Swings, Juglar and Kitchin Cycles in Global Economic Development, and the 2008–2009 Economic Crisis

tags: Telegraph, Steam Engine, Steel, Transistor, Intel, Railway Mania, Dot-com Boom, Carlota Perez, Affirm, Irruption, Frenzy, Synergy, Maturity, iPhone, Apple, China, Ethiopia, Theranos, Populism, Twitter, Netflix, Warren Buffett, George Soros, Quantum Computing, QE, Reagan, Enron, Clayton Christensen, Worldcom
categories: Non-Fiction
 

November 2020 - Tape Sucks: Inside Data Domain, A Silicon Valley Growth Story by Frank Slootman

This month we read a short, under-discussed book by current Snowflake and former ServiceNow and Data Domain CEO, Frank Slootman. The book is just like Frank - direct and unafraid. Frank has had success several times in the startup world, and the story of Data Domain provides a great case study in entrepreneurship. Data Domain was a data deduplication company whose disk-based appliances offered a 20:1 reduction in backed-up data, displacing tape cassettes.

Tech Themes

Data Domain’s 2008 10-K prior to being acquired


  1. First time CEO at a Company with No Revenue. Frank is an immigrant to the US, coming from the Netherlands shortly after graduating from the University of Rotterdam. After being rejected by IBM 10+ times, he joined Burroughs Corporation, an early mainframe provider that subsequently merged with its direct competitor Sperry for $4.8B in 1986. Frank then spent some time at Compuware and moved back to the Netherlands to help it integrate the acquisition of Uniface, an early customizable report-building software. After spending time there, he went to Borland Software in 1997, working his way up the product management ranks, all the while angered by time spent lobbying internally rather than building. Frank joined Data Domain in the spring of 2003 - when it had no customers, no revenue, and was burning cash. The initial team and VCs were impressive - Kai Li, a computer science professor on sabbatical from Princeton; Ben Zhu, an EIR at USVP; and Brian Biles, a product leader with experience at VA Linux and Sun Microsystems. The company was financed by top-tier VCs New Enterprise Associates and Greylock Partners, with Aneel Bhusri (founder and current CEO of Workday) serving as initial CEO and then board chairman. This was a stacked team and Slootman knew it: “I’d bring down the average IQ of the company by joining, which felt right to me.” The company had been around for 18 months and had already burned through a significant amount of money when Frank joined. He knew he needed to raise money relatively soon after joining and put the company’s chances bluntly: “Would this idea really come together and captivate customers? Nobody knew. We, the people on the ground floor, were perhaps, the most surprised by the extraordinary success we enjoyed.”

  2. Playing to his Strengths: Capital Efficiency. One of the big takeaways from The Innovators by Walter Isaacson was that individuals and teams at the nexus of disciplines - primarily where the sciences meet the humanities - often achieve breakthrough success. The classic case study is Apple: Steve Jobs had an intense love of art, music, and design, and Steve Wozniak was an amazing technologist. Frank has cultivated a cross-discipline strength at the intersection of sales and technology. This might be driven by Slootman’s background in economics; the book has several references to economic terms, which clearly have had an impact on Frank’s thinking. Data Domain espoused capital efficiency: “We traveled alone, made few many-legged sales calls, and booked cheap flights and hotels: everybody tried to save a dime for the company.” The results showed - the business went from $800K of revenue in 2004 to $275 million by 2008, generating $75M in cash flow from operations. Frank’s capital efficiency was interesting and broke from traditional thinking - most people think to raise a round and build something. Frank took a different approach: “When you are not yet generating revenue, conservation of resource is the dominant theme.” Over time, “when your sales activity is solidly paying for itself,” the spending should shift from conservative to aggressive (as Snowflake is doing now). The concept of sales efficiency is sometimes talked about but, given the recent fundraising environment, often dismissed. Sales efficiency can be thought of as: “How much revenue do I generate for every $1 spent in sales and marketing?” Looking at the P&L below, we see Data Domain was highly efficient in its sales and marketing activity - the company increased revenue by $150M in 2008 while spending $115M on sales and marketing (a ratio of 1.3x). Contrast this with a company like Slack, which spent $403M to acquire $230M of new revenue (a ratio of 0.6x).
Customer acquisition gets harder at scale, so this efficiency naturally declines over time, but best-in-class companies stay above 1x. Frank clearly understands when to step on the gas with investing, as both ServiceNow and Snowflake have remained fairly efficient (from a sales perspective, at least) while growing to significant scale.
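The sales efficiency ratio described above is simple enough to compute directly. A minimal sketch in Python, using the approximate figures quoted in the text (illustrative numbers, not audited financials):

```python
# Sales efficiency: new revenue generated per $1 of sales & marketing spend.

def sales_efficiency(new_revenue_m: float, sales_marketing_m: float) -> float:
    """Return new revenue added per dollar of S&M spend."""
    return new_revenue_m / sales_marketing_m

# Data Domain (2008): ~$150M of new revenue on ~$115M of S&M spend.
data_domain = sales_efficiency(150, 115)

# Slack: ~$230M of new revenue on ~$403M of S&M spend.
slack = sales_efficiency(230, 403)

print(f"Data Domain: {data_domain:.1f}x, Slack: {slack:.1f}x")
```

A ratio above 1x means each sales dollar pays for itself within the year it is spent, which is why it signals when to shift from conservative to aggressive spending.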

  3. Technology for Technology’s Sake. “Many technologies are conceived without a clear, precise notion of the intended use.” Slootman hits on a key point, one the tech industry has struggled to grasp throughout its history: so many products and companies are established around budding technology with no use case. We’ve discussed Magic Leap’s fundraising money-pit (it still might find its way) and Iridium Communications, whose early satellite phones were so bulky that users practically needed a suitcase to carry them. Gartner, the leading IT research firm (which is heavily influenced by marketing spend from the companies it covers), established the Technology Hype Cycle, complete with the “Peak of Inflated Expectations” and the “Trough of Disillusionment,” for categorizing technologies that fail to live up to their promise. Several waves have come and gone: AR/VR, blockchain, and most recently, serverless. It’s not so much that these technologies were wrong or not useful; it’s rather that they were initially described as a panacea for several or all known technology hindrances, and few technologies ever live up to that hype. It’s common for new innovations to spur tons of development but also lots of failure, and this is Slootman’s caution to entrepreneurs. Data Domain was attacking a problem that already existed (tape storage), and the company provided what Clayton Christensen would call a sustaining innovation (something Slootman points out himself). Whenever things go into a “winter state” - like the internet after the dot-com bubble, or the recent Crypto Winter, which is thawing as I write - it is time to pay attention and understand the relevance of the innovation.

Business Themes

  1. Importance of Owning Sales. Slootman spends a considerable amount of this small book discussing sales tactics and decision making, particularly with respect to direct sales and OEM relationships. OEM deals are partnerships whereby one company re-sells the software, hardware, or service of another company. Crowdstrike is a popular product with many OEM relationships; the company drives a significant amount of its sales through partners, who re-sell on its behalf. OEM partnerships with big companies present many challenges: “First of all, you get divorced from your customer because the OEM is now between you and them, making customer intimacy challenging. Plus, as the OEM becomes a large part of your business, for all intents and purposes they basically own you without paying for the privilege…Never forget that nobody wants to sell your product more than you do.” The challenges don’t end there. Slootman points out that EMC discarded its previous OEM vendor in the data deduplication space right after acquiring Data Domain. On top of that, the typical reseller relationship happens at a 10-20% margin, degrading gross margins and hurting the ability to invest. It is somewhat similar to the challenge open-source companies like MongoDB and Elastic have run into, with their core software being…free: Amazon can just OEM their offering and cut them out as a partner, something it does frequently. Partner models can be sustainable, but the give and take with the big company is a tough balance to strike. Investors like organic adoption, especially recently with the rise of freemium SaaS models percolating in startups. Slootman’s point is that, at some point, an enterprise-focused business must own direct sales (and relationships) with its customers to drive real efficiency.
After freemium adopters - acquired at low cost - buy the product, the executive team must pivot to traditional top-down enterprise sales to build a successful and enduring relationship with the customer.
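The gross-margin hit from a 10-20% reseller cut is easy to see with a quick sketch. The list price, cost, and 80% direct margin below are assumed figures for illustration, not numbers from the book:

```python
# Hedged illustration: how an OEM/reseller cut eats into software gross margin.

def gross_margin(list_price: float, cogs: float, reseller_cut: float = 0.0) -> float:
    """Gross margin after giving the reseller a share of the list price."""
    net_revenue = list_price * (1 - reseller_cut)
    return (net_revenue - cogs) / net_revenue

direct = gross_margin(100, 20)                      # sell direct at list price
via_oem = gross_margin(100, 20, reseller_cut=0.20)  # OEM keeps 20% of list

print(f"direct: {direct:.0%}, via OEM: {via_oem:.0%}")
```

Because the cost of goods is fixed while net revenue shrinks, the reseller's 20% cut costs more than 20% of the profit pool, which is the squeeze on the ability to invest that Slootman describes.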

  2. In the Thick of Things. Slootman has some very concise advice for CEOs: be a fighter, show some humanity, and check your ego at the door. “Running a startup reduces you to your most elementary instincts, and survival is on your mind most of the time…The CEO is the ‘Chief Combatant,’ warrior number one.” Slootman views the CEO as a fighter, ready to be the first to jump into the action at all times - and this can be incredibly productive for the business as well. Tony Xu, the founder and CEO of Doordash, takes time out every month to do deliveries for his own company in order to remain close to the customer and the problems of the company. Jeff Bezos famously still reads and responds to emails from customers at jeff@amazon.com. Being CEO also requires a willingness to put yourself out there and show your true personality. As Slootman puts it: “People can instantly finger a phony. Let them know who you really are, warts and all.” As CEO you are tasked with managing so many people and being involved in so many aspects of the business that it is easy to become rigid and unemotional in everyday interactions. Frances Frei, the Harvard Business School professor and former leader at Uber, distills it down to a simple phrase: “Begin With Trust.” All CEOs have some amount of ego driving them to want to be at the top of their organization. Slootman encourages CEOs to be introspective and try to recognize blind spots, so ego doesn’t drive day-to-day interactions with employees. One way to do that is simple: use the pronoun “we” when discussing the company you are leading. Though Slootman doesn’t explicitly call it out, all of these suggestions (fighting, showing empathy, getting rid of ego) are meant to build trust with employees.

  3. R-E-C-I-P-E for a Great Culture. The last fifth of the book is focused entirely on building culture at companies. It is the only topic Slootman stays on for more than a few chapters, so you know it’s important! RECIPE was an acronym created by the employees at Data Domain to describe the company’s values: Respect, Excellence, Customer, Integrity, Performance, Execution. It’s interesting how simple and focused these values are. Technology has pushed its cultural delusions of grandeur to an extreme in recent years. The WeWork S-1 hilariously started with: “We are a community company committed to maximum global impact. Our mission is to elevate the world’s consciousness.” But none of Data Domain’s values were about changing the world - they were about doing excellent, honest work for customers. Slootman is laser-focused on culture and specifically views culture as an asset, calling it: “The only enduring, sustainable form of differentiation. These days, we don’t have a monopoly for very long on talent, technology, capital, or any other asset; the one thing that is unique to us is how we choose to come together as a group of people, day in and day out. How many organizations are there that make more than a halfhearted attempt at this?” Technology companies have taken different routes in establishing culture: Google and Facebook have tried to create culture by showering employees with unbelievable benefits, Netflix has focused on pure execution and transparency, and Microsoft has revamped its culture by adopting a Growth Mindset (has it really, though?). Google originally promoted “Don’t be evil” as part of its Code of Conduct but dropped the motto in 2018.
Employees want to work for mission-driven organizations, but not all companies are really changing the world with their products, and Frank did not try to sugarcoat Data Domain’s data deduplication technology as a way to “elevate the world’s consciousness.” He created a culture driven by performance and execution - providing a useful product to businesses that needed it. The culture was so revered that, post-acquisition, EMC instituted Data Domain’s performance management system. Data Domain employees were looked at strangely by longtime EMC executives, who had spent years in a big, stale company. Culture is a hard thing to replicate and a hard thing to change, as we saw with The Innovator’s Dilemma. Might as well use it to help the company succeed!

Dig Deeper

  • How Data Domain Evolved in the Cloud World

  • Former Data Domain CEO Frank Slootman Gets His Old Band Back Together at ServiceNow

  • The Contentious Take-over Battle for Data Domain: Netapp vs. EMC

  • 2009 Interview with Frank Slootman After the Acquisition of Data Domain

tags: Snowflake, DoorDash, ServiceNow, WeWork, Data Domain, EMC, Netapp, Frank Slootman, Borland, IBM, Burroughs, Sperry, NEA, Greylock, Workday, Aneel Bhusri, Sun Microsystems, USVP, Uber, Netflix, Facebook, Google, Microsoft, Amazon, Jeff Bezos, Tony Xu, MongoDB, Elastic, Crowdstrike, Crypto, Gartner, Hype Cycle, Slack, Apple, Steve Jobs, Steve Wozniak, Magic Leap, batch2
categories: Non-Fiction
 

October 2020 - Working in Public: The Making and Maintenance of Open Source Software by Nadia Eghbal

This month we covered Nadia Eghbal’s instant classic about open-source software. Open-source software has been around since the late seventies, but only recently has it gained significant public and business attention.

Tech Themes

The four types of open source communities described in Working in Public


  1. Misunderstood Communities. Open source is frequently viewed as an overwhelmingly positive force for good - taking software and making it free for everyone to use. Many think of open source as community-driven, where everyone participates and contributes to making the software better. The theory is that many eyeballs on and contributors to the software improve security, improve reliability, and increase distribution. In reality, open-source communities follow the “90-9-1” rule and act more like social media than you might think. According to Wikipedia, the “90-9-1” rule states that for websites where users can both create and edit content, 1% of people create content, 9% edit or modify that content, and 90% view the content without contributing. To show how this applies to open-source communities, Eghbal cites a study by North Carolina State researchers: “One study found that in more than 85% of open source projects the research examined on Github, less than 5% of developers were responsible for 95% of code and social interactions.” These creators, contributors, and maintainers are developer influencers: “Each of these developers commands a large audience of people who follow them personally; they have the attention of thousands of developers.” Unlike Instagram and Twitch influencers, who often actively try to build their audiences, open-source developer influencers sometimes find the attention off-putting - they simply published something to help others and suddenly found themselves with actual influence. The challenging truth of open source is that core contributors and maintainers give significant amounts of their time and attention to their communities - often spending hours at a time responding to pull requests (requests for changes or new features) on GitHub. Evan Czaplicki’s insightful talk, “The Hard Parts of Open Source,” speaks to this challenging dynamic.
Evan created the open-source project Elm, a functional programming language that compiles to JavaScript, because he wanted to make functional programming more accessible to developers. As one of its core maintainers, he has repeatedly been hit with “Why don’t you just…” requests from non-contributing developers angrily asking why a feature wasn’t included in the latest release. As fastlane creator Felix Krause put it, “The bigger your project becomes, the harder it is to keep the innovation you had in the beginning of your project. Suddenly you have to consider hundreds of different use cases…Once you pass a few thousand active users, you’ll notice that helping your users takes more time than actually working on your project. People submit all kinds of issues, most of them aren’t actually issues, but feature requests or questions.” When you use open-source software, remember who is contributing to and maintaining it - and the days and years poured into the project with the sole goal of increasing its utility for the masses.
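The concentration the North Carolina State study describes is easy to see with a toy commit log. The author names and commit counts below are invented for illustration:

```python
# Toy illustration of contribution concentration in an open-source project:
# what share of commits comes from the top 5% of authors?
from collections import Counter
import math

commits = Counter({f"dev{i}": 1 for i in range(95)})  # 95 drive-by contributors
commits.update({"maintainer_a": 900, "maintainer_b": 500,
                "core_c": 300, "core_d": 200, "core_e": 100})

def top_share(commit_counts, top_fraction=0.05):
    """Share of total commits made by the top `top_fraction` of authors."""
    counts = sorted(commit_counts.values(), reverse=True)
    k = max(1, math.ceil(len(counts) * top_fraction))
    return sum(counts[:k]) / sum(counts)

print(f"top 5% of authors produce {top_share(commits):.0%} of commits")
```

With 100 authors, the top five maintainers account for 2,000 of the 2,095 commits - roughly the 5%-produce-95% skew the study found.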

  2. Git it? Git was created by Linus Torvalds in 2005. We talked about Torvalds last month; he also created the most famous open-source operating system, Linux. Git was born in response to a skirmish with Larry McVoy, the head of the proprietary tool BitKeeper, over the potential misuse of his product. Torvalds went on vacation for a week and hammered out the most dominant version control system today - git. Version control systems allow developers to work simultaneously on projects, committing changes to a shared branch of code. They also allow any change to be rolled back to an earlier version, which can be enormously helpful if a bug is found in the main branch. Git ushered in a new wave of version control, but the tool itself was somewhat difficult for untrained developers to use. Enter GitHub and GitLab - two companies built around the idea of making the git version control system easier for developers to use. GitHub came first, in 2007, offering a platform to host and share projects. The GitHub platform was free, but not open source - developers couldn’t build onto the hosting platform, only use it. GitLab started in 2014 to offer an alternative, fully open-sourced platform that allowed individuals to self-host a GitHub-like platform, providing improved security and control. Because of GitHub’s first-mover advantage, however, it has become the dominant platform upon which developers build: “Github is still by far the dominant market player: while it’s hard to find public numbers on GitLab’s adoption, its website claims more than 100,000 organizations use its product, whereas GitHub claims more than 2.9 million organizations.” Developers find GitHub incredibly easy to use, creating an enormous wave of open-source projects and code-sharing. The company added 10 million new users in 2019 alone - bringing the total to over 40 million worldwide. This growth prompted Microsoft to buy GitHub in 2018 for $7.5B.
We are in the early stages of this development explosion, and it will be interesting to see how increased code accessibility changes the world over the next ten years.
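Part of what makes git reliable is content-addressable storage: every object is named by the SHA-1 hash of its content, so identical content always gets the same id and any edit produces a new one. A minimal sketch of how git names a blob, using git's actual `blob <length>` header followed by a NUL byte:

```python
# A minimal sketch of git's content-addressable storage for blobs.
import hashlib

def git_blob_id(content: bytes) -> str:
    """Hash bytes the way `git hash-object` hashes a blob."""
    header = f"blob {len(content)}\0".encode()
    return hashlib.sha1(header + content).hexdigest()

v1 = git_blob_id(b"hello\n")
v2 = git_blob_id(b"hello\n")
v3 = git_blob_id(b"hello, world\n")

assert v1 == v2  # same content -> same object id
assert v1 != v3  # any edit -> a brand-new object
print(v1)
```

Commits and trees are hashed the same way, which is what lets git detect corruption and roll any change back to an earlier version.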

  3. Developing and Maintaining an Ecosystem Forever. Open-source communities are unique and complex, with different user and contributor dynamics. Eghbal segments open-source communities into four buckets - federations, clubs, stadiums, and toys - characterized in the two-by-two matrix above, based on contributor growth and user growth. Federations are the pinnacle of open-source software development - many contributors and many users, creating a vibrant ecosystem of innovative development. Clubs represent more niche and focused communities, including vertical-specific tools like the astronomy package Astropy. Stadiums are highly centralized but large communities - typically only a few contributors but a significant user base. It is up to these core contributors to lead the ecosystem, as opposed to decentralized federations, which have so many contributors they can go in all directions. Lastly, there are toys, which have low user growth and low contributor growth but may still be very useful projects. Interestingly, projects can shift in and out of these community types as they become more or less relevant. For example, developers from Yahoo open-sourced their Hadoop project, based on Google’s File System and MapReduce papers. The initial project slowly became huge, moving from a stadium to a federation, and formed subprojects around it, like Apache Spark. What’s interesting is that projects mature and change, and code can remain in production for years after the project’s day in the spotlight is gone. According to Eghbal, “Some of the oldest code ever written is still running in production today. Fortran, which was first developed in 1957 at IBM, is still widely used in aerospace, weather forecasting, and other computational industries.” These ecosystems can exist forever, but their costs (creation, distribution, and maintenance) are often hidden - especially the maintenance cost.
The cost of creation and distribution has dropped significantly in the past ten years - with many of the world’s developers all working in the same ecosystem on GitHub - but this has also increased the total cost of maintenance, and that cost can be significant. Bootstrap co-creator Jacob Thornton likens maintenance costs to caring for an old dog: “I’ve created endlessly more and more projects that have now turned [from puppies] into dogs. Almost every project I release will get 2,000, 3,000 watchers, which is enough to have this guilt, which is essentially like ‘I need to maintain this, I need to take care of this dog.’” Communities change from toys to clubs to stadiums to federations, but they may also change back as new tools are developed. Old projects still need to be maintained, and that maintenance comes down to committed developers.
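Eghbal's four community types form a two-by-two matrix, which can be written down directly as a lookup table. The "high"/"low" labels are a simplification for illustration:

```python
# Eghbal's 2x2: contributor growth x user growth -> community type.

def community_type(contributor_growth: str, user_growth: str) -> str:
    """Map (contributor growth, user growth) to a community type."""
    matrix = {
        ("high", "high"): "federation",  # e.g. Hadoop at its peak
        ("high", "low"):  "club",        # niche tools like Astropy
        ("low", "high"):  "stadium",     # few maintainers, huge audience
        ("low", "low"):   "toy",         # small, but possibly still useful
    }
    return matrix[(contributor_growth, user_growth)]

print(community_type("low", "high"))  # a stadium: the hardest to sustain
```

The stadium quadrant is where maintainer burnout concentrates: user growth keeps generating issues and pull requests while the contributor base stays tiny.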

Business Themes

  1. Revenue Model Matching. One of the earliest code-hosting platforms was SourceForge, founded in 1999. The company pioneered the idea of code hosting - letting developers publish their code for easy download. It became famous for letting open-source developers use the platform free of charge. SourceForge was created by VA Software, an internet-bubble darling that saw its stock price decimated when the bubble finally burst. The challenge with scaling SourceForge was a revenue model mismatch: VA Software made money with paid advertising, which allowed it to offer its tools to developers for free but meant its revenue was highly variable. When the company went public, it was still a small and unproven business, posting $17M in revenue against $31M in costs. The revenue model mismatch is starting to rear its head again, with traditional software-as-a-service (SaaS) recurring subscription models catching some heat. Many cloud service and API companies are pricing by usage rather than a fixed, high-margin subscription fee. This is the classic electric utility model - you only pay for what you use. Snowflake CEO Frank Slootman (who formerly ran SaaS pioneer ServiceNow) commented: “I also did not like SaaS that much as a business model, felt it not equitable for customers.” Snowflake instead charges based on credits, which pay for usage. The issue with usage-based billing has traditionally been price transparency, which can be obscured by customer credit systems and hard-to-calculate pricing, as with Amazon Web Services. This revenue model mismatch was just one problem for SourceForge: as git became the dominant version control system, SourceForge was reluctant to support it, opting for its traditional tools instead. Pricing norms change and new technology comes out every day; it’s imperative that businesses have a strong grasp of the value they provide and align their revenue model with their customers so that a fair trade-off is created.
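The difference between the two models comes down to how the bill scales with usage. A hedged sketch, with all prices and credit rates invented for illustration:

```python
# Flat subscription vs credit-based usage billing (utility-style).

FLAT_SUBSCRIPTION = 2_000.0  # $/month, regardless of usage (assumed figure)
PRICE_PER_CREDIT = 3.0       # $ per compute credit consumed (assumed figure)

def usage_bill(credits_consumed: float) -> float:
    """Pay only for what you use, like an electric utility."""
    return credits_consumed * PRICE_PER_CREDIT

for credits in (100, 500, 1_000):
    usage = usage_bill(credits)
    cheaper = "usage" if usage < FLAT_SUBSCRIPTION else "subscription"
    print(f"{credits:>5} credits -> ${usage:,.0f} vs flat ${FLAT_SUBSCRIPTION:,.0f} ({cheaper} wins)")
```

Light users clearly do better under usage pricing, which is Slootman's equity argument; the transparency problem is that the credit rate and what consumes a credit are harder to reason about than one flat fee.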

  2. Open Core Model. There has been enormous growth in open-source businesses in the past few years, which typically operate on an open core model. In the open core model, the company offers a free, normally feature-limited, version of its software alongside a proprietary enterprise version with additional features. Developers might adopt the free version but hit usage limits or feature constraints, causing them to purchase the paid version. The open-source “core” is often just that - freely available for anyone to download and modify; the core’s actual source code is normally published on GitHub, and developers can fork the project or do whatever they wish with that open core. The commercial product is normally closed source and not available for modification, giving the business a product to sell. Joseph Jacks, who runs Open Source Software (OSS) Capital, an investment firm focused on open source, describes four types of open core business models (pictured above), differing in how much of the software is open source. GitHub, interestingly, employs the “thick” model of being mostly proprietary, with only 10% of its software truly open-sourced. It’s funny that the site that hosts and facilitates the most open-source development is itself proprietary. Jacks nails the most important question in the open core model: “How much stays open vs. How much stays closed?” The consequences can be dire for a business - open source too much and, all of a sudden, other companies can quickly recreate your tool. Many DevOps tools have experienced the perils of open source, with some companies losing control of the projects they were supposed to facilitate. On the flip side, keeping more of the software closed source goes against the open-source ethos, which can be viewed as selling out.
The continuous delivery pipeline project Jenkins has struggled to satisfy its growing user base, leading the CEO of the Jenkins company, CloudBees, to publish a blog post entitled “Shifting Gears”: “But at the same time, the incremental, autonomous nature of our community made us demonstrably unable to solve certain kinds of problems. And after 10+ years, these unsolved problems are getting more pronounced, and they are taking a toll — segments of users correctly feel that the community doesn’t get them, because we have shown an inability to address some of their greatest difficulties in using Jenkins. And I know some of those problems, such as service instability, matter to all of us.” Striking this balance is incredibly tough, especially in a world of competing projects and finite development time and money. Furthermore, large companies like AWS are taking open core tools like Elastic and MongoDB and recreating them as proprietary services (Elasticsearch Service and DocumentDB), prompting those companies’ CEOs to lash out - understandably. Commercializing open-source software is a never-ending battle against proprietary players and yourself.
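The open core split Jacks describes can be sketched as a simple feature gate: core features ship in the free edition, enterprise features sit behind a paid license. The feature names and editions below are hypothetical, not any real product's:

```python
# Toy sketch of an open core split: community edition vs paid enterprise edition.

OPEN_CORE = {"storage_engine", "query_api", "cli"}          # freely available
ENTERPRISE_ONLY = {"sso", "audit_logs", "rbac"}             # paid, closed source

def available_features(edition: str) -> set:
    """Return the feature set shipped with a given edition."""
    if edition == "enterprise":
        return OPEN_CORE | ENTERPRISE_ONLY
    return set(OPEN_CORE)  # community edition gets the open core only

print(sorted(available_features("community")))
```

Jacks's "how much stays open" question is literally the question of which names go in which set: move too much into `OPEN_CORE` and competitors can rebuild the paid product; move too little and the community edition stops being a credible open-source project.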

  3. Compensation for Open Source. Eghbal characterizes two types of funders of open source - institutions (companies, governments, universities) and individuals (usually developers who are direct users). Companies like to fund improved code quality, influence, and access to core projects. The largest groups of contributors to open-source projects are corporations like Microsoft, Google, Red Hat, IBM, and Intel. These corporations are big enough and profitable enough to hire individual developers and let them strike a comfortable balance between time spent on commercial software and time spent on open source. This also functions as marketing for the big corporations; big companies like having influencer developers on the payroll to get the company’s name out into the ecosystem. Evan You, who authored Vue.js, a JavaScript framework, described company-backed open-source projects: “The thing about company-backed open-source projects is that in a lot of cases… they want to make it sort of an open standard for a certain industry, or sometimes they simply open-source it to serve as some sort of publicity improvement to help with recruiting… If this project no longer serves that purpose, then most companies will probably just cut it, or (in other terms) just give it to the community and let the community drive it.” In contrast to company-funded projects, developer-funded projects are often donation-based. With the rise of online payment tools like Stripe and Patreon, more and more funding is being directed to individual open-source developers. Unfortunately, though, it is still hard for many open-source developers to sustain their work on individual contributions, especially if they maintain multiple projects at the same time.
Open-source developer Sindre Sorhus explains: “It’s a lot harder to attract company sponsors when you maintain a lot of projects of varying sizes instead of just one large popular project like Babel, even if many of those projects are the backbone of the Node.js ecosystem.” Whether working in a company or as an individual developer, building and maintaining open-source software takes significant time and effort and rarely leads to significant monetary compensation.

Dig Deeper

  • List of Commercial Open Source Software Businesses by OSS Capital

  • How to Build an Open Source Business by Peter Levine (General Partner at Andreessen Horowitz)

  • The Mind Behind Linux (a talk by Linus Torvalds)

  • What is open source - a blog post by Red Hat

  • Why Open Source is Hard by PHP Developer Jose Diaz Gonzalez

  • The Complicated Economy of Open Source

tags: Github, Gitlab, Google, Twitch, Instagram, Elm, Javascript, Open Source, Git, Linus Torvalds, Linux, Microsoft, MapReduce, IBM, Fortran, Node, Vue, SourceForge, VA Software, Snowflake, Frank Slootman, ServiceNow, SaaS, AWS, DevOps, CloudBees, Jenkins, Intel, Red Hat, batch2
categories: Non-Fiction
 

September 2020 - Women of Color in Tech by Susanne Tedrick

This month we dove into Susanne Tedrick’s new book, Women of Color in Tech. Tedrick provides an excellent overview of the challenges many women of color face when trying to enter into and stay in the technology industry. The mix of real-world advice, personal experience, and industry stories combine to form a comprehensive resource for anyone in technology or looking to enter the field.

Tech Themes

  1. The Current State. Tedrick starts the book with uncomfortable statistics. Only 26% of computing roles are held by women; Black women hold 3% and Hispanic women hold 2% of computing roles. In addition, the trends aren’t positive - 26% is a 9% decrease since 1990. According to the Ascend Foundation, a Pan-Asian organization for business professionals, from 2007 to 2015, black women experienced a 13% decrease in professional roles in technology. While distressing, there are some green shoots, a 2012 paper by Heather Gonzalez and Jeffrey Kuenzi pointed out that science and engineering graduate program enrollments grew 65%, 55%, and 50% for Hispanic/Latino, American Indian/Alaska Native, and African American students, respectively. So why is this? Tedrick acknowledges that there is no one single answer, instead, its a combination of circumstances starting at early adolescence. Tedrick introduces the idea of “STEM Deserts” or areas where STEM education is not offered. These deserts disproportionally affect high poverty schools (schools where 75% or more of the students are eligible for free lunch and breakfast). Almost half of these schools contain large Black and Hispanic populations. Once women of color arrive at college it gets harder: “Coupling [student debt] with professor’s biases, a lack of meaningful support at home or within their community, and few to no peers with whom they can identify in their academic programs, many young women of color struggle to get through their programs.” For the few that conquer all of these challenges, the workplace introduces a whole new set of issues. 
Tedrick cites the Kapor Center’s Tech Leavers Study: “Thirty percent of women of color respondents claimed that they were passed over for promotions and 24% report being stereotyped.” According to a Harvard Business Review article by feminist legal scholar Joan Williams, “77% of black women report having to prove themselves over and over; their success discounted and their expertise questioned.” Compounded over a lifetime, these challenges make for an incredibly difficult journey for Black women in tech.

  2. Technical Roles and the Building Blocks of the Internet. Tedrick introduces many key organizational roles in technology, including business analysis, consulting, data science, information security, product management, project management, software development, technical sales, technical support, user experience design, and web design. After introducing each one, she provides a prescriptive guide for individuals looking to learn more - hitting on key skills, educational requirements, and the latest trends. While I can’t cover every role here, one underappreciated position / sub-segment of technology Tedrick discusses is computer networking. Ultimately, networking is what unlocked the internet for the masses. Protocols like TCP/IP, VoIP, and HTTP are crucial to a functioning internet; they give computers a consistent way to communicate with one another. IP (the Internet Protocol) provides basic addressing for computers, and TCP provides ordered, reliable delivery of bytes from one computer to another, transmitted in units called packets (a packet is a pre-defined standard for sending a chunk of data). VoIP (Voice over IP) builds on these protocols to carry voice and video signals as packets. HTTP is the way you request the data found at a location: http://techbookofthemonth.com tells the browser to fetch the website at that URL. A lot of basic networking functionality is baked into the operating system, which for most servers today is Linux. Linux is an open-source operating system that handles everything that makes a computer run: memory, CPU, connected devices, graphics, the desktop environment, and the ability to run applications. However, working with Linux is still not a commonly learned skill. Tedrick quotes Tameika Reed, a senior infrastructure engineer and founder of Women in Linux: “We have people who are getting degrees and PhDs and so on. . . . When it comes down to Linux, which runs in 90 percent of most companies, and it’s time to troubleshoot something, they don’t know how to troubleshoot the basics of the foundation. I look at Linux as the foundations of getting into tech.” Red Hat, which was acquired by IBM for $34 billion in 2019, offers an enterprise version of Linux that comes with support, guaranteed versioning, and additional security. While computer networking is not a flashy industry, it underpins so much of modern computing that it remains deeply interesting.
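To see these layers concretely, here is a minimal Python sketch (not from the book) of the request text an HTTP/1.1 client hands to TCP when fetching a page. TCP delivers the bytes reliably and in order; HTTP defines what they mean. The host name and the canned response below are purely illustrative.

```python
# Illustrative: the bytes an HTTP/1.1 client sends over a TCP connection.
def build_get_request(host: str, path: str = "/") -> bytes:
    lines = [
        f"GET {path} HTTP/1.1",   # method, resource, protocol version
        f"Host: {host}",          # which site we want (required in HTTP/1.1)
        "Connection: close",      # ask the server to close after responding
        "",                       # blank line terminates the header section
        "",
    ]
    return "\r\n".join(lines).encode("ascii")

def parse_status_line(response: bytes) -> int:
    # The first line of a response looks like: HTTP/1.1 200 OK
    status_line = response.split(b"\r\n", 1)[0].decode("ascii")
    return int(status_line.split(" ")[1])

request = build_get_request("techbookofthemonth.com")
print(request)  # the raw bytes TCP would carry to the server
print(parse_status_line(b"HTTP/1.1 200 OK\r\nContent-Length: 0\r\n\r\n"))
```

In a real client, these bytes would be written to a TCP socket and the response read back from it; libraries like `http.client` handle all of this for you.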

  3. Technology Skills. Chapter six lays out a great way to assess your own skills and understand where you need improvement. Building these skills can require additional schooling via college, trade schools, or massive open online courses (MOOCs) like Coursera, but other ways to complement this learning include hackathons, conferences, networking, and volunteering. Tedrick wanted to improve her own skills, so she volunteered to help set up a conference: “To improve my web design, WordPress, and conference organization skills, I volunteered my services for a leadership conference being held by IEEE Women in Engineering for four months in 2016. I helped to build and maintain the event website using WordPress, as well as helped people with registration and refunds. This experience greatly improved my understanding of web design, search engine optimization (SEO), event promotion, and collaborating with remote teams (I was based in Chicago, while much of the event team and registrants were based in and around Detroit, Michigan). In the process, I learned more about the different fields of engineering and broadened my network with incredible engineering students and professionals.” The book is incredibly helpful for skill-building - it gives you the exact things you need to learn to be successful in specific positions, and it even clears up some myths about the technology industry. One common myth is that “Tech Careers Require Constant, Hands-On Programming.” As evidenced by the myriad roles listed above, the technology industry involves so much more than programming. In addition, tech careers exist outside the big-name companies like Microsoft, Google, Facebook, Amazon, and Netflix, and even exist at non-tech companies too. One critical skill that Tedrick highlights for a number of different technical roles is communication. Communication is not often mentioned when discussing software engineering, but Tedrick picks up on its huge importance and the necessary ability to communicate with technical and non-technical audiences alike. On top of sharing with non-technical audiences, engineers need to know how to communicate accurate deadlines to managers and ask for help when unsure how to implement a challenging new feature. Communication is not just speaking; it’s also listening and empathetically understanding where others are coming from, to establish common ground and grow mutual understanding.

Business Themes

  1. Tedrick’s Story and Grit. Susanne’s personal stories appear throughout the book and perfectly complement the substantial amount of how-to information and advice. Chapter nine talks about the daily challenges of many women of color in tech and their lack of support to solve those challenges. Susanne’s own story is one of incredible determination and perseverance: “My mother had been diagnosed with a brain tumor when I was very young. This initial tumor led to more health issues for her over the years, including a decline into dementia, a loss of some of her short-term memory, and impacted mobility. The latter half of her life was spent in and out of hospitals, having numerous operations and medical incidents. My father was left to care for me and my sister, while also supporting several other family members in one house. Between work and caring for my mom, he couldn’t be around much, and fortunately, some nearby relatives and family friends helped to raise and care for us. As there was only one income (already too high to qualify for most public assistance programs) and my mother needed many medications, there were times where a choice had to be made between eating, having phone service, making critical house repairs, or having the lights stay on. This went on for nearly two decades, up until my mother’s death. It wasn’t until well into my adult life that I realized I was living in ‘survival mode’ and just trying to exist. I was spending most of my time trying to find happiness in my life; having a meaningful and engaging career was not an immediate goal or one I thought was achievable for me.” After working in administrative roles and taking on a couple of different jobs, she managed to attend Northwestern while continuing to work. “I used much of my vacation and holiday time from work not only to study but to attend conferences, interviews, boot camps, and the like. 
I did homework during lunch breaks or before the start of a full workday, only to go to class for several hours in the same evening.” Tedrick has risen to become an award-winning public speaker, author, and technologist at IBM (oh, and she’s also run a couple of marathons). Her story is truly inspirational!

  2. Culture, Intersectionality, and Bias. We’ve discussed Clayton Christensen’s Resources-Processes-Values framework before and how it impacts the discovery of emerging technologies. Often the processes create a culture and set of habitual routines that can be difficult to change. The culture of big technology has been anti-women for a long time. As Tedrick points out, women of color not only have to deal with this challenge but also repeated racial abuse, microaggressions, and tokenism. Kimberlé Crenshaw coined the term intersectionality for the idea that a person's social identities (e.g., gender, caste, sex, race, class, sexuality, religion, disability, physical appearance, height, etc.) combine to create unique modes of discrimination and privilege. Tedrick points out an example of this with Sheryl Sandberg’s famous book, Lean In. The book became a bestseller and made Sheryl Sandberg a household name (to those who didn’t already know her as COO of Facebook). However, as Tedrick points out: “The central problem with the book, which Sandberg herself later acknowledged, is that it assumed that the reader had certain privileges that many women of color do not have: completely supportive households that don’t require much of their time and attention, work cultures that allow expression of their thoughts without fear of being fired or held back, and access to career mentors to help them become stronger leaders. This lack of understanding of where the reader may be coming from and experiencing caused much of Sandberg’s advice to ring hollow for women of color.” The book ignores the structural challenges that many women of color face. Michelle Obama put it bluntly: “It’s not always enough to lean in, because that shit doesn’t work all the time.” When building culture at an organization, it’s super important to think about how that culture addresses each social identity at the company.
Furthermore, it’s not the responsibility of diverse individuals to build that culture. Tedrick sums it up well: “Addressing tokenism, much like addressing bias, unfortunately, is not something that you alone can address. It is also not our responsibility to address this. It is up to organizations and their leaders to correct and address tokenism so that women of color are fully engaged.”

  3. Negotiating Compensation. Understanding pay and compensation is critical to evaluating any job offer. Frequently, job candidates are reluctant to ask for additional compensation: they fear retribution, such as having the offer pulled and given to someone else, and worry about sounding greedy before even joining a new company. As Susanne found out after receiving her first traditional job offer, this can lead to lower salaries, especially when adjusting for location. In addition, Susanne points out the enormous gender pay gap at organizations: “It’s no secret that women—and specifically, women of color—are underpaid in about every industry, not just tech. While it is on companies to fix their approaches to compensation, it is our right and duty to demand fair compensation for our work.” A study of the technology industry by the job search marketplace Hired shows that Black women were paid $0.89 on the dollar compared to white males - the lowest figure across White, Asian, Black, and Hispanic men and women in the technology sector. For LGBTQI+ individuals, the wage gap is $0.90 for every $1 of compensation earned by non-LGBTQI+ individuals. While pay gap data for the Black LGBTQI+ community is under-studied, according to The National LGBTQ Task Force’s 2011 survey, 48% of trans and gender non-conforming Black individuals experienced discrimination in the hiring process. Outside of the technology industry, the pay gap is even starker, with Black women earning $0.62 for every dollar earned by a White male. To address many of these challenges, and to ensure that candidates get as close to a fair offer as possible, Tedrick lays out a framework for considering a new job, from pay to benefits to location. Tedrick advises individuals to first research local salaries for the role they are taking on. Armed with data, candidates should be confident, respectful, and flexible in all discussions and emphasize the unique value they bring to the organization.

Dig Deeper

  • Work Smart & Start Smart: Salary Negotiation for Women of Color

  • Anita Borg and the history of one of the largest professional organizations for women in technology

  • How the world’s most prevalent operating system was built by a 21-year-old in Finland

  • Black Girls Code: Empowering Young Black Women to Become Innovators

  • Tedrick’s Twitter, website, and talk with the Women’s National Book Association

tags: TCP/IP, VoIP, HTTP, Computer Networking, Linux, Red Hat, IBM, Susanne Tedrick, Coursera, IEEE Women in Engineering, Grit, Culture, Diversity, Women in Tech, Intersectionality, Facebook, Sheryl Sandberg, Michelle Obama, Gender Pay Gap, batch2
categories: Non-Fiction
 

August 2020 - Venture Deals by Brad Feld and Jason Mendelson

This month we checked out an excellent book for founders, investors, and those interested in private company financings. The book hits on a lot of the key business and legal terms that aren’t discussed in typical startup books, making it useful no matter what stage of the entrepreneurial journey you are on.

Tech Themes

  1. The Rise of Founder-Friendly VC. Writing on his blog, Feld Thoughts, which was the original genesis for Venture Deals, Brad Feld noted: “From 2010 forward, the entire VC market shifted into a mode that many describe as ‘founder friendly.’ Investor reputation mattered at both the angel and VC level.” In the ’80s and ’90s, because there was so little competition among venture capital firms, it was common for firms to dictate terms to company founders. The VC firms were the ones with the cash, and founders didn’t have many options to choose from. If you wanted to build a big, profitable, public company, the only way to get there was by taking venture capital money. This dynamic started to unwind during the internet bubble, when founders began to retain more and more control of their businesses prior to the IPO. In fact, as this Harvard Business Review article points out, it was once common to fire the founder/CEO before a public offering in favor of more seasoned leaders. Netscape bucked that convention, eschewing traditional wisdom by going public roughly sixteen months after its founding with an unprofitable business. The Netscape IPO was a royal coming-together of technology history. Tracing it all the way back: George Winthrop Fairchild co-founded CTR, the company that became IBM, in 1911; in the late ’50s, Arthur Rock convinced Fairchild’s son, Sherman, to fund the “traitorous eight” (eight employees who left competitor Shockley Semiconductor) in starting Fairchild Semiconductor; Eugene Kleiner (one of the traitorous eight) went on to start Kleiner Perkins, the venture capital firm that eventually invested in Netscape. Kleiner Perkins would also invest in Google (frequently regarded as one of the best and riskiest startup investments ever). Google was the first major internet company to go public with a dual-class share structure, in which the founders own a disproportionate share of the voting rights of the company.
Marc Andreessen, the co-founder of Netscape, loved this idea and eventually launched his own venture capital firm, Andreessen Horowitz, which ushered in a new generation of founder-friendly investing. At one point Andreessen was even quoted as saying: “It is unsafe to go public today without a dual-class share structure.” Notable companies with dual-class shares include several Andreessen Horowitz portfolio companies, such as Facebook, Zynga, Box, and Lyft. Recently, some have questioned whether founder-friendly terms have gone too far, pointing to major flameouts at companies with the structure, including Theranos, WeWork, and Uber.

  2. How to Raise Money. Feld makes several important recommendations for fundraising: have a target round size, a demo, financial projections, and a plan for the VC syndicate. Feld contends that CEOs who offer VCs a range of varying round sizes don’t really understand their business goals or use of proceeds. A concrete round size shows that the CEO understands roughly how much money it will take to get to the next milestone - or, said another way, the runway (in months) needed to build that new product or feature. It shows command of the financing and the vision of the business. Feld encourages founders to provide a demo because “while never required, many investors respond to things we can play with, so even if you are an early stage company, a prototype or demo is desirable.” Beyond the explicit point here, a demo shows confidence in the product and at least some ability to sell, which is obviously a key aspect of eventually scaling the business. Another aspect of scaling the business is the financial model, but as Feld states, “the only thing that can be known about a pre-revenue company’s financial projections is that they are wrong.” While the numbers are meaningless for really early-stage companies, for those with a few customers the model can help convey long-term gross margins and the aspects of the company you hope to invest in and/or change over time. Lastly, Feld gives advice for building a VC syndicate, or group of VC investors. Frequently, a lead investor will commit a certain dollar amount of the round, and it will be up to the founder/CEO to find a way to fill out the rest. This can be incredibly challenging, as detailed by Moz founder Rand Fishkin, who thought he had a deal in hand only to see it taken away. A VC fundraising process involves multiple bids: an indication of interest, which is non-binding and normally provides a range on valuation; a letter of intent, which is slightly more detailed and may include legal terms of the deal such as board representation, liquidation preference, and governance; and then final legal documentation. A lot of the time, early bids are withdrawn based on poor market feedback or when a company misses its financial projections (as Moz did during its process). Understanding the process and the materials needed to complete a deal helps founders set expectations.

  3. Warrants, SPACs, and IPOs. With SPAC mania in full swing, we wanted to dive into SPACs and see how they work. We’ve discussed SPACs before, with regard to Chamath Palihapitiya’s Social Capital merger with Virgin Galactic. But how do traditional SPAC financings work, and why is there a rush of famous people, such as LinkedIn founder Reid Hoffman, to raise them? A SPAC, or Special Purpose Acquisition Company, is a blank-check company that goes public with the goal of acquiring a business, thereby taking it public. SPACs can be focused on an industry or company size, and they are most frequently led by operational leaders and/or private equity firms. SPACs have been gaining popularity because public market investors are seeking more risk, and a few high-profile SPAC deals, namely DraftKings and Nikola, have traded better than expected. Most companies going public today are older, more mature businesses, and the public markets have been generally favorable to somewhat suspect ventures (Nikola is an electric truck company that has never produced a single truck but is worth $14B on hype alone). VC firms and companies see the ability to get outsized returns on their investments because so many people are clamoring for returns above the basically 0% offered by Treasury bonds. The S&P 500 P/E ratio is now around 26x, compared to a historical average of around 16x, meaning the market seems overvalued relative to prior periods. SPACs typically come with an odd structure. A unit in a SPAC normally consists of one common share of stock and a warrant (or a fraction of one), the right to purchase an additional share at a fixed exercise price (commonly $11.50) after the SPAC merges with its target company. The founders of the SPAC also receive founder shares, normally 20% of the post-IPO business, typically purchased for a nominal amount. Once the target is found, SPACs will often coordinate a PIPE (Private Investment in Public Equity), in which a large private investor invests mainly primary capital (cash to the balance sheet) into the business. This has emerged as a hip new alternative to traditional IPOs, keeping with the theme of innovation in public offerings like direct listings. However, it’s unclear that it really benefits the company going public: the merged company is often subject to substantial dilution by the SPAC sponsors and PIPE investors, lowering the overall equity stake management retains. Given the somewhat high valuations companies are receiving in the public markets (Zoom at 80x+ LTM revenue, Shopify at 59x LTM revenue), though, it may be worth the dilution.
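To make the dilution point concrete, here is a hedged Python sketch with invented numbers - the share counts, the half-warrant-per-unit assumption, and the PIPE size are all hypothetical, not from the book - showing how sponsors, PIPE investors, and warrant holders shrink the public investors' stake:

```python
# Hypothetical SPAC cap table (illustrative numbers only).
public_shares = 40_000_000           # units sold to public investors at ~$10
founder_shares = public_shares // 4  # the 20% "promote": 25% of public = 20% of pre-merger total
pipe_shares = 10_000_000             # assumed PIPE issued at merger close
warrant_shares = public_shares // 2  # assume half a warrant per unit, all exercised

total = public_shares + founder_shares + pipe_shares + warrant_shares
for name, n in [("public", public_shares), ("founders", founder_shares),
                ("PIPE", pipe_shares), ("warrants", warrant_shares)]:
    print(f"{name:8s} {n / total:6.1%}")  # each group's fully diluted stake
```

Under these assumptions, public investors who owned 80% of the SPAC at IPO end up with only 50% of the merged company before management's own shares are even counted.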

Business Themes

  1. How VCs Make Money. In VC, the typical fund structure includes a general partner (GP) and limited partners (LPs). The GPs are the investors at the VC firm, and the limited partners are the institutional investors that provide the money for the VC firm to invest. A typical structure involves the GP investing 1% of the fund’s capital (the other 99% comes from LPs) and then getting paid an annual 2% management fee as well as 20% carried interest, the GP’s share of the profit made from investments. Using the example from the book: “Start with the $100 million fund. Assume that it's a successful fund and returns 3× the capital, or $300 million. In this case, the first $100 million goes back to the LPs, and the remaining profit, or $200 million, is split 80 percent to the LPs and 20 percent to the GPs. The VC firm gets $40 million in carried interest and the LPs get the remaining $160 million. And yes, in this case everyone is very happy.” Understanding how investors make money can help the entrepreneur better understand why VCs pressure companies. As Feld points out, sometimes VCs are trying to raise a new fund or have already invested the majority of the current fund, and thus do not care as much about some investments.
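The book's worked example can be checked with a few lines of arithmetic; this Python sketch mirrors the $100 million fund, 3x return, and 20% carry from the quote above:

```python
# The book's example: a $100M fund returning 3x, with 20% carried interest.
fund_size = 100_000_000
gross_proceeds = 3 * fund_size           # $300M comes back to the fund
profit = gross_proceeds - fund_size      # LPs get their $100M back first
gp_carry = 0.20 * profit                 # GPs keep 20% of the profit
lp_total = fund_size + 0.80 * profit     # LPs: capital back + 80% of profit

print(f"GP carry: ${gp_carry:,.0f}")     # $40M, matching the book
print(f"LP total: ${lp_total:,.0f}")     # $100M returned + $160M profit
```

(The annual 2% management fee is left out here, as it is in the book's example; in practice it reduces the capital actually invested.)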

  2. Growth at All Costs. There has been a concerted focus in VC on the get-big-quick motto. Nobody exemplifies this better than Masayoshi Son and the $100B Vision Fund his firm SoftBank raised a few years ago. With notable big bets on current losers like WeWork and Oyo, which are struggling during this pandemic, it’s unclear whether the motto holds up. Eric Paley, a Managing Partner at Founder Collective, expertly quantifies the potential downsides of a risk-it-all strategy: “Investors today have overstuffed venture funds, and lots of capital is sloshing around the startup ecosystem. As a result, young startups with strong teams, compelling products and limited traction can find themselves with tens of millions of dollars, but without much real validation of their businesses. We see venture investors eagerly investing $20 million into a promising company, valuing it at $100 million, even if the startup only has a few million in net revenue. Now the investors and the founders have to make a decision — what should determine the speed at which this hypothetical company, let’s call it “Fuego,” invests its treasure chest of money in the amazing opportunity that motivated the investors? The investors’ goal over the next roughly 24 months is for the company to become worth at least three times the post-money valuation — so $300 million would be the new target pre-money valuation for Fuego’s next financing. Imagine being a company with only a few million in sales, with a success hurdle for your next round of $300 million pre-money. Whether the startup’s model is working or not, the mantra becomes ‘go big or go home.’” This issue is key when negotiating term sheets with investors and understanding board dynamics. As Feld calls out: “The voting control issues in the early stage deals are only amplified as you wrestle with how to keep control of your board when each lead investor per round wants a board seat. Either you can increase your board size to seven, nine, or more people (which usually effectively kills a well-functioning board), or more likely the board will be dominated by investors.” As an entrepreneur, you need to be cognizant of the pressure VC firms will put on founders to grow at high rates, and this pressure is frequently applied through the board. Late-stage startups often have 10+ people on their boards; UiPath, a private venture-backed startup that has raised over $1B and is valued at $10B, has 12 people on its board. With each firm pursuing its own goals, boards can become ineffective. Whenever startups consider fundraising, it’s important to realize that the person you raise from will be an ongoing presence at the company and a voice on the board, and will most likely push for growth.
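The arithmetic behind Paley's hypothetical "Fuego" is simple enough to sketch; treating the $100 million figure from his quote as the post-money valuation is an assumption, but it makes the hurdle explicit:

```python
# Arithmetic behind Paley's hypothetical "Fuego" example.
raise_amount = 20_000_000                       # $20M invested
post_money = 100_000_000                        # implied ~$80M pre-money + $20M raised
next_round_pre_money_target = 3 * post_money    # investors' ~3x target in ~24 months
investor_stake = raise_amount / post_money      # ownership the round buys

print(f"Investor stake: {investor_stake:.0%}")
print(f"Next-round target: ${next_round_pre_money_target:,.0f}")
```

A company with a few million in revenue now has to justify a $300M pre-money valuation in roughly two years, which is exactly the "go big or go home" pressure Paley describes.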

  3. Liquidation Preference. One of the least discussed terms in venture capital among startup circles is liquidation preference. Feld describes liquidation preference as: “a certain multiple of the original investment per share is returned to the investor before the common stock receives any consideration.” Startup culture has tended to view fundraises as stamps of approval and success, but that’s not always the case. As the book discusses, preference can lead to very negative outcomes for founders and employees. For example, let’s say a company at $10M in revenue raises $100 million with a 1x liquidation preference at a $400 million pre-money valuation ($500M post-money). The company is pressured by its VCs to grow quickly, but it has issues with product-market fit and go-to-market; five years go by, and the company is at $15M in revenue. At this point the VCs are not interested in funding it any further, and the board decides to try to sell the company. A buyer offers $80 million, and the board accepts. All $80M goes back to the investors who had the 1x liquidation preference; the common stockholders, including the founders, get nothing. It’s not the desired outcome by any means, but it’s important to understand. Some companies have not heeded this advice and continued to raise at massive valuations, including Notion, which raised $10M at an $800 million valuation despite being rumored to be at around $15M in revenue. The company then raised at a $1.6B valuation (an obvious 2x) after being rumored to be at $30M in revenue. While avoiding dilution is nice as a founder, it also sets up a massive hurdle for the company and seriously cramps returns. A 3x return (which is low for VC investors) means selling the company for $4.8B - no small feat.
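A rough Python sketch of the waterfall in this example makes the mechanics visible. This is deliberately simplified: it models only a single 1x-style preference paid ahead of common, and ignores the conversion option that real non-participating preferred stock would carry.

```python
# Simplified liquidation waterfall: preference is paid before common stock.
def simple_waterfall(sale_price: int, invested: int, multiple: float = 1.0):
    """Return (preferred_proceeds, common_proceeds) under a single preference."""
    preference = int(invested * multiple)
    preferred = min(sale_price, preference)  # preference holders are paid first
    common = sale_price - preferred          # whatever is left flows to common
    return preferred, common

# The example above: $100M invested at 1x, company sold for $80M.
print(simple_waterfall(80_000_000, 100_000_000))   # preferred takes it all
# A happier outcome: a $500M sale leaves $400M for common holders.
print(simple_waterfall(500_000_000, 100_000_000))
```

Running the first case shows the preferred investors receiving the full $80M and common (founders and employees) receiving zero, exactly as in the book's scenario.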

Dig Deeper

  • Feld Thoughts: Brad Feld’s Blog

  • The Ultimate Guide to Liquidation Preferences

  • Startup Boards: A deep dive by Mark Suster, VC at Upfront Ventures

  • The meeting that showed me the truth about VCs on TechCrunch

  • SPOTAK: The Six Traits Marc Lore Looks for When Hiring

tags: Uber, WeWork, Theranos, Fairchild Semiconductor, Netscape, Marc Andreessen, SPAC, Chamath Palihapitiya, Zynga, Box, Facebook, Brad Feld, Nikola, Draftkings, Zoom, Shopify, Warrants, Liquidation Preference, VC, Founder Collective, Oyo, UiPath, Notion, Softbank, batch2
categories: Non-Fiction
 

July 2020 - Innovator's Dilemma by Clayton Christensen

This month we review the technology classic, the Innovator’s Dilemma, by Clayton Christensen. The book attempts to answer the age-old question: why do dominant companies eventually fail?

Tech Themes

  1. The Actual Definition of Disruptive Technology. Disruption is a term that is frequently thrown around in Silicon Valley circles. Every startup thinks its technology is disruptive, meaning it changes how the customer currently performs a task or service. The actual definition, discussed in detail throughout the book, is relatively specific. Christensen re-emphasizes this distinction in a 2015 Harvard Business Review article: "Specifically, as incumbents focus on improving their products and services for their most demanding (and usually most profitable) customers, they exceed the needs of some segments and ignore the needs of others. Entrants that prove disruptive begin by successfully targeting those overlooked segments, gaining a foothold by delivering more-suitable functionality—frequently at a lower price. Incumbents, chasing higher profitability in more-demanding segments, tend not to respond vigorously. Entrants then move upmarket, delivering the performance that incumbents' mainstream customers require, while preserving the advantages that drove their early success. When mainstream customers start adopting the entrants' offerings in volume, disruption has occurred." The book posits that there are generally two types of innovation: sustaining and disruptive. While disruptive innovation focuses on low-end or new, small-market entry, sustaining innovation merely continues markets along their already determined axes. For example, in the book, Christensen discusses the disk drive industry, mapping out the jumps that packed more memory and power into each subsequent product release. For each disruptive jump there is a slew of sustaining jumps that improve product performance for existing customers but don't necessarily turn non-customers into customers. It is only when new use cases emerge, like rugged, portable disk usage and the arrival of PCs, that disruption occurs. Understanding the specific definition can help companies and individuals better navigate muddled tech messaging; Uber, for example, is shown to be a sustaining technology because its market already existed, and the company didn't offer lower prices or a new business model. Grasping the intricacies of the definition can also help incumbents spot truly disruptive competitors.

  2. Value Networks. Value networks are an underappreciated and somewhat confusing topic covered in The Innovator's Dilemma's early chapters. A value network is defined as "The context within which a firm identifies and responds to customers' needs, solves problems, procures input, reacts to competitors, and strives for profit." A value network seems all-encompassing on the surface. In reality, a value network serves to simplify the lens through which an organization must make complex decisions every day. Shown as a nested product architecture, a value network attempts to show where a company's product interacts with other products. By distilling the product down to its most atomic components (literally computer hardware), we can see all of the considerations that impact a business. Once we have this holistic view, we can consider the decisions and tradeoffs that face an organization every day. The takeaway here is that organizations care about different levels of performance for different products. For example, when looking at cloud computing services at AWS, Azure, or GCP, we see Amazon EC2 instances, Azure VMs, and Google Cloud VMs with different operating systems, different purposes (general, compute, memory), and different sizes. General-purpose might be fine for basic enterprise applications, while gaming applications might need compute-optimized, and real-time big data analytics may need a memory-optimized VM. While it gets somewhat forgotten throughout the book, this point means that an organization focused on producing only compute-intensive machines may not be well positioned to serve memory-intensive needs, because its existing customers may have no use for them. In the book's example, some customers of the larger-drive makers looked at smaller-capacity drives and said there was no need. In reality, there was massive demand in the rugged, portable market for smaller disks. When approaching disruptive innovation, it's essential to recognize your organization's current value network so that you don't target new technologies at customers who don't need them.

  3. Product Commoditization. Christensen spends a lot of time describing the dynamics of the disk drive industry, where companies continually supplied increasingly smaller drives with better performance. Christensen's description of commoditization is very interesting: "A product becomes a commodity within a specific market segment when the repeated changes in the basis of competition completely play themselves out, that is, when market needs on each attribute or dimension of performance have been fully satisfied by more than one available product." At this point, products begin competing primarily on price. In the disk drive industry, companies first competed on capacity, then on size, then on reliability, and finally on price. This price war is reminiscent of the current state of the Continuous Integration / Continuous Deployment (CI/CD) market, a subsegment of DevOps software. Companies in the space, including GitHub, CircleCI, GitLab, and others, are now competing primarily on price to win new business. Each of the cloud providers has similar technologies native to its public cloud offerings (AWS CodePipeline and CloudFormation, GitHub Actions, Google Cloud Build), and they can give these away for free because of their scale. The building block of CI/CD software is git, the open-source version control system created by Linux creator Linus Torvalds. With all the providers leveraging a massive open-source project, there is little room for true differentiation. Christensen even says: "It may, in fact, be the case that the product offerings of competitors in a market continue to be differentiated from each other. But differentiation loses its meaning when the features and functionality have exceeded what the market demands." Only time will tell whether these companies can pivot into burgeoning, highly differentiated technologies.

Business Themes

Innovator Dilemma.png
R1512B_BIG_MODEL-1200x1035.png
  1. Resources-Processes-Value (RPV) Framework. The RPV framework is a powerful lens for understanding the challenges that large businesses face. Companies have resources (people, assets, technology, product designs, brands, information, cash, relationships with customers, etc.) that can be transformed into greater value products and services. The way organizations go about converting these resources is the organization's processes. These processes can be formal (documented sales strategies, for example) or informal (culture and habitual routines). Processes are the big reason organizations struggle to deal with emerging technologies. Because culture and habit are ingrained in the organization, the same process used to launch into a mature, slow-growing market may be applied to a fast-growing, dynamic sector. Christensen puts it best: "This means the very mechanisms through which organizations create value are intrinsically inimical to change." Lastly, companies have values, or "the standards by which employees make prioritization decisions." When there is a mismatch between the resources, processes, and values of an organization and the product or market that the organization is chasing, it's rare that the business can compete successfully in the disruptive market. To see this misalignment in action, Christensen describes a meeting with a CEO who had identified the disruptive change happening in the disk-drive market and had gotten a product to market to meet the growing demand. In response to a publication showing the fast growth of the market, the CEO lamented to Christensen: "I know that's what they think, but they're wrong. There isn't a market. We've had that drive in our catalog for 18 months. Everyone knows we've got it, but nobody wants it." The issue was not the product or market demand, but the organization's values. 
As Christensen continues, "But among the employees, there was nothing about an $80 million, low-end market that solved the growth and profit problems of a multi-billion dollar company – especially when capable competitors were doing all they could to steal away the customers providing those billions. And way at the other end of the company there was nothing about supplying prototype quantities of 1.8-inch drives to an automaker that solved the problem of meeting the 1994 quotas of salespeople whose contacts and expertise were based so solidly in the computer industry." The CEO cared about the product, but his team did not. The RPV framework helps evaluate large companies and the challenges they face in launching new products.

  2. How to manage through technological change. Christensen points out three primary ways of managing through disruptive technology change: 1. "Acquire a different organization whose processes and values are a close match with the new task." 2. "Try to change the processes and values of the current organization." 3. "Separate out an independent organization and develop within it the new processes and values that are required to solve the new problem." Acquisitions are a way to get out ahead of disruptive change. There are many examples, but two recent ones come to mind: Microsoft's acquisition of Github and Facebook's acquisition of Instagram. Microsoft paid a whopping $7.5B for Github in 2018, when Github was rumored to be at roughly $200M in revenue (a 37.5x revenue multiple!). Github was undoubtedly a mature business with a great product, but it didn't have a ton of enterprise adoption. Diane Greene at Google Cloud tried to get Sundar Pichai to pay more, but he said no. Github has changed Azure's position within the market and continued Microsoft's anti-Amazon strategy of pushing open-source technology. In contrast to the Github acquisition, Instagram had only 13 employees when it was acquired for $1B. Zuckerberg saw the threat the social network represented to Facebook, and today the acquisition is regularly touted as one of the best ever. Instagram was developing a social network based solely on photographs, right at the time every person suddenly had an excellent smartphone camera in their pocket. The acquisition occurred right as the market was ballooning, and Facebook capitalized on that growth. The second way of managing technological change is through changing cultural norms. This is rarely successful, because you are fighting against all of the processes and values deeply embedded in the organization. 
Indra Nooyi cited a desire to move faster on culture as one of her biggest regrets as a young executive: "I’d say I was a little too respectful of the heritage and culture [of PepsiCo]. You’ve got to make a break with the past. I was more patient than I should’ve been. When you know you have to make a change, at some point you have to say enough is enough. The people who have been in the company for 20-30 years pull you down. If I had to do it all over again, I might have hastened the pace of change even more." Lastly, Christensen prescribes creating an independent organization matched to the resources, processes, and values that the new market requires. Three spin-out/spin-in examples, each a different flavor of this approach, come to mind. First, Cisco developed a spin-in practice whereby it would take members of its organization and fund a new company around them to develop a new product. The spin-ins worked for a time but caused major cultural issues. Second, as we've discussed, one of the key reasons AWS was born was that Chris Pinkham was in South Africa, thousands of miles away from Amazon Corporate in Seattle; this distance and that team's focus allowed it to come up with a major advance in computing. Lastly, Mastercard started Mastercard Labs a few years ago. CEO Ajay Banga told his team: "I need two commercial products in three years." He doesn't tell his CFO the group's budget, and he is the only person from his executive team who interacts with the unit. This separation of resources, processes, and values allows those smaller organizations to be more nimble in finding emerging technology products and markets.
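The 37.5x figure quoted for the Github deal is simple arithmetic worth making explicit: purchase price divided by trailing revenue. A quick sketch (the Github revenue figure is the rumored one cited above, not a confirmed number):

```python
# Sanity-check the revenue multiple quoted above: price paid divided by
# trailing revenue. The $200M Github revenue figure is the rumored one
# from the text, not an audited number.

def revenue_multiple(purchase_price, revenue):
    """Price-to-revenue multiple, e.g. $7.5B / $200M = 37.5x."""
    return purchase_price / revenue

github = revenue_multiple(7_500_000_000, 200_000_000)
print(f"{github:.1f}x")  # -> 37.5x
```

High multiples like this are how acquirers pay up for the disruptive market position itself rather than current financials; Instagram's $1B for 13 employees is the same logic taken to its extreme.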

  3. Discovering Emerging Markets.

    The resources-processes-values framework can also show us why established firms fail to address emerging markets. Established companies rely on formal budgeting and forecasting processes whereby resources are allocated based on market estimates and revenue forecasts. Christensen highlights several important factors for tackling emerging markets, including focusing on ideas, failure, and learning. Underpinning all of these ideas is the impossibility of predicting the scale and growth rate of disruptive technologies: "Experts' forecasts will always be wrong. It is simply impossible to predict with any useful degree of precision how disruptive products will be used or how large their markets will be." Because of this challenge, relying too heavily on these estimates to underpin financial projections can cause businesses to view initial market development as a failure, or as not worthy of the company's time. When HP launched a new 1.3-inch disk drive, which could be embedded in PDAs, the company mandated that its revenues had to scale to $150M within three years, in line with market estimates. That market never materialized, and the initiative was abandoned as a failed investment. Christensen argues that because disruptive markets are unknowable in advance, planning has to come after action, and thus strategic and financial planning must be discovery-based rather than execution-based. Companies should focus on learning their customers' needs and the right business model to attack the problem, rather than plan to execute their initial vision. As he puts it: "Research has shown, in fact, that the vast majority of successful new business ventures abandoned their original business strategies when they began implementing their initial plans and learned what would and would not work." One big fan of Christensen's work is Jeff Bezos, and it's easy to see why, given Amazon's focus on releasing new products in this discovery-driven manner. 
The pace of product releases is simply staggering (almost one per day). Bezos even talked about this exact issue in his 2016 shareholder letter: "The senior team at Amazon is determined to keep our decision-making velocity high. Speed matters in business – plus a high-velocity decision making environment is more fun too. We don't know all the answers, but here are some thoughts. First, never use a one-size-fits-all decision-making process. Many decisions are reversible, two-way doors. Those decisions can use a light-weight process. For those, so what if you're wrong? I wrote about this in more detail in last year's letter. Second, most decisions should probably be made with somewhere around 70% of the information you wish you had. If you wait for 90%, in most cases, you're probably being slow." Amazon is one of the first large organizations to truly embrace this decision-making style, and clearly the results speak for themselves.

Dig Deeper

  • What Jeff Bezos Tells His Executives To Read

  • Github Cuts Subscription Price by More Than Half

  • Ajay Banga Opening Address at MasterCard Innovation Forum 2014

  • Clayton Christensen Describing Disruptive Innovation

  • Why Cisco’s Spin-Ins Never Caught On

tags: Amazon, Google Cloud, Microsoft, Azure, Github, Gitlab, CircleCI, Pepsi, Jeff Bezos, Indra Nooyi, Mastercard, Ajay Banga, HP, Uber, RPV, Facebook, Instagram, Cisco, batch2
categories: Non-Fiction
 

June 2020 - Bad Blood by John Carreyrou

This month we review John Carreyrou’s chilling story of the epic meltdown of Theranos. We explore bad decision-making, the limits of technology, and the importance of strong corporate governance. The saddest part, and the reason Bad Blood hits so hard, is that Theranos was a startup that seemed to have everything: a breakthrough blood analyzer, tons of funding, excellent board representation, and a smart, visionary female CEO. But underneath, it was a twisted cult of distrust with an evil leader.

Tech Themes

  1. The limits of technology. Sometimes technology sounds too good to be true. Theranos’ Edison and miniLab blood analyzers were supposed to tell you everything you could ever want to know about your blood. But they didn’t work and never had a shot at working. Stanford professor Phyllis Gardner even told Elizabeth Holmes (Theranos’ founder/CEO) early on that an early patch-like design of the product would never work: “[Holmes] just kind of blinked and nodded and left. It was just a 19-year-old talking who’d taken one course in microfluidics, and she thought she was gonna make something of it.” The technology was debunked as wild fantasy by almost every scientist who examined it, even prior to its commercial use and subsequent fall from grace. There is something so human about wanting to believe there are no limits to technology. In today’s era of fake technology marketing, it’s easy for messaging to slowly take over a company if left unchecked. Think about Snap’s famous declaration, “Snap Inc. is a camera company,” or Dropbox’s S-1 mission statement: “Unleash the world’s creative energy by designing a more enlightened way of working.” These statements ignore what these businesses fundamentally do - advertising and storage. Sometimes there are massive leaps forward, like the transistor, networked computing, and the internet, but even these took many, many years to push to fruition. When we hear a compelling pitch, it is natural to want to wish away the limits of technology because the result is so astounding, but we have to remain skeptical or risk another Theranos.

  2. The reality distortion field. Elizabeth Holmes was obsessed with Steve Jobs. Mired in this deep fixation, she also managed to subscribe to one of Jobs’ signature habits: the reality distortion field. While we’ve discussed the reality distortion field before in relation to Jobs, Holmes took it to a new level. Jobs would demand something incredible be done, and much of the time his amazing team could come up with the solution. Holmes believed the same, but failed to consider two things: fundamental biology and her team. Biology, at its core, is just not as flexible as the hardware and software Apple was building. Jobs demanded an excellent product; Holmes demanded a biological impossibility. Beyond chasing a biological impossibility - which, to be frank, can occasionally materialize after years of research (see CRISPR) - Holmes operated the Theranos cult as a dictator, ruthlessly seeking out dissenters and punishing or firing them. While Jobs challenged his team repeatedly while being a huge asshole, the team for the most part stayed intact (Phil Schiller, Tony Fadell, Jony Ive, Scott Forstall, and Eddy Cue). There were certainly those who got fired or left, but Holmes’ active rooting out of non-believers severely limited the company’s chances of success. The levels of secrecy were extreme even for a stealth technology startup. Startup founders need to drink the kool-aid sometimes - it comes with being visionary - but getting so drunk on power and image can only lead to personal and business demise, as was the case with Theranos.

  3. When startups turn bad. Tons of startups fail, but only a few turn truly malicious. Theranos was one of those few. The company tested people’s blood and gave individuals fake, untested medical results, including indicators of cancer diagnoses! Even reviewing other major business failures and frauds - Jeff Skilling at Enron and Bernie Madoff’s Ponzi scheme - nothing compares to Theranos. While it could be argued that Enron and Madoff’s schemes did broader financial harm to society, at least they never physically endangered individuals. The only comparisons that may be warranted are Boeing and the Fyre Festival. The brainchild of famous clown Billy McFarland, Fyre Festival certainly endangered people by marooning them on an island with little food. Boeing’s incoherent internal review process, which knowingly led to the production of faulty airliner software, also endangered people - including the passengers of two flights that crashed because of that system. Did Elizabeth Holmes set out to build a dangerous device, knowingly defraud investors, and endanger the public? Probably not. It was one decision after another. It was firing CFO Henry Mosley, who called out fake projections; it was hiring Boies Schiller to pressure former employees; it was enlisting Sunny Balwani to “run” the company. It was what Clayton Christensen calls marginal thinking - judging each choice by its incremental cost rather than the full cost of the path it commits you to. Firing the CFO who wouldn’t fake the numbers was simply easier than facing the difficult reality that the product didn’t work and that too much investor money had been burned to start again. When things turn bad, at startups or other businesses, a trail of marginal decision-making can normally be found.

Business Themes

elizabeth-holmes1.0.jpg
AYX.png
PS.png
  1. The Pressure to Succeed. Stress seems to be part of business, but the pressure can sometimes get too big to handle. Public companies, in particular, face growth targets from Wall Street analysts and investors. One earnings miss, or even a more modest beat than expected, can completely derail a stock (see the Pluralsight and Alteryx graphs to the right). Public company CEOs and CFOs can be fired or have compensation withheld for poor stock performance. So when a young, hot biotechnology startup wanted to launch a partnership with Walgreens, Dr. J and the Walgreens team were more than ready to fast-track the potential partnership. Despite not being allowed to see the lab, view even a partial demo of the product, or in one case even use the bathroom, the Walgreens team pushed through a deal so that longtime competitor CVS wouldn’t get it. As the then-head of the Theranos/Walgreens pilot said, "We can’t not pursue this. We can’t risk a scenario where CVS has a deal with them in six months and it ends up being real.” When the partnership was announced, even the press release sounded oddly formulaic: “Theranos’ proprietary laboratory infrastructure minimizes human error through extensive automation to produce high quality results.” There was no demo. There was no product. There was only pressure at Walgreens to beat CVS and pressure at Theranos to make something of a fake device.

  2. The Importance of Corporate Governance. Corporate governance has historically rarely been discussed outside academic settings but has come into sharper focus over the past few years. Some have recently tried to bring prominent corporate governance issues, such as director compensation and option grants for executives, to the forefront. Warren Buffett even commented on boards in his 2019 annual shareholder letter: “Director compensation has now soared to a level that inevitably makes pay a subconscious factor affecting the behavior of many non-wealthy members. Think, for a moment, of the director earning $250,000-300,000 for board meetings consuming a pleasant couple of days six or so times a year. And job security now? It’s fabulous. Board members may get politely ignored, but they seldom get fired. Instead, generous age limits – usually 70 or higher – act as the standard method for the genteel ejection of directors.” Boards are meant to help guide the company through strategic challenges, ensure the business is focused on the right things, and evaluate the CEO. Theranos’ Board of Directors was a laughable hodgepodge of old white men: George P. Shultz (former U.S. Secretary of State), William Perry (former U.S. Secretary of Defense), Henry Kissinger (former U.S. Secretary of State), Sam Nunn (former U.S. Senator), Bill Frist (former U.S. Senator and heart-transplant surgeon), Gary Roughead (Admiral, USN, retired), James Mattis (General, USMC), Richard Kovacevich (former Wells Fargo Chairman and CEO), and Riley Bechtel (former Bechtel Group Chairman and CEO). The average age of the directors in 2012 was ~72 years old, and few of these men could offer real strategic guidance on novel biotechnology. On top of that, as Carreyrou points out, “In December 2013, [Holmes] forced through a resolution that assigned one hundred votes to every share she owned, giving her 99.7% of the voting rights.” George Shultz even said later in a deposition, “We never took any votes at Theranos. 
It was pointless. Elizabeth was going to decide whatever she decided.” The episode brings more clarity to those CEOs and companies who hide behind their boards of directors, promising governance for investors but rarely delivering anything beyond pandering to the CEO’s whims. Even Apple and Steve Jobs have been accused of shoddy corporate governance: Apple famously backdated options granted to Jobs, handing him an instant paper profit, and did not even bother to properly report the grants. The best companies are not immune, and investors and employees should be aware of the qualifications and monetary interests of a company’s board members.

  3. Search and Destroy. Only the Paranoid Survive, right? Wrong. There is such a thing as too much paranoia. When you combine that paranoia with a manipulative persona, you get Elizabeth Holmes. It’s hard to believe that any startup or founder would need the level of security and secrecy that dominated the culture at Theranos. The list of weird security and legal gray areas includes: personal security for Holmes, laboratory-developed tests (instead of FDA-approved tests), copious and vigorously enforced NDAs, siloed teams with no communication, and false representations in the media. Secrecy itself isn’t unusual - many startups operate in stealth to avoid giving away details to competitors, and some larger companies launch new divisions in separate locations from their offices, like Amazon’s A9. But Theranos went much further, hiring private investigators (through its powerful law firm, Boies Schiller) to threaten and track former employees, including Erika Cheung and Tyler Shultz. Tyler Shultz, grandson of board member George Shultz, was one of the key informants for author John Carreyrou. After he accused Elizabeth and Sunny of lying and potentially harming patients, he resigned and tried to convince his grandfather that it was all a sham. His grandfather agreed to speak with him one-on-one, then at the end of the conversation surprised Tyler with two attorneys from Boies Schiller, who all but forced Tyler to sign a confidentiality agreement. Tyler refused, which eventually led to the publication of Carreyrou’s first article. As early board member Avie Tevanian put it, “I had seen so many things that were bad go on. I would never expect anyone would behave the way that she behaved as a CEO. And believe me, I worked for Steve Jobs. I saw some crazy things. But Elizabeth took it to a new level.” Again, sadly, while Theranos may be the pinnacle of secrecy, paranoia, and threatening behavior, it is not alone: eBay recently fired six employees for threatening online reviewers. 
On top of sending live spiders to the reviewers’ households, eBay team members would knock on their doors day or night to scare them. How could these employees think this was ok? How could Elizabeth partake in this threatening and manipulative behavior? As organizational behavior professor Roderick Kramer reminds us: “‘Reality’ is not a fixed entity but rather a tissue of facts, impressions, and interpretations that can be manipulated and perverted by clever and devious businesses and governments.” Theranos’ fake Edison tests are reminiscent of Enron’s fake trading floor, where 70 low-level employees once pretended to be busy to impress Wall Street analysts. Paranoia and secrecy are powerful weapons when left unchecked, and clearly Theranos wielded those weapons to the fullest extent.

Dig Deeper

  • HBO Documentary: “The Inventor: Out for Blood in Silicon Valley” has many interviews and deep analysis on Theranos

  • When Paranoia Makes Sense by Organizational Behavior Professor Roderick Kramer

  • Theranos criminal trial set to begin March 9, 2021

  • Ex-Theranos CEO Elizabeth Holmes says 'I don't know' 600-plus times in never-before-broadcast deposition tapes

  • Holmes’ famous Mad Money Interview: “First they think you're crazy, then they fight you, and then all of a sudden you change the world.”

  • Theranos’ still active Twitter account

tags: Theranos, Elizabeth Holmes, Sunny Balwani, Apple, Steve Jobs, Snap, Dropbox, Stanford, Reality distortion field, Fyre Festival, Boeing, Billy McFarland, Jeff Skilling, Enron, Boies Schiller, Clayton Christensen, Walgreens, CVS, Warren Buffett, George Schulz, batch2
categories: Non-Fiction
 

April 2020 - Good To Great by Jim Collins

Collins’ book attempts to answer the question - Why do good companies continue to be good companies? His analysis across several different industries provides meaningful insights into strong management and strategic practices.

Tech Themes

  1. Packard’s Law. We’ve discussed Packard’s law before when analyzing the troubling acquisition history of AOL-Time Warner and Yahoo. As a reminder, Packard’s law states: “No company can consistently grow revenues faster than its ability to get enough of the right people to implement that growth and still become a great company. [And] If a company consistently grows revenue faster than its ability to get enough of the right people to implement that growth, it will not simply stagnate; it will fall.” Given that Good To Great is a management-focused book, I wanted to explore an example of this law manifesting itself in a recent management dilemma. Look no further than ride-sharing giant Uber. Uber’s culture and management problems have been highly publicized. Susan Fowler’s famous blog post kicked off a series of blows that would ultimately lead to a board dispute, the departure of its CEO, and a full-on criminal investigation. Uber’s problems as a company, however, can be traced to its insistence on being the only ride-sharing service throughout the world. Uber launched several ventures that proved incredibly unprofitable: not only a price war with its local competitor Lyft, but also concerted pushes into China, India, and other markets. Uber tried to be all things transportation to every location in the world, an over-indulgence that led to the Company raising a casual $20B prior to going public. Dara Khosrowshahi, Uber’s replacement for Travis Kalanick, has deliberately sold off several business lines and shuttered other unprofitable ventures to regain financial control of this formerly money-burning “logistics” pit. This unwinding has clearly benefited the business, but it has also limited growth, prompting the stock to drop significantly from its IPO price. Dara is no stranger to travel-industry challenges; he architected the spin-out of Expedia with Barry Diller, right before 9/11. 
Only time will tell if he can refocus the Company as it looks to run profitably. Uber pushed too far into unprofitable markets, ran head-on into Packard’s law, and is now paying the price for its brash expansion.

  2. Technology Accelerators. In Collins’ Good to Great framework (pictured below), technology accelerators act as a catalyst to the momentum built up from disciplined people and disciplined thought. By adopting a “pause, think, crawl, walk, run” approach to technology - a slow and thoughtful transition to new technologies - companies can establish best practices for the long term, instead of chasing short-term gains from technology faux-feature marketing. Faux-feature marketing, in which a company’s marketing position is completely decoupled from its actual technological sophistication, has become increasingly popular in the past few years. Look no further than the blockchain/crypto faux-feature marketing around 2018, when Long Island Iced Tea Corp. changed its name to Long Blockchain Corp., reminiscent of companies adding “.com” to their names in the dot-com era. Collins makes several important distinctions about technology accelerators: technology should only be a focus if it fits into a company’s hedgehog concept, technology accelerators cannot make up for poor people choices, and technology is never a primary root cause of either greatness or decline. The first two axioms make sense; just think of how many failed, custom software projects have begun and never finished - there is literally an entire Wikipedia page dedicated to exactly that. The government has also reportedly been a famous dabbler in homegrown, highly customized technology. As Collins notes, technology accelerators cannot make up for bad people choices, an aspect of venture capital that is overlooked by many. Enron is a great example of an interesting idea turned sour by terrible leadership. Beyond the accounting scandals that are discussed frequently, the culture was utterly toxic, with employees subjected to a “Performance Review Committee” that rated them on a scale of 1-5 against their peers. 
Employees rated a 5 were fired, which meant roughly 15% of the workforce turned over every year. The New York Times noted that Enron is still viewed as a trailblazer for the way it combined technology and energy services, but it clearly suffered from terrible leadership that even great technology couldn’t surmount. Collins’ most controversial point is arguably that technology cannot cause greatness or decline. Some would argue that technology is the primary cause of greatness for companies like Amazon, Apple, Google, and Microsoft. The “it was just a better search engine” argument abounds in discussions of early internet search engines. I think what Collins is getting at is that technology is malleable and can be built several different ways. Zoom and Cloudflare are great examples of this. As we’ve discussed, Zoom started over 100 years after the idea for video calling was first conceived, and several years after Cisco had purchased Webex, which raises the question: was technology the cause of greatness for Zoom? No! Zoom’s ultimate success came from the elegance of its simple video chat, something which had been locked up in corporate feature complexity for years. Cloudflare presents another great example. CDN businesses had existed for years when Cloudflare launched, and Cloudflare famously embedded security within the CDN, building on a trend Akamai tried to address via M&A. Was technology the cause of greatness for Cloudflare? No! It’s way cheaper and easier to use than Akamai. Its cost structure enabled it to compete for customers that would be unprofitable for Akamai, a classic low-end disruption straight out of Clayton Christensen’s Innovator’s Dilemma. This is not to say these are not technologically sophisticated companies - Zoom’s cloud ops team has kept an amazing service running 24/7 despite a massive increase in users, and Cloudflare’s Workers technology is probably the best bet to disrupt the traditional cloud providers today. 
But to place technology as the sole cause of greatness would understate these companies’ achievements in several other areas.

  3. Build up, Breakthrough Flywheel. Jeff Bezos loves this book. It’s listed in the continued reading section of a prior TBOTM, The Everything Store. The build up, breakthrough flywheel is the culmination of disciplined people, disciplined thought, and disciplined action. Collins points out that several great companies frequently appear like overnight successes: all of a sudden, the company has created something great. But that’s rarely the case. Amazon is a great example of this; it had several detractors in the early days and was dismissed as simply an online bookseller. Little did the world know that Jeff Bezos had ideas to pursue every product line, and he slowly launched one after the other in a concerted fashion. In addition, what is a better technology accelerator than AWS! AWS resulted from an internal problem: scaling compute fast enough to meet growing consumer demand for Amazon’s online products. The company’s tech helped it scale so well that they thought, “Hey! Other companies would probably like this!” Apple is another classic example of a build up, breakthrough flywheel. The company had a massive success with the iPod - it was 40% of revenues in 2007. But what did it do? It cannibalized itself and pursued the iPhone, with several different teams within the company pursuing it individually. Not only that, its first attempt at an Apple phone, the Motorola Rokr partnership, was terrible, teaching the company that design was massively important to a phone’s success. The phone’s technology is taken for granted today, but at the time the touch screen was simply magical!

Business Themes

goodtogreatflywheel.png
Ipod_sales.jpeg
Hedgehog-Concept_v2.jpg
Slides-Character-And-Concrete-Actions-Shape-A-Culture.005.png
  1. Level 5 Leader. The first and probably the most important part of the build up, breakthrough flywheel is disciplined people. One aspect of Good to Great that inspired Collins’ other book, Built to Last, is the idea that leadership, people, and culture determine the long-term future of a business, even after current leadership has moved on. To set an organization up for long-term success, executives need to display level five leadership, a mix of personal humility and professional will. Collins leans on Lee Iacocca as an example of a poor leader, one who focused more on personal celebrity and left Chrysler to fail when he departed. Level five leadership requires something you don’t frequently see in technology business leaders: humility. The technology industry seems littered with far more Larry Ellisons and Elon Musks than any other industry - or maybe it’s just that tech CEOs tend to shout the loudest from their pedestals. One CEO who has done a great job of exemplifying level five leadership is Shantanu Narayen, who took the reins of Adobe in December 2007, right on the cusp of the financial crisis. Narayen, who’s been described as more of a doer than a talker, has dramatically changed Adobe’s revenue model, moving the business from single-sale license software focused on lower ACV numbers to an enterprise-focused SaaS model. This march has been slow and pragmatic, but the business has done incredibly well, 10xing since he took over. Adobe CFO Mark Garrett summarized it best in a 2015 McKinsey interview: “We instituted open dialogue with employees—here’s what we’re going through, here’s what it might look like—and we encouraged debate. Not everyone stayed, but those who did were committed to the cloud model.”

  2. Hedgehog Concept. The Hedgehog Concept is the overlap of three questions: What are you passionate about? What can you be the best in the world at? And what drives your economic engine? The overlap follows from Collins' directive to Confront the Brutal Facts, something Ben Horowitz also emphasizes in March's TBOTM. Once teams have dug into their business, they should come up with a simple way to center their focus; when companies reach outside their hedgehog concept, they get hurt. The first question, about organizational passion, manifests itself in mission and value statements. The best-in-the-world question manifests itself through value network exercises, SWOT analyses, and competitive analyses. The economic engine is typically expressed as a single metric that defines success in the organization. Let's walk through a less well-known SaaS company: Avalara. Avalara provides tax compliance software for SMBs and enterprises, allowing those businesses to outsource complex and changing tax rules to software that integrates with financial management systems and provides an accurate view of corporate taxes. Avalara's hedgehog concept is right on its website: “We live and breathe tax compliance so you don't have to.” It's simple and effective. The company lists a slightly different version in its 10-K: “Avalara’s motto is ‘Tax compliance done right.’” Avalara is the best at tax compliance software, and that is its passion; it “lives and breathes” tax compliance. What drives Avalara's economic engine? The company lists two metrics right at the top of its SEC filings: number of core customers and net revenue retention. Core customers are customers who have been billed more than $3,000 in the last twelve months. The growth in core customers allows Avalara to understand its base of revenue.
Tax compliance software is likely low-churn because filing taxes is such an onerous process, and most people don't have the expertise to handle corporate taxes themselves. Avalara will, however, see some tax seasonality, and some customers may churn and come back after the tax period has ended for a given year; billings-based thresholds let Avalara account for this possibility. Avalara's core customers have grown 32% in the last twelve months, meaning revenue should be following a similar trajectory. Net retention shows how customer purchasing behavior changes over time, and at 113% net retention, Avalara's overall base is buying more software than it is churning, a positive trend for the company. What is the company the best in the world at? Tax compliance software for SMBs. Avalara defines a core customer as one billed more than $3,000 over the trailing twelve months, which means it is targeting small customers. The company's integrations also speak to this - Shopify, Magento, NetSuite, and Stripe are all focused on SMB and mid-market customers. Notice that neither SAP nor Oracle ERP, the financial management providers that target large enterprises, is on that list. Avalara has set up its product and cost structure to ensure long-term profitability in the SMB segment; the enterprise segment is on the horizon, but today the focus is SMBs.
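The two metrics above are simple enough to sketch in code. This is a minimal illustration with made-up cohort numbers, not Avalara's actual data; only the $3,000 core-customer threshold and the 113% retention figure come from the filings discussed above.

```python
# Illustrative sketch of the two metrics Avalara reports in its SEC filings.
# A "core customer" is one billed more than $3,000 over the trailing twelve months.

def core_customers(billings_ttm, threshold=3000):
    """Count customers billed more than `threshold` in the trailing twelve months."""
    return sum(1 for b in billings_ttm if b > threshold)

def net_revenue_retention(cohort_revenue_prior, cohort_revenue_current):
    """Revenue today from the same cohort of customers a year ago, divided by what
    that cohort paid then (churned customers count as $0). Above 1.0 means
    expansion is outpacing churn."""
    return cohort_revenue_current / cohort_revenue_prior

# Hypothetical cohort: last year's customers paid $1.0M; this year the same
# cohort pays $1.13M in aggregate.
nrr = net_revenue_retention(1_000_000, 1_130_000)
print(f"{nrr:.0%}")  # 113%, the figure Avalara reported

# Hypothetical TTM billings for five customers; three clear the $3,000 bar.
print(core_customers([5200, 2800, 3100, 12000, 950]))  # 3
```

The point of counting billings rather than a point-in-time subscription is exactly the seasonality issue above: a customer who pauses after tax season still shows up in trailing-twelve-month billings.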

  3. Culture of Discipline. Collins describes a culture of discipline as managers' ability to have open, honest, often confrontational conversations. That discipline has to fit within a culture of freedom, so that individuals feel responsible for their part of the business. A culture of discipline is one of the first things to break down when a CEO leaves. Collins points to this issue with Lee Iacocca, the former CEO of Chrysler, who built an intense culture of corporate favoritism that completely unraveled after he left the business. This is also the focus of Collins' other book, Built to Last. Companies don't die overnight, yet it seems that way when problems begin to abound company-wide. We've analyzed HP's 20-year downfall, and a similar story can be told about IBM. In 1993, IBM named Lou Gerstner CEO. Gerstner was an outsider to technology businesses, having previously led the highly controversial RJR Nabisco after KKR completed its buyout in 1989. He has been credited with enacting wholesale changes to the company's culture during his tenure. Despite the stock price increasing significantly under Gerstner, the business lost significant market share to Microsoft, Apple, and Dell. Gerstner was also the first IBM CEO to make significant income, personally earning hundreds of millions over his tenure. Following Gerstner, IBM chose insider Sam Palmisano to lead the company. Palmisano pushed IBM into several new business lines, acquired 25 software companies, and famously sold off IBM's PC division, which turned out to be an excellent strategic decision as PC sales and margins declined over the following ten years. Interestingly, Palmisano's goal was to "leave [IBM] better than when I got there." He presided over a strong run-up in the stock, but yet again IBM severely missed the broad strategic shift toward public cloud. In 2012, Ginni Rometty was named CEO.
Rometty had championed IBM's large purchase of PwC's technology consulting business, turning IBM into more of a full-service organization than a technology company. Palmisano has an interesting quote from an interview with a Wharton business school professor where he discusses IBM's strategy: “The thing I learned about Lou is that other than his phenomenal analytical capability, which is almost unmatched, Lou always had the ability to put the market or the client first. So the analysis always started from the outside in. You could say that goes back to connecting with the marketplace or the customer, but the point of it was to get the company and the analysis focused on outside in, not inside out. I think when you miss these shifts, you’re inside out. If you’re outside in, you don’t miss the shifts. They’re going to hit you. Now acting on them is a different characteristic. But you can’t miss the shift if you’re outside in. If you’re inside out, it’s easy to delude yourself. So he taught me the importance of always taking the view of outside in.” Palmisano's period of leadership introduced a myriad of organizational changes, 110+ acquisitions, and a centralization of IBM's processes globally. Rometty learned from Palmisano that acquisitions were key to growth, but IBM was buying into markets it didn't fully understand, and when she layered on 25 new acquisitions in her first two years, the company had to shift from an outside-in perspective to an inside-out one. The way IBM had historically handled the outside-in perspective - recognizing shifts and getting ahead of them - was through acquisition. But when acquisitions occurred at such a rapid pace, and in new markets, the organization got bogged down in a process of digestion. Furthermore, centralizing processes and acquired businesses is the exact opposite of what Clayton Christensen recommends when pursuing disruptive technology. It becomes obvious why IBM was so late to the cloud game.
This was a mainframe and services company that had acquired hundreds of software businesses it didn't really understand. Instead of building on these software platforms, it wasted years trying to bundle them into a digestible package for its customers. IBM launched its public cloud offering in June 2014, years after Microsoft, Amazon, and Google launched their services, despite providing the underlying databases and computing power for its enterprise customers. Gerstner established the highly paid, glamorous CEO role at IBM, which Palmisano and Rometty stepped into, complete with corporate jets and generous expense policies. The company favored growing revenues and profits (via acquisition) over recognizing and focusing on a strategic market shift, which led to a falling stock price and declining mindshare in the enterprise. Collins understands the importance of long-term cultural leadership: "Does Palmisano think he could have done anything differently to set IBM up for success once he left? Not really. What has happened since falls to a new coach, a new team, he says."

Dig Deeper

  • Level 5 Leadership from Darwin Smith at Kimberly Clark

  • From Good to Great … to Below Average by Steven Levitt - Unpacking underperformance from some of the companies Collins studied

  • The Challenges faced by new CEO Arvind Krishna

  • Overview of Cloudflare Workers

  • The Opposite of the Buildup, Breakthrough, Flywheel - the Doom Loop

tags: IBM, Apple, Microsoft, Packard's Law, HP, Uber, Barry Diller, Enron, Zoom, Cloudflare, Innovator's Dilemma, Clayton Christensen, Jeff Bezos, Amazon, Larry Ellison, Adobe, Shantanu Narayen, Avalara, Hedgehog Concept, batch2
categories: Non-Fiction
 

March 2020 - The Hard Thing About Hard Things by Ben Horowitz

Ben Horowitz, GP of the famous investment fund Andreessen Horowitz, addresses the not-so-pleasant aspects of being a founder/CEO during a crisis. This book provides an excellent framework for anyone going through the struggles of scaling a business and dealing with growing pains.

Tech Themes

  1. The importance of Netscape. Now that it has been absorbed into AOL and relegated to history by the rise of Internet Explorer, it's hard to believe that Netscape was ever the best web browser. Co-founded by Marc Andreessen, who had co-created the pioneering Mosaic browser as a college student, Netscape went on to achieve amazing success, only to blow up in the face of competition and changes to internet infrastructure. Netscape was an incredible technology company, and as Brian McCullough shows in last month's TBOTM, Netscape was the poster child for the internet bubble. But for all the fanfare around Netscape's seminal IPO, little is said about its massive and longstanding technological contributions. In 1995, early engineer Brendan Eich created JavaScript, which still stands as the dominant front-end language for the web. The same year, the company developed Secure Sockets Layer (SSL), the foundational internet security protocol (and the reason for HTTPS). On top of those two fundamental technologies, Netscape also invented the internet cookie, in 1994! Netscape is normally remembered as the amazing company that ushered many of the first internet users onto the web, but it's rarely lauded for these longstanding technological contributions. Ben Horowitz, author of The Hard Thing About Hard Things, was an early employee who ran Netscape's server business unit when the company went public.

  2. Executing a pivot. Famous pivots have become part of startup lore, whether in product (Glitch, a video game —> Slack, chat), business model (Netflix DVD rental —> streaming), or some combination of both (Snowdevil, selling snowboards online —> Shopify, ecommerce tech). The pivot has been hailed as a necessary tool in every entrepreneur's toolbox. Though many are sensationalized, the pivot Ben Horowitz executed at LoudCloud / Opsware is an underrated one. LoudCloud was a provider of web hosting and managed services for enterprises. The Company raised a boatload of money ($346M) before going public in March 2001, after the internet bubble had already burst. The Company was losing a lot of money, and Ben knew the business was on its last legs. After executing a 400-person layoff, he sold the managed services part of the business to EDS, a large IT provider, for $63.5M. LoudCloud had a software tool called Opsware that it used to manage the complexities of the web hosting business - scaling infrastructure with demand and managing compliance in data centers. After the sale, the company's stock fell to $0.35 per share, even trading below cash, which meant the markets viewed the Company as already bankrupt. But the sale did something very important for Ben and the Opsware team: it bought them time. The Company had enough cash on hand to operate until Q4 2001, when it had to be cash flow positive. To stretch that runway, Opsware purchased Tangram, Rendition Networks, and Creekpath, all software vendors that helped manage the software of data centers. This had two effects: slowing the burn (these were profitable companies) and building a substantial product offering for data center providers. Opsware started making sales and the stock price began to tick up, piquing the attention of strategic acquirers. Ultimately it came down to BMC Software and HP. BMC offered $13.25 per share; the Opsware board said $14; BMC countered with $13.50; and HP came in with a $14.25 offer, a 38% premium to the stock price and a total valuation of $1.6B, which the board could not refuse. The Company changed its business model (services —> software), made acquisitions, and successfully exited amidst a terrible environment for tech companies after the internet bubble.

  3. The Demise of the Great HP. Hewlett-Packard was one of the first garage-born Silicon Valley technology companies. Bill Hewlett and Dave Packard founded it in Palo Alto in 1939 as a provider of test and measurement instruments. Over the next forty years, the company moved into producing some of the best printers, scanners, calculators, logic analyzers, and computers in the world. In the 90s, HP continued to grow its product lines in computing, and in 1999 it spun out its test-and-measurement / non-computing device business. That year marks the tragic beginning of the end for HP. The first massive mistake was the acquisition of Compaq, a flailing competitor in the personal computer market that had itself acquired DEC, a declining minicomputer maker, a few years earlier. The acquisition was heavily debated, with Walter Hewlett, son of the founder and a board director at the time, waging a proxy battle against then-CEO Carly Fiorina. The new HP went on to lose half of its market value and suffer heavy, highly publicized job losses. This started a string of terrible acquisitions - EDS, 3Com, Palm, and Autonomy - for a combined $28.8B. The Company split into two in 2015 - HP Inc. and Hewlett Packard Enterprise - and each had its own spinouts and mergers from there (Micro Focus and DXC Technology). Today, HP Inc. sells computers and printers, and HPE sells storage, networking, and server technology. What can be made of this sad tale? HP suffered from a few things. First, poor long-term direction: in hindsight its acquisitions look especially terrible, a repeated series of massive bets on technology already being phased out by market pressures. Second, HP had horrible corporate governance through the late 90s and 2000s: board infighting over acquisitions, repeated CEO firings over cultural issues, chairman-CEOs with no checks, and an inability to see the outright fraud in the Autonomy acquisition. Lastly, the Company treated acquisitions and divestitures as band-aids. New CEO entrants Carly Fiorina (from AT&T), Mark Hurd (from NCR), Leo Apotheker (from SAP), and Meg Whitman (from eBay) each wanted to make an impact at HP, which meant big acquisitions and strategic shifts. Almost none of these panned out, and the repeated shifts in direction took a toll on the organization as the best talent moved elsewhere. It's sad to see what has happened at a once-great company.

Business Themes

  1. Ill, not sick: going public at the end of the internet bubble. Going public is supposed to be the culmination of a long entrepreneurial journey for early company employees, but in Ben Horowitz's experience, going public as the internet bubble popped was terrible. Loudcloud had tried to raise money privately but struggled, given the terrible fundraising conditions at the beginning of 2001. It's not in the book, but the reason the Company failed to raise was its obscene valuation and losses: it had been valued at $1.15B in its prior funding round yet could only report $6M in net revenue against a $107M loss. The Company sought to go public at $10 per share (a $700M valuation), but after an intense and brutal roadshow that left Horowitz physically sick, it settled for $6.00 per share, a massive write-down from the previous round. That the banks could even find investors to take on this risk at that point in the business cycle was a marvel. Timing can be crucial in an IPO, as we saw during the internet bubble, when internet "businesses" could rise 4-5x on their first trading day thanks to the massive, silly web land grab of the late 90s. On the flip side, going public when investors don't want what you're selling is almost a death sentence. Although they both have critical business and market issues, WeWork and Casper are clear examples of the importance of timing. Both were late arrivals on the unicorn IPO train. Let me be clear - both have huge issues (WeWork, its fundamental business model; Casper, competition and differentiation) - but I could imagine these types of companies going public during a favorable period with relatively strong IPOs. Both companies had massive losses, and investors were especially wary of losses after the disappointing IPOs of Lyft and Uber, arguably the most famous unicorns to go public at the time. That's not to say WeWork and Casper wouldn't have had trouble in the public markets, but during the internet bubble these companies could have received massive valuations and raised tons of cash instead of seeking bailouts from SoftBank and reticent public market investors.
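The size of Loudcloud's write-down falls out of the numbers above. The share count below is implied from the $10 / $700M figures, not something reported in the text, so treat this as back-of-the-envelope only:

```python
# Back-of-the-envelope on Loudcloud's IPO pricing, using figures from the text.
# The share count is implied from the target price and valuation, not reported.
target_price = 10.00
target_valuation = 700_000_000

shares_outstanding = target_valuation / target_price  # ~70M shares implied
final_price = 6.00
final_valuation = shares_outstanding * final_price

print(f"${final_valuation / 1e6:.0f}M")  # ~$420M at pricing, down from a $1.15B private round
```

On these assumptions the company priced at roughly $420M, about a 63% haircut from its $1.15B private-round valuation.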

  2. Peacetime / Wartime CEO. The genesis of this book was a 2011 blog post by Horowitz detailing peacetime and wartime CEO behavior. As the book and blog post describe, “Peacetime in business means those times when a company has a large advantage vs. the competition in its core market, and its market is growing. In times of peace, the company can focus on expanding the market and reinforcing the company’s strengths.” To describe wartime, Horowitz uses the example of a previous TBOTM, Only the Paranoid Survive, by Andy Grove. In the early 1980s, Grove realized his business was under serious threat as competition increased in Intel's core business, computer memory. Grove shifted the entire organization wholeheartedly into microprocessors and saved the company. Horowitz outlines several opposing behaviors of peacetime and wartime CEOs: “Peacetime CEO knows that proper protocol leads to winning. Wartime CEO violates protocol in order to win; Peacetime CEO spends time defining the culture. Wartime CEO lets the war define the culture; Peacetime CEO strives for broad based buy in. Wartime CEO neither indulges consensus-building nor tolerates disagreements.” Horowitz concludes that an executive can be both a peacetime and a wartime CEO after mastering each skill set and learning when to shift from one mode to the other. The theory is interesting to consider: at its best, it provides an excellent framework for managing times of stress (like right now, with the coronavirus). At its worst, it encourages poor CEO behavior and cutthroat culture. While I do think it's a helpful theory, it's worth testing against possible exceptions. For example, let's consider Google, as Horowitz does in his original article.
He calls out that Google was likely entering a period of wartime in 2011 and as a result transitioned from peacetime CEO Eric Schmidt to Google founder and wartime CEO Larry Page. Looking back, however, was it really clear that Google was entering wartime? The business continued to focus on what it was clearly best at, online search advertising, and rarely faced real competition. The Company was late to invest in cloud technology, and many have criticized Google for pushing billions of dollars into incredibly unprofitable ventures because they were Larry and Sergey's pet projects. In addition, it's clear that control had been an issue for Larry all along: in 2011, it came out that Eric Schmidt's ouster as CEO stemmed from a disagreement with Larry and Sergey over continuing to operate in China. On top of that, it's argued that Larry and Sergey, who hold controlling votes in Google, stayed on too long and hindered Sundar Pichai's ability to effectively operate the restructured Alphabet holding company. In short, was Google at war from 2011 to 2019? I would argue no: it operated in its core market with virtually no competition, and today most of Google's revenue comes from its ad products. The peacetime / wartime designation is rarely so black and white, which is why it is so hard to recognize which period a company is in today.

  3. Firing people. The unfortunate reality of business is that not every hire works out and that eventually people will be fired. The Hard Thing About Hard Things is all about making difficult decisions. It lays out a framework for thinking about and executing layoffs, something rarely discussed in the startup ecosystem until it happens. Companies mess up layoffs all the time; just look at Bird, which recently laid off staff via an impersonal Zoom call. Horowitz lays out a roughly six-step process for enacting layoffs and shares hard truths from executing the 400-person layoff at LoudCloud. Two of the steps stand out because they are frequently violated at startups: don't delay, and train your managers. Too often, the decision to fire someone becomes a months-long process, continually drawn out and interrupted by different excuses. Horowitz encourages CEOs to move thoughtfully and quickly, to stem leaks of potential layoffs and to stop letting poor performers hurt the organization. The book discusses the Law of Crappy People: any level of any organization will eventually converge to the worst person on that level, benchmarked against the crappiest person at the next level. Once a CEO has made up her mind about firing someone, she should go for it. When executing layoffs, CEOs should train their managers, and the managers should deliver the news; this gives employees the opportunity to get direct feedback about what went well and what went poorly. This part of the book is incredibly important for entrepreneurs at all levels and provides a great starting place for CEOs.

Dig Deeper

  • Most drastic company pivots that worked out

  • Initial thoughts on the Opsware - HP Deal from 2007

  • A thorough history of HP’s ventures, spin-offs and acquisitions

  • Ben’s original blog post detailing the pivot from service provider to tech company

  • The First (1995-2001) and Second (2004-2017) Browser Wars

tags: Apple, IBM, VC, Google, HP, Packard's Law, Amazon, Android, Internet History, Marc Andreessen, Andreessen Horowitz, Loudcloud, Opsware, BMC Software, Mark Hurd, Javascript, Shopify, Slack, Netflix, Compaq, DEC, Micro Focus, DXC Technology, Carly Firoina, Leo Apotheker, Meg Whitman, WeWork, Casper, Larry Page, Eric Schmidt, Sundar Pichai, batch2
categories: Non-Fiction
 

February 2020 - How the Internet Happened: From Netscape to the iPhone by Brian McCullough

Brian McCullough, host of the Internet History Podcast, does an excellent job of showing how individuals adopted the internet and made it central to their lives. He follows not only the success stories but also the flameouts, which together provide an accurate history of a time of rapid technological change.

Tech Themes

  1. Form Factor: Design in Mobile Devices. Apple has a long history with mobile computing, but a few hiccups in the early days are rarely discussed. Those hiccups also telegraph something interesting about the technology industry as a whole: design and ease of use often trump features. In the early 90s, Apple created the Figaro, a tablet computer that weighed eight pounds and was navigated with a stylus. The problem was that it cost $8,000 to produce and was three-quarters of an inch thick, making it difficult to carry. In 1993, the Company launched the Newton MessagePad, which cost $699 and included a calendar, address book, to-do list, and notepad. But the form was wrong again: the MessagePad measured 7.24 in. by 4.5 in. and felt clunky. With this failure, Apple turned its attention away from mobile, allowing players like RIM, maker of the BlackBerry, to take leading market share. The BlackBerry pioneered the full keyboard on a small device, and Marc Benioff, CEO of salesforce.com, even called it “the heroin of mobile computing. I am serious. I had to stop.” IBM also tried its hand at mobile in 1992, creating the Simon Personal Communicator, which could send and receive calls, handle email and fax, and sync with work files via an adapter. Again, the problem was the design: 8 in. by 2.5 in. by 1.5 in. thick. It was a modern smartphone, but it was too big, clunky, and difficult to use. It wasn't until the iPhone, and then Android, that someone really nailed the full smartphone experience. The lessons from this case study offer a unique insight into the future of VR: the company that offers the correct form factor at a reasonable price can gain market share quickly, while those that try to pioneer too much at once (cough, Magic Leap) will struggle.

  2. How to know you're onto something. Facebook didn't know. On November 30, 2004, Facebook surpassed one million users after being live for only ten months. The growth was truly remarkable, but Mark Zuckerberg still didn't know Facebook was a special company. Sean Parker, the founder of Napster, had been mentoring Zuckerberg since the prior summer: “What was so bizarre about the way Facebook was unfolding at that point, is that Mark just didn’t totally believe in it and wanted to go and do all these other things.” Zuckerberg even showed up to a meeting at Sequoia Capital still dressed in his pajamas, with a PowerPoint entitled “The Top Ten Reasons You Should Not Invest.” While this was partly a joke (Sequoia had spurned investing in Parker's previous company), it showed how immature the whole Facebook operation was in the face of rapid growth. Facebook went on to release key features like groups, photos, and friending, but most importantly, it developed its revenue model: advertising. The quick user growth and rising ad revenue got the attention of big corporations: Viacom offered $2B in cash and stock, and Yahoo offered $1B, all cash. By then Zuckerberg realized what he had and famously spurned several offers from Yahoo, even after users reacted negatively to the most important feature Facebook would ever release, the News Feed. In today's world, we often see entrepreneurs overhyping their companies, which is why Silicon Valley was in love with dropout founders for a time; their naiveté and creativity could be harnessed to create something huge in a short amount of time.

  3. Channel Partnerships: why Apple was reluctant to launch a phone. Channel partnerships often go undiscussed at startups, but they can be incredibly useful in growing distribution. Some industries, such as the endpoint detection and response (EDR) market, thrive on channel partnership arrangements. Companies like CrowdStrike engage partners (mostly IT services firms) to sell on their behalf, lowering CrowdStrike's customer acquisition and sales spend. This can produce attractive unit economics, though on the flip side, partners must be paid and educated on the selling motion, which takes time and money. Other channel relationships are just overly complex. In the mid-2000s, mobile computing was a complicated industry, and companies hated dealing with old, legacy carriers and simple, clunky handset providers. Apple first tried working with a handset provider, Motorola, but together they produced the terrible ROKR, which barely worked. The ROKR was built to run on the struggling Cingular network (which would become AT&T), whose executives were eager to do a deal with Apple in hopes of boosting usage on their network. After the ROKR's failure, Cingular executives begged Jobs to build a phone for the network. Normally, the carriers dictated specifications for how phones were built for their networks, but Jobs ironed out a contract that exchanged network exclusivity for complete design control, and thus Apple entered mobile phones. The most important computing device of the 2000s and 2010s was built on a channel relationship.

Business Themes

  1. AOL-Time Warner: the merger destined to fail. To understand the AOL-Time Warner merger, you must first understand what AOL was, what it was becoming, and why it was operating on borrowed time. AOL started as an ISP, charging customers $9.95 for five hours of dial-up internet access, with each additional hour costing $2.95. McCullough describes AOL: “AOL has often been described as training wheels for the Internet. For millions of Americans, their aol.com address was their first experience with email, and thus their first introduction to the myriad ways that networked computing could change their lives.” AOL grew through one of the first viral marketing campaigns ever, putting CDs into newspapers so users could install AOL software and get online. The Company went public in March 1992, and by 1996 it had 2.1 million subscribers; however, subscribers were starting to flee to cheaper internet access. It turned out that building an ISP was relatively cheap, and the high-margin cash flow business AOL had built was suddenly threatened by a number of competitors. AOL persisted with its viral marketing strategy, and luckily many Americans had still not tried the internet and defaulted to AOL as the most popular option. AOL continued to add subscribers, and its stock price ballooned; in 1998 alone the stock rose 593%. AOL was also inking ridiculous, heavily VC-funded deals with new internet startups. Newly public Drkoop, which had raised $85M in an IPO, signed a four-year, $89M deal to be AOL's default provider of health content. Barnes & Noble paid $40M to be AOL's bookselling partner. Tel-Save, a long-distance phone provider, signed a deal worth $100M. As the internet bubble continued to grow, AOL's CEO, Steve Case, realized that many of these startups would be unable to fulfill their contractual obligations.
Early web traffic reporting systems could easily be gamed, and companies frequently had no business model other than attracting a certain demographic of traffic. By 1999, AOL had a market cap of $149.8B and was added to the S&P 500 index; it was bigger than both Disney and IBM. At the same time, the world was shifting from dial-up to modern broadband connections provided by cable companies. One AOL executive lamented: “We all knew we were living on borrowed time and had to buy something of substance by using that huge currency [AOL’s stock].” Time Warner was a massive media company, with movie studios, TV channels, magazines, and online properties. On January 10, 2000, AOL merged with Time Warner in one of the biggest mergers in history, with AOL owning 56% of the combined company. Four days later, the Dow peaked and began a downturn that would decimate hundreds of internet businesses built on foggy fundamentals. Acquisitions happen for a number of reasons, but imminent death is not one analysts or pundits normally consider. When you see an acquisition, read the press release and understand why (at least from a marketing perspective) the two companies made the deal. Was the price just astronomical (e.g., Instagram), or was there something deeply strategic (e.g., Microsoft-GitHub)? When you read the press release years later, it should indicate whether the market actually proved out the combination.
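AOL's original metered pricing is worth a quick sketch to show why heavy early users were so lucrative. This is a minimal illustration of the rate structure described above; the usage figures below are hypothetical:

```python
# AOL's early dial-up pricing, as described above: $9.95 covers the first
# five hours each month, then $2.95 per additional hour.
BASE_FEE = 9.95
INCLUDED_HOURS = 5
EXTRA_RATE = 2.95

def monthly_bill(hours_online: float) -> float:
    """Return the month's charge for a given number of hours online."""
    extra_hours = max(0.0, hours_online - INCLUDED_HOURS)
    return round(BASE_FEE + extra_hours * EXTRA_RATE, 2)

print(monthly_bill(5))   # 9.95 -- a light user stays on the base fee
print(monthly_bill(20))  # 54.2 -- 15 extra hours adds $44.25
```

A hypothetical user online 20 hours a month paid over five times the base fee, which is exactly the high-margin cash flow that cheaper flat-rate ISPs later attacked.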

  2. Acquisitions in the internet bubble: why acquisitions are really just guessing. AOL-Time Warner illustrates the central conundrum of acquisitions. HP founder David Packard captured part of this idea in Packard’s Law: “No company can consistently grow revenues faster than its ability to get enough of the right people to implement that growth and still become a great company. If a company consistently grows revenue faster than its ability to get enough of the right people to implement that growth, it will not simply stagnate; it will fall.” Jim Collins, author of Good to Great, clarified the idea: “Great companies are more likely to die of indigestion from too much opportunity than starvation from too little.” Acquisitions can be a significant cause of this outpacing of growth. Look no further than Yahoo, which acquired twelve companies between September 1997 and June 1999, including Mark Cuban’s Broadcast.com for $5.7B (per Kara Swisher at the WSJ in 1999), GeoCities for $3.6B, and Y Combinator founder Paul Graham’s Viaweb for $48M. Yahoo spent billions in stock and cash to acquire these companies! It’s only fitting that the two internet darlings would eventually end up in the hands of big-telecom Verizon, which acquired AOL for $4.4B in 2015 and Yahoo for $4.5B in 2017, only to write down their combined value by $4.6B in 2018. In 2013, Yahoo acquired Tumblr for $1.1B, only to sell it off this past year for $3M. Acquisitions can overwhelm companies, and frequently they don’t work out as planned. In essence, acquisitions are guesses about future value to customers, and rarely are they as clean and smart as technology executives make them seem. Some large organizations have gotten good at acquisitions: Google, Microsoft, Cisco, and Salesforce have all made meaningful ones (Android, GitHub, AppDynamics, and ExactTarget, respectively).

  3. Google and Excite: the acquisition that never happened. McCullough has an incredible quote nestled into the start of chapter six: “Pioneers of new technologies are rarely the ones who survive long enough to dominate their categories; often it is the copycat or follow-on names that are still with us to this day: Google, not AltaVista, in search; Facebook, not Friendster, in social networks.” Amazon obviously bucked this trend (he mentions that), but in search he is absolutely right. In 1996, several internet search companies went public, including Excite, Lycos, Infoseek, and Yahoo. As the internet bubble grew bigger, Yahoo was the darling of the day, and by 1998 it had amassed a $100B market cap. The market was crowded with the players mentioned above plus AltaVista, AskJeeves, MSN, and others; the world did not need another search engine. However, in 1998, Google founders Larry Page and Sergey Brin found a better way to do search (the PageRank algorithm) and published their famous paper: “The Anatomy of a Large-Scale Hypertextual Web Search Engine.” They then went to these massive search engines and tried to license their technology, but no one was interested. Imagine passing on Google’s search engine technology. Suffering from indigestion of too much opportunity, all of the search engines were trying to be like AOL and become portals to the internet, providing various services from their homepages. From an interview in 1998: “More than a ‘portal’ (the term analysts employ to describe Yahoo! and its rivals, which are most users’ gateway to the rest of the Internet), Yahoo! is looking increasingly like an online service--like America Online (AOL) or even CompuServe before the Web.” Small companies trying to do too much (cough, Uber self-driving cars, cough). Excite showed the most interest in Google’s technology, and Page offered it to the Company for $1.6M in cash and stock, but Excite countered at $750,000.
Excite had genuine interest in the technology, and a deal was still on the table until it became clear that Larry wanted Excite to rip out its own search technology and use Google’s instead. Unfortunately, that was too big a risk for the more mature Excite. The two companies parted ways, and Google eventually became the dominant player in the industry. Google’s focus was clear from the get-go: build a great search engine. Only when it was big enough did it plunge into acquisitions and the development of adjacent technologies.
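The core idea behind PageRank (a page is important if important pages link to it) can be sketched as a simple power-iteration loop. The following is an illustrative toy under common textbook conventions, not Google's production system; the graph shape, damping factor, and iteration count are arbitrary choices for the sketch:

```python
def pagerank(graph, damping=0.85, iterations=50):
    """Approximate PageRank for a directed graph by power iteration.

    graph: dict mapping each node to a list of nodes it links to.
    Dangling nodes (no outlinks) spread their rank uniformly.
    """
    # Collect every node, including link targets that have no outlinks.
    nodes = set(graph) | {t for targets in graph.values() for t in targets}
    n = len(nodes)
    rank = {node: 1.0 / n for node in nodes}

    for _ in range(iterations):
        # Every node gets a baseline "random surfer" share.
        new_rank = {node: (1.0 - damping) / n for node in nodes}
        dangling = 0.0
        for node in nodes:
            targets = graph.get(node, [])
            if targets:
                # A node passes its rank evenly to the pages it links to.
                share = rank[node] / len(targets)
                for t in targets:
                    new_rank[t] += damping * share
            else:
                dangling += rank[node]
        # Redistribute rank from dangling nodes uniformly.
        for node in nodes:
            new_rank[node] += damping * dangling / n
        rank = new_rank
    return rank
```

On a tiny graph where A links to B and C, B links to C, and C links back to A, node C ends up ranked highest because it collects links from both A and B; that intuition, applied to the whole web, was what the big portals passed on.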

Dig Deeper

  • Raymond Smith, former CEO of Bell Atlantic, describing the technology behind the internet in 1994

  • Bill Gates’ famous memo: THE INTERNET TIDAL WAVE (May 26, 1995)

  • The rise and fall of Netscape and Mosaic in one chart

  • List of all the companies made famous and infamous in the dot-com bubble

  • Pets.com S-1 (filing for IPO) showing a $62M net loss on $6M in revenue

  • Detail on Microsoft’s antitrust lawsuit

tags: Apple, IBM, Facebook, AT&T, Blackberry, Sequoia, VC, Sean Parker, Yahoo, Excite, Netscape, AOL, Time Warner, Google, Viaweb, Mark Cuban, HP, Packard's Law, Disney, Steve Case, Steve Jobs, Amazon, Drkoop, Android, Mark Zuckerberg, Crowdstrike, Motorola, Viacom, Napster, Salesforce, Marc Benioff, Internet, Internet History, batch2
categories: Non-Fiction
 

January 2020 - The Innovators by Walter Isaacson

Isaacson presents a comprehensive history of modern day technology, from Ada Lovelace to Larry Page. He weaves in intricate detail around the development of the computer, which provides the landscape on which all the major players of technological history wander.

Tech Themes

  1. Computing Before the Computer. In the summer of 1843, Ada Lovelace, daughter of the poet Lord Byron, wrote the first computer program, detailing a way of repeatedly computing Bernoulli numbers. Lovelace had been working with Charles Babbage, an English mathematician who had conceived of an Analytical Engine, which could be used as a general-purpose arithmetic logic unit. Originally, Babbage thought his machine would only be used for computing complex mathematical problems, but Ada had a bigger vision. Ada was well educated and artistic like her father. She knew that the general-purpose focus of the Analytical Engine could be an incredible new technology, even hypothesizing, “Supposing, for instance, that the fundamental relations of pitched sounds in the science of harmony and of musical composition were susceptible of such expression and adaptations, the engine might compose elaborate and scientific pieces of music of any degree of complexity.” 176 years later, in 2019, OpenAI released a deep neural network that produces four-minute musical compositions with ten different instruments.
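Lovelace's Note G laid out a table of engine operations for this task; as a hint of what her program computed, here is a minimal modern sketch using the standard Bernoulli recurrence with exact rational arithmetic. This is not her exact procedure, just the same mathematical object:

```python
from fractions import Fraction
from math import comb

def bernoulli(n):
    """Compute Bernoulli numbers B_0..B_n via the standard recurrence
    sum_{k=0}^{m} C(m+1, k) * B_k = 0 for m >= 1, with B_0 = 1.
    Under this convention B_1 = -1/2, and odd-index values beyond
    B_1 are zero. Exact rationals avoid any floating-point error.
    """
    B = [Fraction(0)] * (n + 1)
    B[0] = Fraction(1)
    for m in range(1, n + 1):
        # Solve the recurrence for B_m given B_0..B_{m-1}.
        acc = sum(Fraction(comb(m + 1, k)) * B[k] for k in range(m))
        B[m] = -acc / (m + 1)
    return B
```

Running `bernoulli(8)` reproduces the familiar sequence 1, -1/2, 1/6, 0, -1/30, ... that Lovelace's diagram was designed to generate step by step on Babbage's hardware.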

  2. The Government, Education, and Technology. Babbage had suggested using punch cards for computers, but Herman Hollerith, an employee of the U.S. Census Bureau, was the first to successfully implement them. Hollerith was angered that the decennial census took eight years to complete. With his new punch cards, designed to analyze combinations of traits, it took only one. In 1924, after a series of mergers, the company Hollerith founded became IBM. This was the first involvement of the US government with computers. Next came educational institutions, namely MIT, where by 1931 Vannevar Bush had built a Differential Analyzer, the world’s first analog electric computing machine. This machine would be copied by the U.S. Army, the University of Pennsylvania, Manchester University, and Cambridge University, and iterated on until the creation of the Electronic Numerical Integrator and Computer (ENIAC), which firmly established a digital future for computing machines. With World War II as a motivator, the invention of the computer was driven forward by academic institutions and the government.

Business Themes

  1. Massive Technological Change is Slow. Large technological change almost always feels sudden, but it rarely is. Often, new technological developments incubate in small communities, like the Homebrew Computer Club, where Steve Wozniak handed out mock-ups for the Apple computer, which was the first to map a keyboard to a screen for input. The development of the transistor (1947) preceded the creation of the microchip (1958) by eleven years. The general-purpose chip, a.k.a. the microprocessor, popped up thirteen years after that (1971), when Intel introduced the 4004 to the business world. This phenomenon was also true of the internet. Packet switching was first conceived in the early 1960s by Paul Baran while he was at the RAND Corporation. The Transmission Control Protocol and Internet Protocol followed roughly a decade later (1974), created by Vint Cerf and Bob Kahn. The HyperText Transfer Protocol (HTTP) and the HyperText Markup Language (HTML) arrived sixteen years after that, in 1990, from Tim Berners-Lee. The internet wasn’t in widespread use until after 2000. Introductions of new technologies often seem sudden, but they frequently build on technologies of the past and often involve a corresponding change that addresses the limiting factor of a previous technology. What does that mean for cloud computing, containers, and blockchain? We are probably earlier in the innovation cycle than we can imagine today. Business does not always lag the innovation cycle, but it is normally the ending point in a series of innovations.

  2. Teams are Everything. Revolution and change happen through the iteration of ideas in collaborative processes. History offers plenty of lessons on technology transformation: teams with diverse backgrounds, complementary styles, and a mix of visionary and operating capabilities executed best. As Isaacson notes: “Bell Labs was a classic example. In its long corridors in suburban New Jersey, there were theoretical physicists, experimentalists, material scientists, engineers, a few businessmen, and even some telephone pole climbers with grease under their fingernails.” Bell Labs created the first transistor, the semiconductor device that would become the foundation of Intel’s chips, where Bob Noyce and Gordon Moore (yes – Moore’s Law) would provide the vision, and Andy Grove would provide the focus.

Dig Deeper

  • Alan Turing and the Turing Machine

  • The Deal that Ruined IBM and Catapulted Microsoft

  • Grace Hopper and the First Compiler

  • ARPANET and the Birth of the Internet

tags: IBM, Microsoft, Moore's Law, Apple, Alan Turing, OpenAI, Cloud Computing, Bell Labs, Intel, MIT, Ada Lovelace, batch2
categories: Non-Fiction
 
