Tech Book of the Month

December 2021 - Trillion Dollar Coach: The Leadership Playbook of Silicon Valley's Bill Campbell by Eric Schmidt, Jonathan Rosenberg, and Alan Eagle

This month we read a book about the famous CEO and executive coach Bill Campbell. Bill had an unusual background for a Silicon Valley legend: he was a losing college football coach at Columbia. Despite a late start to his technology career, Bill’s timeless leadership principles and focus on people are helpful for any leader at any size of company.

Tech Themes

  1. Product First. After a short time at Kodak, Bill realized the criticality of supporting product and engineering. As a football coach, he was not intimately familiar with the intricacies of photographic film. Still, Campbell understood that the engineers ultimately determined the company's fate. After a few months at Kodak, Bill did something that no one else ever thought of - he went into the engineering lab and started talking to the engineers. He told them that Fuji was hot on Kodak's heels and that the company should try to make a new type of film that might thwart some competitive pressure. The engineers were excited to hear feedback on their products and learn more about other aspects of the business. After a few months of gestation, the engineering team produced a new type of film: "This was not how things worked at Kodak. Marketing guys didn't go talk to engineers, especially the engineers in the research lab. But Bill didn't know that, or if he did, he didn't particularly care. So he went over to the building that housed the labs, introduced himself around, and challenged them to come up with something better than Fuji's latest. That challenge helped start the ball rolling on the film that eventually launched as Kodacolor 200, a major product for Kodak and a film that was empirically better than Fuji's. Score one for the marketing guy and his team!" Campbell understood that product was the heart of any technology company, and he sought to empower product leaders whenever he had a chance.

  2. Silicon Valley Moments. Sometimes you look back at a person's career and wonder how they managed to be at the center of several critical points in tech history. Bill was a magnet for big moments. After six unsuccessful years as coach of Columbia's football team, Bill joined an ad agency and eventually made his way to the marketing department at Kodak. At the time, Kodak was a blockbuster success and lauded as one of the top companies in the world. However, the writing was on the wall: film was getting cheaper and cheaper, and digital was on the rise. After a few years, Bill was recruited to Apple by John Sculley. Bill joined in 1983 as VP of Marketing, just two years before Steve Jobs would famously leave the company. Bill was insistent that management try to keep Jobs. Steve would not forget his loyalty, and upon his return, Jobs named Campbell a director of Apple in 1997. Bill became CEO of Claris, an Apple software division that functioned as a separate company. In 1990, when Apple signaled it would not spin Claris off into a separate company, Bill left with the rest of management. After a stint at Intuit, Bill became a CEO coach to several Silicon Valley luminaries, including Eric Schmidt, Steve Jobs, Shellye Archambeau, Brad Smith, John Donahoe, Sheryl Sandberg, Jeff Bezos, and more. Bill helped recruit Sandberg and current CFO Ruth Porat to Google. Bill was a serial networker who stood at the center of Silicon Valley.

  3. Failure and Success. Following his departure from Claris/Apple, Bill founded Go Corporation, maker of one of the first mobile computers. The company raised a ton of venture capital for the time ($75m) before an eventual fire-sale to AT&T. The idea of a mobile computer was compelling, but the company faced stiff competition from Microsoft and Apple's Newton. Beyond competition, the original handheld devices lacked very basic features (easy internet, storage, network, and email capabilities) that would eventually be included in Apple's iPhone. Sales across the industry were a disappointment, and AT&T eventually shut down the acquired Go Corp. After the failure of Go Corporation, Bill was unsure what to do. John Doerr, the famous leader of Kleiner Perkins, introduced Bill to Intuit founder Scott Cook. Cook was considering retirement and looking for a replacement. Bill met with Cook, but Cook remained unimpressed. It was only after a second meeting, where Bill shared his philosophy on management and his focus on people, that Cook considered Campbell for the job. Bill joined Intuit as CEO and went on to lead the company until 1998, after which he became Chairman of the board, a position he held until 2016. Within a year of Campbell joining, Microsoft agreed to purchase the company for $1.5b. However, the Justice Department raised flags about the acquisition, and Microsoft called off the deal in 1995. Campbell continued to lead the company to almost $600M of revenue. When he retired from the board in 2016, the company was worth $30B.

Business Themes

  1. Your People Make You a Leader. Campbell believed that people were the most crucial ingredient in any successful business. Leadership, therefore, was of utmost importance to Bill. Campbell lived by a maxim passed on by former colleague Donna Dubinsky: "If you're a great manager, your people will make you a leader. They acclaim that, not you." In an exchange with a struggling leader, Bill added to this wisdom: "You have demanded respect, rather than having it accrue to you. You need to project humility, a selflessness, that projects that you care about the company and about people." The humility Campbell speaks about is what Jim Collins called Level 5 leadership (covered in our April 2020 book, Good to Great). Research has shown that humble leaders can build higher-performing teams with greater flexibility and better collaboration.

  2. Teams Need Coaches. Campbell loved to build community. Every year he would plan a trip to the Super Bowl, where he would find a bar and set down roots. He'd get to know the employees, and after a few days, he was a regular at the bar. He understood how important it was to build teams and establish a community that engendered trust and psychological safety. Every team needs a good coach, and Campbell understood how to motivate individuals, give authentic feedback, and handle interpersonal conflicts. "Bill Campbell was a coach of teams. He built them, shaped them, put the right players in the right positions (and removed the wrong players from the wrong positions), cheered them on, and kicked them in their collective butt when they were underperforming. He knew, as he often said, that 'you can't get anything done without a team.'" After a former colleague left to set up a new private equity firm, Bill checked out the website and called him up to tell him it sucked. As part of this feedback style, Bill always prioritized feedback in the moment: "An important component of providing candid feedback is not to wait. 'A coach coaches in the moment,' Scott Cook says. 'It's more real and more authentic, but so many leaders shy away from that.' Many managers wait until performance reviews to provide feedback, which is often too little, too late."

  3. Get the Little Things Right. Campbell understood that every interaction was a chance to connect, help, and coach. As a result, he thought deeply about maximizing the value of every meeting: "Bill took great care in preparing for one-on-one meetings. Remember, he believed the most important thing a manager does is to help people be more effective and to grow and develop, and the 1:1 is the best opportunity to accomplish that." Meetings with Campbell frequently started with family and life discussions and would move back and forth between business and the meaning of life - deep sessions that made people think, reconsider what they were doing, and come back energized for more. He was also not shy about addressing issues and problems: "There was one situation we had a few years ago where two different product leaders were arguing about which team should manage a particular group of products. For a while, this was treated as a technical discussion, where data and logic would eventually determine which way to go. But that didn't happen, the problem festered, and tensions rose. Who was in control? This is when Bill got involved. There had to be a difficult meeting where one exec would win and the other would lose. Bill made the meeting happen; he spotted a fundamental tension that was not getting resolved and forced the issue. He didn't have a clear opinion on how to resolve the matter, on which team the product belonged, he simply knew we had to decide one way or another, now. It was one of the most heated meetings we've had, but it had to happen." Bill extended this practice to email, where he perfected concise and effective team communication. On top of 1:1s, meetings, and emails, Campbell stayed on top of messages: "Later, when he was coach to people all over the valley, he spent evenings returning the calls of people who had left messages throughout the day. When you left Bill a voice mail, you always got a call back."
Bill was a master of communication and a coach to everyone he met.

Dig Deeper

  • Intuit founder Scott Cook on Bill Campbell

  • A Conversation between Brad Smith (Intuit CEO) and Bill Campbell

  • A Bill Campbell Reading List

  • Silicon Valley mourns its ‘coach,’ former Intuit CEO Bill Campbell

  • CHM Live | Trillion Dollar Coach: The Leadership Playbook of Silicon Valley’s Bill Campbell

tags: Intuit, Google, ServiceNow, Eric Schmidt, Jonathan Rosenberg, Alan Eagle, Columbia, Bill Campbell, Shellye Archambeau, John Donahoe, Jeff Bezos, Steve Jobs, Go Corporation, Football, Kodak, Fuji, Apple, Claris, Sheryl Sandberg, Brad Smith, Ruth Porat, AT&T, John Doerr, Microsoft, Donna Dubinsky, Jim Collins, Leadership
categories: Non-Fiction
 

November 2021 - Ender's Game by Orson Scott Card

This month we check out the futuristic sci-fi war drama, Ender’s Game. While the book is meant for kids, it’s a quick read and a great story.

Tech Themes

  1. The Metaverse. During Ender's time at Battle School, he interacts with the Mind Game, an individual game reflecting the thoughts and experiences of each person. Later, as he preps for battle, Ender uses a simulator to learn and practice commanding an army of battleships. These experiences in the simulator are completely personalized, driven by a supercomputer that can do whatever it wants to serve up experiences in the game: "You don't understand, sir. Our Battle School computer is only a part of the IF network. If we want a picture, we have to get a requisition, but if the mind game program determines that the picture is necessary--it can just go take it." These hyper-personalized mind game experiences are similar to the latest ideas surrounding the Metaverse. The Metaverse is an unclear vision of cyberspace where individuals can interact in virtual reality, mixed reality, or augmented reality in a new computing paradigm. Facebook was so excited about the Metaverse that the company announced it was going to invest $10B in building out its virtual reality platform and changed its name to Meta Platforms, Inc. Matthew Ball has covered the Metaverse since 2018 and has penned his own definition: "The Metaverse is a massively scaled and interoperable network of real-time rendered 3D virtual worlds which can be experienced synchronously and persistently by an effectively unlimited number of users with an individual sense of presence, and with continuity of data, such as identity, history, entitlements, objects, communications, and payments." This is reminiscent of the world we explored in our September 2019 book, Ready Player One, and somewhat similar to the 1990s promise of the information superhighway. It will be interesting to see how the Metaverse develops in the coming years.

  2. Anonymity on the Internet. As Ender continues his training at Battle School, Peter and Valentine hatch a plot to create division throughout the world. The two decide the best way to take over the world as young, intelligent children is to write blog posts under a pseudonym and gain a mass following, eventually exercising their political influence. To avoid suspicion, Peter and Valentine switch emotional positions and take on the roles of historical figures aligning with their viewpoints. Peter becomes John Locke, a liberal philosopher and social contract theorist, while Valentine becomes Demosthenes, an Athenian hellbent on inciting a war against Macedonia. While the idea of two teenage children starting a war by writing on the internet is comical now, the specter might have been possible in the pre-mass internet era of 1985, the year Ender's Game was published. This also raises the contentious shield of anonymity offered by the internet. While some argue that complete anonymity could mean the end of rational society, others say that anonymity must be preserved. This concept of anonymity is extended further in the over-hyped decentralized, crypto/web3 world of the future, where 15-word recovery phrases might become the norm for ultimate secrecy. Internet security and anonymity are likely to evolve if we move to a decentralized computing world - whether this is good or bad remains a matter of perspective.

  3. Technology and Governments. The International Force (IF) is a space army designed by the world to fight against the evil Buggers. The surprising thing about this International Force is how it unifies different governments: "Val, it was bound to happen. Right now there's a vast international fleet and army in existence, with American hegemony. When the bugger wars are over, all that power will vanish, because it's all built on fear of the buggers. And suddenly we'll look around and discover that all the old alliances are gone, dead and gone, except one, the Warsaw Pact." The Warsaw Pact was the agreement between the Soviet Union and several neighboring states following the creation of NATO. Funnily enough, the Warsaw Pact disbanded in 1991 with the fall of the Soviet Union, six years after the publication of Ender's Game. After Ender defeats the Buggers, the world immediately descends into political chaos until Peter comes to power. A once-unified world with incredible technology, like real-time communication through the Ansible, is now torn apart by politics. These events bring up the broader role of government in the technological landscape. As we saw earlier this year, global non-US tech superpowers like Bytedance (owner of Tiktok) can cause immense political tension. Furthermore, companies like Taiwan Semiconductor (TSMC), which offer a unique product in a politically contentious region, can even raise the specter of war. Technology enables globalization while also raising the question of who owns non-physical products - the government, a company, or the world?

Business Themes

  1. Lonely at the Top. Card paints a world where the entire universe's future lies on children's shoulders. Ender becomes commander of the International Fleet, put in the challenging position of leading older kids. He has to generate empathy while maintaining command. But Ender is just an intelligent child, and throughout the book, he finds himself in bad situations. He eventually grows to be the leader of his launch group and then the leader of his own Dragon Army. As Ender gains in stature, he loses touch with his friends. In one instance, he fears Battle School enemies might jump him in the hallway and chastises Petra when she asks him to chat: "'Petra, if you had actually taken me aside just now, there are about a dozen boys following along who would have taken me in the corridor. Can you tell me you didn't notice them?' Suddenly her face flushed. 'No. I didn't. How can you think I did? Don't you know who your friends are?'" Many CEOs describe the job as lonely because the CEO is naturally the final decision-maker. Even as a young child, Ender was forced to become a leader and suffered the mental strain of the job.

  2. Sending a Message. The IF chooses Ender because he is a mix of his two siblings: Peter, who represents extreme violence, and Valentine, who represents empathy. Violence is a recurring theme throughout the book - personal violence between individuals, violence between nations, and violence between civilizations (humans and buggers). In two dramatic sequences, older boys try to corner an unsuspecting Ender. Ender uses his brains to evade an attack but severely injures the attacker to send a message: "They were all wondering if he was dead. Ender, however, was trying to figure out a way to forestall vengeance. To keep them from taking him in a pack tomorrow. I have to win this now, and for all time, or I'll fight it every day and it will get worse and worse. Ender knew the unspoken rules of manly warfare, even though he was only six." Ender thought he needed to send a message to all of his potential attackers. However, these beatings weigh on him constantly, and he spends the rest of his life regretting them. The violent nature of these attacks is reprehensible and difficult to compare to the business world. But it does raise the question of how some executives act with emotion to humiliate or denigrate employees. Recently the CEO of online mortgage startup Better.com fired 900 people over a Zoom call. Beyond the act itself, the message it sends to employees is even worse. These events can follow executives, with media coverage continuing for over five years after the event itself. Actions send messages. They should be taken with caution when emotion or retaliation is involved.

  3. Self-Managed Teams. Ender is a tactical magician and completely changes the Battle Game. Ender's approach is novel: "He had the army drill in eight-man toon maneuvers and four-man half-toons, so that at a single command, his army could be assigned as many as ten separate maneuvers and carry them out at once. No army had ever fragmented itself like that before, but Ender was not planning to do anything that had been done before, either. Most armies practiced mass maneuvers, performed strategies. Ender had none. Instead, he trained his toon leaders to use their small units effectively in achieving limited goals. Unsupported, alone, on their own initiative." This approach is called Self-Managed Teams. The autonomy offered by allowing individuals to manage themselves gives extreme ownership to employees. Self-Managed teams work well in places with repeated work, where employees trust each other and have high self-awareness. This exciting concept has worked well in several businesses, including Facebook and Google.

Dig Deeper

  • Ender’s Game (the Movie)

  • Demosthenes and Locke - An Essay by Alyssa Rosenberg at the Atlantic

  • An Interview with Orson Scott Card

  • The Department of Defense is issuing AI ethics guidelines for tech

  • Peter and Valentine Wiggin in Ender’s Game

tags: Facebook, Microsoft, Ender, Meta, Metaverse, Ready Player One, John Locke, Demosthenes, Social Contract, web3, Crypto, NATO, Soviet Union, Bytedance, Tiktok, TSMC, Better.com, Self-Managed Teams
categories: Fiction
 

October 2021 - Unapologetically Ambitious by Shellye Archambeau

This month we hear the story of famous technology CEO Shellye Archambeau, former leader of GRC software provider MetricStream. Archambeau packs her memoir full of amazing stories and helpful career advice; the book is a must-read for any ambitious leader looking to break into Silicon Valley’s top ranks.

Tech Themes

  1. The Art of the Pivot. When Archambeau joined Zaplet in 2003 as its new CEO, she had a frank conversation with the chairman of the board, Vinod Khosla. She asked him one question: “You have a great reputation for supporting your companies, but you also have a reputation of being strong-willed and sometimes dominating. I just need to know before I answer [whether I will take the job], are you hiring me to implement your strategy, or are you hiring me to be the CEO?” Vinod responded: “I would be hiring you to be the CEO, to run the company, fully responsible and accountable.” With that answer, Archambeau accepted the job and achieved her life-long goal of becoming a CEO before age forty. Archambeau had just inherited a struggling former Silicon Valley darling that had raised over $100M but had failed to translate that money into meaningful sales. Zaplet’s highly configurable technology was a vital asset, but the company had not locked on to a real problem. Struggling to set a direction for the company, Archambeau spoke with board member Roger McNamee, who suggested pivoting into compliance software. In early 2004, Zaplet merged with compliance software provider MetricStream (taking its name), with Archambeau at the helm of the combined company. She wasn’t out of the woods yet. The 2008/09 financial crisis pushed MetricStream to the brink. With less than $2M in the bank, Archambeau ditched her salary, executed a layoff, and rallied her executive team through the financial crisis. As banks recapitalized, they sought new compliance and risk management platforms to avoid future issues, and MetricStream was well-positioned to serve this new set of highly engaged customers. Archambeau’s first and only CEO role lasted for 14 years, as she led MetricStream to $100M in revenue and 2,000+ employees.

  2. Taking Calculated Risks. Although Archambeau architected a successful turnaround, her career was not without challenges. After years of working her way up at IBM, Archambeau strategically chose to seek out a challenging international assignment, an essential staple of IBM’s CEOs. While working in Tokyo as VP and GM for Public Sector in Asia Pacific, Archambeau was not selected for a meeting with Lou Gerstner, IBM’s CEO. She put it bluntly: “I was ranked highly in terms of my performance - close to the top of the yearly ranking, not just in Japan, but globally. Yet I was pretty sure I wasn’t earning the salary many of my colleagues were getting.” It was then that Archambeau realized that she might need to leave IBM to achieve her goal of becoming CEO. She left IBM and became President of Blockbuster.com just as Blockbuster was beginning to compete with Netflix. Blockbuster was staunch in its dismissal of Netflix, refusing to buy the streaming company when it had a chance for a measly $50M. Archambeau was unhappy with management’s flippant attitude toward a legitimate threat and left Blockbuster’s Dallas HQ after only 9 months. After this difficult work experience, Archambeau sought out work in Silicon Valley, moving to the nation’s tech hub without her family. She became Head of Sales and Marketing for Northpoint Communications. The company was fighting a losing battle in DSL, and after a merger with Verizon fell through, the company went bankrupt. Then Archambeau became CMO of Loudcloud, Ben Horowitz’s early cloud company, covered in our March 2020 book, The Hard Thing About Hard Things. But things were already blowing up at Loudcloud, and after a year, Archambeau was looking for another role following the sale of Loudcloud’s services business to EDS. At 40 years old, Archambeau had completed international assignments, managed companies across technology, internet, and telecom, and seen several mergers and bankruptcies.
That experience laid the bedrock for her attitude: “After the dot-com bubble burst, I would need to double down and take greater risks, but - and this probably won’t surprise you - I had planned for this…It’s 2002, I’m almost forty, I’ve learned a great deal from Northpoint and Loudcloud, and I’m feeling ready for my chance to be a CEO.” Archambeau was always ready for the next challenge, unafraid of the risks posed - prepared to make her mark on the tech industry.

  3. Find the Current. Trends drive the tech industry, and finding and riding those trends can be hugely important to building a career. Archambeau saw the growing role of technology as an intern at IBM in the 1980s and knew the industry would thrive over time. As the internet and telecom took hold, she jumped into new and emerging businesses, unafraid of roadblocks. As she puts it: “Ultimately, when it comes to reaching your goals, the real skill lies in spotting the strongest current - in an organization, in an industry, even in the larger economy - and then positioning yourself so it propels you forward. Sail past the opportunities that lead you into the weeds and take the opportunities that will move you toward your goals.”

Business Themes

  1. The Power of Networking. One of Archambeau’s not-so-secret strategies for career success was networking. She is a people person and radiates energy in every conversation. Beyond this natural disposition, Archambeau took a very concerted and intentional approach toward building her network, and it shows. Archambeau crosses paths with Silicon Valley legends like Bill Campbell and Ben Horowitz throughout the book. Beyond one-to-one mentorship relationships, Archambeau joined several organizations to grow her network, including Watermark, the Committee of 200, ITSMF (the Information Technology Senior Management Forum), the Silicon Valley Leadership Group, and more. These groups offered a robust foundation and became a strong community, empowering and inspiring her to lead!

  2. Support and Tradeoffs. As a young college sophomore, Archambeau knew she wanted to be the breadwinner of the family. When she met her soon-to-be husband Scotty, a 38-year-old former NFL athlete, she was direct with him: “I would really like to be able to have someone stay home with the kids, especially when they are in school. But the thing is…I just don’t want it to be me.” Scotty thought patiently: “You know, Archambeau, I’ve had a lot of experiences in my life. I’ve had three different careers and you know I like working. But, I think I could see myself doing that, for you.” That was the icing on the cake. The two married and had two children while Archambeau worked up the ranks to become CEO. Scotty took care of the kids, Kethlyn and Kheaton, when Archambeau moved to Silicon Valley for work. She understood the tough tradeoff she was making and acknowledged that her relationship with her daughter felt more strained during Kethlyn’s teenage years. It raises the question: how comfortable are you with the tradeoffs you are making today? Moving to a new city to pursue a career that may strain family dynamics is never an easy decision. Family was always important to Archambeau, but it became front and center when Scotty was diagnosed with blood cancer in 2010. Although she was still CEO of MetricStream, things changed: “I had accumulated vacation days, I was putting off trips and experiences for ‘when the time was right’…We’re going to do things that we would have waited to do. We’re going to do them now.” Family and friends became a priority - they always were!

  3. Earning Respect. As a Black woman in technology, Archambeau had to overcome the odds repeatedly. She recounted: “As a young African American woman, I was accustomed to earning respect. Whenever I got a promotion or a new job, I walked into it understanding that people likely would assume I was not quite qualified or not quite ready. I presumed I needed to establish relationships and credibility, to develop a reputation, to prove myself.” While it is incredibly sad that Archambeau had to deal with this questioning, she learned how to use it to her advantage. As her family moved around the country, Archambeau faced repeated challenges: getting denied from taking advanced classes in school, getting bullied and beaten walking home from school, and starting high school with leg braces in a new city. Through these difficulties, she developed a simple methodology for getting through tough times: “Accept the circumstances, fake it ‘til you make it, control what you can, and trust that things will get better.” Archambeau took that mentality with her and earned the respect of the entire IBM Japan office when she presented her introduction slides entirely in Japanese to build trust with her new co-workers. It was the first time a foreign executive had done so. Archambeau’s ability to act boldly in the face of many obstacles is impressive.

Dig Deeper

  • Knowing Your Power | Shellye Archambeau | TEDxSonomaCounty

  • Spelman College Courageous Conversations - Shellye Archambeau

  • Shellye Archambeau: Becoming a CEO (A) - A Harvard Business School Case

  • MetricStream Raises $50M to Take on the GRC Market

tags: Metricstream, Zaplet, Shellye Archambeau, Vinod Khosla, Ben Horowitz, Loudcloud, Bill Campbell, GRC, Japan, Lou Gerstner, IBM, Blockbuster, Netflix, Silicon Valley, Silver Lake, Roger McNamee, Northpoint Communications, Verizon
categories: Non-Fiction
 

September 2021 - Super Mario: How Nintendo Conquered America by Jeff Ryan

This month we dive into the history of Nintendo and Super Mario, the lovable, super-smashing, tennis-playing, go-karting partier. Jeff Ryan’s book explores the history of Nintendo and the evolution of the video game industry into the console competition we have today.

Tech Themes

  1. Constraint Breeds Creativity. Sometimes nothing drums up creativity like having your back against the wall. This was the case with Nintendo. In 1980, Nintendo’s CEO Hiroshi Yamauchi sent his son-in-law, Minoru Arakawa, to Manhattan to launch Nintendo of America. The idea was to launch Nintendo into the large and growing US market for arcade cabinet games. Nintendo had developed a Space Invaders knock-off called Radar Scope to take the market by storm. However, it sold incredibly poorly, and months after moving to the US, Arakawa found himself with 2,000 large, unsold arcade cabinets and a disappointed father-in-law. Yamauchi scoured the company for interesting game ideas, not wanting the pre-made cabinets to go to waste, and found one from a young designer named Shigeru Miyamoto. Miyamoto drew inspiration from Popeye and King Kong to come up with Donkey Kong, a revolutionary “platform” style game that involved a character named Jumpman trying to save a damsel in distress, Pauline, from a giant evil gorilla. After coming up with this crazy concept game, Nintendo still had to re-work the original Radar Scope circuit boards. The boards were shipped from Nintendo’s Japanese headquarters to Manhattan, where Arakawa and his wife carefully removed the Radar Scope game and installed the new Donkey Kong game. Nintendo’s sales network convinced two bars in Seattle to pilot the game, and it took off like crazy; people played it 120 times per day, yielding $30 of profit to Nintendo every day. Jumpman would later become Mario, Donkey Kong would go on to become a staple character in Nintendo’s video gaming world, and all because of an epic failure and a distressed company.

  2. Cabinet, Console, and Competition. Staying relevant through technological shifts is hard. Nintendo successfully moved from arcade cabinets to the NES, the Game Boy, the Super Nintendo, the N64, the GameCube, the Wii, and now the Switch. At each stop, Nintendo tried hard to leverage all of the resources available in the hardware of the day. By purposefully maxing out its new hardware capabilities, Nintendo was able to build innovation into its games. As an example, Nintendo leveraged a special aspect of code in the NES to build Mario’s initial music theme. While Mario is a silent character, this created a new atmosphere for gamers. Later on, Nintendo would launch the N64 Rumble Pak, which provided haptic feedback through the controller based on gameplay. This became a staple concept for all consoles on the market. However, it wasn’t always fun and games. Nintendo’s missteps are single-handedly responsible for the creation of Sony’s PlayStation. In 1988, a Sony engineer began secretly developing a chip to help make CD-ROM games compatible with Nintendo’s console. Nintendo was interested in broadening its capabilities and signed a contract with Sony to produce an add-on device for the Super Nintendo Entertainment System (SNES). Although the two companies had signed a deal, it was clear that Nintendo would have to give up substantial control of the creative rights and hardware to Sony with the add-on. Yamauchi could not give Sony that much control, and in a historic change of direction at the 1991 CES, he went behind Sony’s back to partner with Sony’s biggest rival, Philips. However, Philips was not a super-strong development partner, and the SNES CD-ROM add-on was plagued with delays. Sony continued the development of a gaming system on its own, and Nintendo shifted priorities to its next console, the N64. Sony’s CD-ROM gaming system had a significant advantage over the N64’s cartridge-based system in that it allowed much easier, more consistent, open standards for developers.
Sony went to Square, one of Nintendo’s top game makers, and lured them over to produce its famous Final Fantasy series for the upcoming launch of the PlayStation in 1994. The PlayStation seized significant market share from Nintendo and established Sony in the gaming space. Nintendo’s decision to opt for control and proprietary formats in the N64 and GameCube helped avoid counterfeit games but left the market open to Sony’s PlayStation and consumers who wanted an all-in-one device (games, CDs, DVDs).

  3. Play the Long Game. Miyamoto had the idea for a three-dimensional Mario that would take advantage of all of the improvements in graphics rendering by the early 90s. While the idea gestated, Miyamoto tried to work out how game mechanics for 3D games could function. After serious thought and some development time in the early 1990s, he shelved the idea, feeling Nintendo would need a bigger controller with more buttons to fully realize the vision of a 3D Super Mario. After Nintendo and Miyamoto began development on Super Mario 64 in September 1994, they ran into delays caused by contrasting opinions on camera views and game layout. On top of this, Miyamoto had grander designs than Nintendo had time for, and several courses had to be scrapped to reach a working version. The game missed the 1995 holiday season and delayed the launch of the Nintendo 64 until April 1996. But because Nintendo had created such strong, single-player, free-roaming game mechanics, some of the unused levels could be put into Legend of Zelda: Ocarina of Time, which debuted in 1998. Sometimes it takes time for the world and technology to catch up to your ambitions.

Business Themes

[Image: Nintendo timeline]
[Image: Super Mario 64 box cover]
  1. An Intense Family Business. Nintendo was started in 1889 by Fusajiro Yamauchi to produce flower cards, a type of Japanese playing card. Despite significant trouble during the Russo-Japanese War (1904-05) and World War II, the company survived long enough for third-generation Hiroshi Yamauchi to take the reins in 1950. Over the next 20 years, Nintendo rode the wave of post-war card popularity to a 1963 IPO on the Osaka and Kyoto stock exchanges. By the late 1960s, however, appetite for cards had decreased, and Yamauchi went looking for a new market to support the company’s growth. In 1969, Gunpei Yokoi joined the company and set it off on a new trajectory developing simple electric toys. In the 1970s and ’80s, the company repositioned itself as a handheld, console, and cabinet video game producer, and it has since gone on to produce millions of games and systems. There is something amazing about a business that keeps finding its next S-curve of growth and somehow stays alive through multiple wars, products, and competitors.

  2. Counter-Positioning. Nintendo is famous for its numerous licensing deals, putting its characters on everything to build brand awareness and associations amongst consumers (Super Mario Mac & Cheese, anyone?). Nintendo leveraged its history of selling toys to children to create a strong brand of reputable characters rivaled only by the likes of Disney today. But because Nintendo focused on a family-friendly, younger customer base (no blood in games on the original Nintendo), it left some customers in the market unfulfilled. Enter SEGA and Sonic the Hedgehog. SEGA started out providing simple amusement games for military bases in the 1940s. The company launched its first video game in 1973, its first console in 1982, and created Sonic in 1991. Sonic was everything Mario was not - he was purpose-built for teenagers. As School of Game Design points out: “Just as the 19th century expressionists use shape and line to evoke emotional responses, character designers today use the shape of a character’s body to communicate the personality of a character to us. Mario is circular, he has a button nose, a pot belly, and his hands, feet, and head, are all round. Sonic’s design on the other hand is all jaggy triangles, he has spiky hair, pointy cat ears, ski goggle eyes, and torpedo shoes…Right out of the gate the personalities clash. Sonic has the image of a mischievous bad boy, while Mario is playful, and aloof.” This is a classic example of counter-positioning - directly occupying a competitive position in the industry that is the exact opposite of the incumbent’s. Sonic was the anti-Mario, and he helped SEGA launch its Genesis platform.

  3. The Video Game Recession & the Supply Chain Bullwhip. While Super Mario and Donkey Kong helped launch massive interest in video and arcade games, there were periods in the 1980s when people thought video games were just a fad. In 1983, the video game industry experienced a massive recession driven by a common supply-chain issue called the bullwhip effect. As explained in this simple video, the bullwhip effect occurs when a change in demand is amplified as it moves up the supply chain from customer to retailer to wholesaler to distributor to manufacturer. The effect causes massive forecasting errors and inventory build-up due to over-extrapolation of demand. In the late 1970s and early 1980s, video games were all the rage, driven by Atari’s Pong and Space Invaders. That attracted a flood of competition from Coleco, Mattel, and Philips. Everyone forecasted that market saturation was years away and that consumers would be itching for video game and cabinet systems for years to come. As a result, many video game companies over-ordered from their cartridge and console manufacturers. Once those companies had too much inventory on hand, they started discounting to try to sell more, but the market could only absorb so much. Unable to sell its systems, Atari famously buried some of its inventory in a New Mexico landfill. The effect can cause compounding losses: companies buy inventory at full (or sometimes above-full) price, sell games at cheaper prices due to market saturation, and often have to pay to warehouse or destroy the excess. The bullwhip effect is a crippling issue that companies like Peloton are facing today.
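The amplification mechanism is easy to see in a toy simulation. This is my own illustrative sketch - the tiers, numbers, and ordering rule are hypothetical, not data from the 1983 crash:

```python
# Toy bullwhip-effect simulation: each tier in the chain sees only the orders
# placed by the tier below it, and when orders jump it orders the new level
# plus extra "safety stock" for the perceived growth. That small over-reaction
# compounds at every step from retailer up to manufacturer.

def simulate_bullwhip(consumer_demand, n_tiers=4, safety_factor=0.5):
    """Return the peak order size observed at each tier of the chain."""
    orders_in = list(consumer_demand)
    peaks = []
    for _ in range(n_tiers):  # retailer -> wholesaler -> distributor -> manufacturer
        forecast = orders_in[0]
        orders_out = []
        for demand in orders_in:
            growth = demand - forecast            # perceived change in demand
            # Order the new level plus a buffer proportional to the growth.
            orders_out.append(demand + max(growth, 0) * (1 + safety_factor))
            forecast = demand
        peaks.append(max(orders_out))
        orders_in = orders_out                    # this tier's orders feed the tier above
    return peaks

# A single 20% bump in consumer demand (100 -> 120 units)...
print(simulate_bullwhip([100, 100, 120, 120, 120]))
# ...becomes a nearly 9x spike by the time it reaches the manufacturer:
# [150.0, 225.0, 412.5, 881.25]
```

Each tier’s peak order is larger than the one below it - exactly the whipcrack shape that left Atari and its peers sitting on warehouses of unsold inventory.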

Dig Deeper

  • There will Never Ever be another Melee player like Hungrybox - Documentary exploring Professional Super Smash Brothers Athletes

  • Super Mario Bros 30th Anniversary Special Interview with Shigeru Miyamoto and Takashi Tezuka

  • CRASH: The Year Video Games Died

  • The History of the Gameboy

  • The 10 Biggest Mistakes in Nintendo History

tags: Nintendo, Super Mario, Mario, Luigi, Hiroshi Yamauchi, Shigeru Miyamoto, Donkey Kong, Video Games, Jumpman, Wii, Switch, Gamecube, N64, NES, SNES, Zelda, Playstation, Phillips, CDs, DVDs, Disney, SEGA, Sonic, Genesis, Bullwhip Effect, Mattel, Coleco, Pong, Space Invaders, Minoru Arakawa, Gameboy
categories: Non-Fiction
 

August 2021 - Hit Refresh by Satya Nadella, with Greg Shaw and Jill Tracie Nichols

This month we look at how Satya Nadella reignited Microsoft’s fire and attacked new spaces with a growth mindset. The book is loaded with excellent management philosophy and complex Microsoft history.

Tech Themes

  1. Bing: The Other Search Engine. After starting at Microsoft as an engineer and rising through the ranks to lead Microsoft Dynamics (its CRM product), Nadella was handpicked to lead the re-launch of a brand new search engine, Microsoft Bing. Bing was one of Microsoft’s first “born-in-the-cloud” businesses, and Nadella quickly recognized four core areas of focus: distributed systems, consumer product design, the economics of two-sided marketplaces, and AI. Microsoft had a troubled history with search engines and wanted to go big quickly, submitting an offer to buy Yahoo for $45B in February of 2008. Microsoft was rebuffed, and thus Nadella found himself launching Search Checkpoint #1 in September of 2008 ahead of a June 2009 Bing launch. What are the odds that Microsoft’s future CEO would have early cloud, distributed systems, and advanced AI leadership experience? It was an almost prescient combination!

  2. Red Dog to Azure. Microsoft started working on the cloud two years after Amazon launched AWS. In 2008, veteran software architects Ray Ozzie and Dave Cutler created a secret team inside Microsoft known as Red Dog, focused on building a cloud infrastructure product. Red Dog was stationed under Microsoft’s Server and Tools business unit (STB), home to products such as Windows Server and Microsoft’s powerful RDBMS, SQL Server. In 2010, Microsoft CEO Steve Ballmer asked Nadella to lead STB and set the vision for its then single-digit-millions cloud infrastructure business. It was a precarious situation: “The server and tools business was at the peak of its commercial success and yet it was missing the future. The organization was deeply divided over the importance of the cloud business. There was constant tension between diverging forces.” How did Nadella resolve this tension? It was simple - he made choices and rallied his team around those decisions. He focused the team on hybrid cloud, data, and ML capabilities, where Microsoft could take advantage of its on-premise, large-enterprise heritage while providing an on-ramp for customers eager to make the shift to the cloud. Microsoft has since surged to an estimated 20% worldwide market share, making Azure one of the biggest and fastest-growing products in the world!

  3. Re-Mixed Reality. Microsoft’s gaming portfolio is impressive: Xbox, Mojang (aka Minecraft), and ZeniMax Media (maker of Fallout, Wolfenstein, and DOOM). Microsoft also makes the HoloLens, a mixed reality headset that competes with Facebook’s Oculus. Many believe future computing generations will take place in virtual, augmented, or mixed reality. Nadella doesn’t mince words - he believes the future lies not in virtual reality (as Facebook is betting) but in mixed reality, a combination of augmented reality (AR) and virtual reality (VR) in which the user experiences an augmented environment while still maintaining some semblance of the outside world. Nadella lays out the benefits: “HoloLens provides access to mixed reality in which the users can navigate both their current location - interact with people in the same room - and a remote environment while also manipulating holograms and other digital objects.” Virtual reality blocks out the outside world, which can be overwhelming and impractical, particularly for enterprise users of AR/VR/MR technologies. One of the big users of the HoloLens is the US Army, which recently signed a rumored $22B deal with Microsoft. It is still early days, but the future needs a new medium of computing, and it might just be mixed reality!

Business Themes

[Image: Microsoft and Linux]
  1. Leading with Empathy. Satya Nadella’s life changed with the birth of his son. “The arrival of our son, Zain, in August 1996 had been a watershed moment in Anu’s and my life together. His suffering from asphyxia in utero had changed our lives in ways we had not anticipated. We came to understand life as something that cannot always be solved in the manner we want. Instead, we had to learn to cope. When Zain came home from the intensive care unit, Anu internalized this understanding immediately. There were multiple therapies to be administered to him every day, not to mention quite a few surgeries he needed that called for strenuous follow-up care after nerve-racking ICU stays…My son’s condition requires that I draw daily upon the very same passion for ideas and empathy that I learned from my parents.” Nadella reiterates the importance of empathy throughout the book, and rightly so - empathy is viewed as the most important leadership skill, according to recent research. How does one increase empathy? It’s actually quite simple - talk to people! Satya understands this: “It is impossible to be an empathetic leader sitting in an office behind a computer screen all day. An empathetic leader needs to be out in the world, meeting people where they live, and seeing how the technology we create affects their daily activities.” Leadership requires empathy - hopefully, we see more of it from big technology soon!

  2. Frenemies. One of the first things Satya Nadella did after taking over the CEO role from Steve Ballmer in 2014 was reach out to Tim Cook. Apple and Microsoft have always had a love-hate relationship. In 1997, Microsoft saved Apple shortly after Steve Jobs returned, investing $150M so that Apple could stave off potential bankruptcy. In 2014, however, Nadella called on Apple: “I decided we needed to get Office everywhere, including iOS and Android…I wanted unambiguously to declare, both internally and externally, that the strategy would be to center our innovation agenda around users’ needs and not simply their device.” Microsoft had tried to become a phone company with Windows Mobile in 2000, tried again with Windows Phone in 2010, and tried even harder at Windows Phone in 2013 with a $7.2B acquisition of Nokia’s mobile phone unit. Although Nadella voted ‘No’ on the deal before becoming CEO, he was forced to manage the company through a total write-off of the acquisition and the elimination of eighteen thousand jobs. So how could Nadella catch up to the mobile wave? “For me, partnerships - particularly with competitors - have to be about strengthening a company’s core businesses, which ultimately centers on creating additional value for the customer…We have to face reality. When we have a great product like Bing, Office, or Cortana but someone else has created a strong market position with their service or device, we can’t just sit on the sidelines. We have to find smart ways to partner so that our products can become available on each other’s popular platforms.” Nobody knows platforms like Microsoft; Bill Gates wrote the definition of a platform: “A platform is when the economic value of everybody that uses it, exceeds the value of the company that creates it.” Nadella got over his predecessor’s worry about and hatred of the competition, bringing Microsoft’s software to other platforms to strengthen both companies’ positions.

  3. Regulation and Technology. Nadella devotes an entire chapter to the idea of trust in the digital age. Using three case studies - North Korea’s attack on Sony’s servers, Edward Snowden’s leaked documents (which were held on Microsoft’s servers), and the FBI’s lawsuit against Apple to unlock an iPhone that might contain criminal evidence - Nadella calls for increased(!) regulation, particularly around digital technology. Satya uses a simple equation for trust: “Empathy + Shared values + Safety and Reliability = Trust over time.” Don’t you love it when a company that the government sued over antitrust practices calls on the government to develop better laws! You’d love it even more if you saw how they used the same tactics to launch Microsoft Teams! Regulation in technology has been a hot topic recently, and Nadella is right to call on the government to create new laws for our digital world: “We do not believe that courts should seek to resolve issues of twenty-first-century technology relying on law that was written in the era of the adding machine.” He goes further to suggest potential remedies, including an efficient system for government access to corporate data, stronger privacy protections, globalized digital evidence sharing, and transparency of corporate and government data. I imagine the trend will be toward more regulation, especially with the passage of recent data laws like GDPR and CCPA, but I’m not sure we will see any real sweeping changes.

Dig Deeper

  • “Culture Eats Strategy for Breakfast” - How Satya Nadella Rebooted Microsoft

  • Satya Nadella Interview at Stanford Business School (2019)

  • Microsoft is Rolling out a New Framework to its Leaders - Business Insider

  • Satya Nadella email to employees on first day as CEO

  • HoloLens Mixed Reality Demonstration

tags: Microsoft, Satya Nadella, Apple, Tim Cook, Bing, Yahoo, Xbox, Minecraft, Facebook, Army, Mixed Reality, AR, VR, HoloLens, Oculus, Steve Jobs, Bill Gates, iOS, Android, Office, Sony, North Korea, FBI, Snowden, Empathy, Regulation, Privacy
categories: Non-Fiction
 

July 2021 - Genentech: The Beginnings of Biotech by Sally Smith Hughes

This month we dive into the birth of the biotech industry and learn about Genentech, a biotech company that was built on the back of novel recombinant DNA research in the 1970’s. The book covers most of the discovery and pre-IPO story of the company, weaving in commentary about political, social, and fundraising challenges the company faced.

Tech Themes

  1. Education & Profits. The biotech industry creates an interesting symbiotic relationship between universities and businesses. Genentech was founded by an out-of-work venture capitalist named Bob Swanson and an exuberant scientific genius named Herb Boyer. In 1973, Boyer and a colleague, Stan Cohen, had conceived of the idea of using restriction enzymes to cleave DNA fragments, allowing scientists to insert and express almost any gene in bacteria. In 1977-78, Boyer, Riggs, and Itakura showed that the recombinant DNA process could produce somatostatin and insulin. Because of the unbelievable economic potential of these findings, Stanford (where Cohen worked) and UCSF (where Boyer worked) decided to file a patent on the recombinant DNA procedure. The patent sparked a massive debate about the commercial use of the procedure, with several scientists, like Paul Berg, who chaired the National Academy of Sciences committee on recombinant DNA, calling for an investigation and formal rules. As Hughes notes, “The 1970s was notably inhospitable to professors forming consulting relationships with business, let alone taking the almost unheard-of step of founding a company without giving up a professorship.” Balancing these incentives - helping society, contributing biological research back to the world for free, and personal financial and celebrity gain - is hard. Many of the world’s leading researchers are motivated not only by deep investigative science but also by the notoriety of being published in the world’s leading journals. Today, several of the world’s leading AI researchers face a similar dilemma. In 2012, Geoff Hinton, a former University of Toronto professor, auctioned off his AI company, and his own services, in a bidding war between Google, Baidu, and Microsoft for a one-time £30M payout. Databricks, a big data company, recently raised money at a $38B valuation - its CEO, Ali Ghodsi, conceived of the idea for Databricks as a Ph.D. student at UC Berkeley, where he remains an adjunct professor. The twisted and complicated world of academia and corporations continues!

  2. IP. One of the big challenges of Genentech’s unique academic heritage was a massive intellectual property battle that lasted for years. In 1976, Bob Swanson set out to negotiate an exclusive license to the Boyer-Cohen patent from Stanford and UCSF. He was rebuffed by administrators trying to avoid the politically heated topic of recombinant DNA research. Things became even more complicated in 1978: at midnight on New Year’s Eve, soon-to-be Genentech employees Peter Seeburg and Axel Ullrich broke into their former UCSF lab to take research specimens related to contract research work they were performing for Genentech. In 1999, after years of patent disputes, Genentech finally settled the infringement case for $200M, one of the largest biotech settlements ever. With such enormous sums of money at stake, the questions of who owns an invention and how that invention may be used are hotly debated and contested - and pharmaceutical companies have seen larger and larger settlements since.

  3. Regulation & Action. An often forgotten driver of commercial industry change is regulation, perhaps because it is complicated and slow to develop, but its effects can be enormous. In 1983, in reaction to chronic under-investment in drugs serving small patient populations (“rare diseases”), the Department of Health and Human Services and the FDA helped enact the Orphan Drug Act of 1983. “That law, the Orphan Drug Act, provided financial incentives to attract industry’s interest through a seven-year period of market exclusivity for a drug approved to treat an orphan disease, even if it were not under patent, and tax credits of up to 50 percent for research and development expenses. In addition, FDA was authorized to designate drugs and biologics for orphan status (the first step to getting orphan development incentives), provide grants for clinical testing of orphan products, and offer assistance in how to frame protocols for investigations.” A further revision to the Act in 2002 defined a rare disease as one affecting a patient population of fewer than 200,000 people. Coupled with these incentives came the ability to price drugs aggressively under the exclusivity granted for performing the research that led to the drug’s discovery. Such exclusivity has led to much higher prices for rare disease drugs, causing anger from patients (and insurers) who must pay for these effective but high-priced drugs. Some economists have even studied the idea of “fairness” in orphan drug pricing - considering whether a rare disease drug that cures 90% of patients with the disease should be priced significantly higher than one that cures a smaller percentage of the population. These incentives have produced a massive influx of investment into the space, with 838 total orphan drug indications and 564 distinct drugs created to help patients with rare diseases.

Business Themes

[Image: Drug development failure and success rates]
[Image: New-product S-curves]
  1. Partnerships. The biotech industry thrives on partnerships, primarily because of the enormous cost of bringing a drug to market - a recent paper pinned the R&D costs alone at more than $1B. Beyond the cost of FDA Phase 1, 2, and 3 trials ($4M, $13M, and $20M at the median), companies often have to deal with many failures and re-directions along the way. On top of that, companies have to manufacture, sell, and market the drug to patient populations and physicians. Genentech was one of the first companies to establish partnerships with major pharmaceutical companies, and it considered many different partnerships for different parts of its drug pipeline (something still done today). In August of 1978, Genentech partnered with Kabi, a Swedish pharmaceutical manufacturer, to produce human growth hormone using the Genentech approach; the deal included a $1M upfront payment for exclusive foreign marketing rights. Three weeks later, Genentech partnered with Eli Lilly to make human insulin using the recombinant DNA approach - a twenty-year R&D contract with an upfront fee of $500,000 for exclusive worldwide rights to insulin; Genentech received 6% royalties and City of Hope (a research and treatment institution) received 2% of product sales. In January of 1980, Genentech signed a deal with Hoffmann-La Roche to collaborate on leukocyte and fibroblast interferon - a chemical then believed to be a potential cancer panacea. All of these deals were novel at the time but are commonplace today, with marketing, R&D, and royalty partnerships the norm in the biotech and pharmaceutical industries.

  2. The Perils and Beauty of R&D. Pharmaceutical and biotech companies face a very difficult challenge in bringing a drug to market. Beyond the costs detailed above, the success rate is so low that companies often need multiple scientific projects underway at once. The book details this challenge: “By the second quarter of 1979, the company had four new projects underway, all but one sponsored by a major corporation: Hoffman-La Roche on interferon; Monsanto on animal growth hormone; Institut Merieux on hepatitis B vaccine; and a Genentech fund project on the hormone thymosin.” This was all in addition to its Kabi and Eli Lilly deals! This brings up the idea of S-curves, whereby adoption of one product peaks and new products pick up to continue the growth of the organization. S-curves are common in all businesses and markets but especially difficult to predict in biotech and pharma, where drug development takes years, patents come and go, and the odds of any new drug succeeding are low. This is the double-sided challenge of big pharma, where companies debate internal R&D spending versus external M&A to drive new growth vectors on the company’s S-curve. It’s something Genentech is still trying to figure out today.

  3. A Silicon Valley Story. While the center of the biotech industry today is arguably Cambridge, MA, Genentech was an original Silicon Valley high-risk/high-reward bet. Genentech was funded by the historically great Kleiner Perkins - a Silicon Valley VC firm co-founded by Eugene Kleiner, one of the “traitorous eight” who left Shockley to start Fairchild Semiconductor. Kleiner was joined by Tom Perkins, who had worked at Hewlett-Packard in the 1960s and brought HP into the minicomputer business. As some of the earliest venture capitalists, with deep knowledge of the Silicon Valley semiconductor and technology boom, they hit big winners with Compaq, EA, Amazon, Sun Microsystems, and many others. Many of these investments were speculative at the time, and the team understood that more risk at the earlier stages meant more reward down the line. As Perkins put it: “Kleiner & Perkins realizes that an investment in Genentech is highly speculative, but we are in the business of making highly speculative investments.” After weeks of meetings with Swanson and a key meeting with Herb Boyer, Perkins took the plunge, leading a $100,000 seed investment in Genentech in May of 1976. Perkins commented: “I concluded that the experiment might not work, but at least they knew how to do the experiment.” Despite the work of raising billions of dollars for Genentech’s continually growing product and partnership pipeline, Perkins reflected years later on his involvement: “I can’t remember at what point it dawned on me that Genentech would probably be the most important deal of my life, in many terms - the returns, the social benefits, the excitement, the technical prowess, and the fun.” Perkins stayed on the board for 20 years, and Kleiner Perkins led several investments in the company over the years. Genentech was eventually acquired by Hoffmann-La Roche (now Roche), which bought 60% of the company for $2B in 1990 and the rest for $47B in 2009. Genentech was the first big biotech win and helped establish Silicon Valley’s cachet in the process!
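To make the Lilly insulin deal’s economics concrete, here is a back-of-the-envelope sketch. The $500,000 upfront fee and the 6%/2% royalty rates come from the book; the sales figure below is purely hypothetical for illustration:

```python
# Rough payout math for the 1978 Genentech / Eli Lilly insulin deal.
# Lilly got exclusive worldwide rights; Genentech and City of Hope got royalties.

UPFRONT_FEE = 500_000        # one-time fee paid to Genentech (from the book)
GENENTECH_ROYALTY = 0.06     # Genentech's royalty on product sales
CITY_OF_HOPE_ROYALTY = 0.02  # City of Hope's share of product sales

def deal_payouts(annual_sales, years=20):
    """Total payouts over the life of the twenty-year contract."""
    total_sales = annual_sales * years
    return {
        "genentech": UPFRONT_FEE + total_sales * GENENTECH_ROYALTY,
        "city_of_hope": total_sales * CITY_OF_HOPE_ROYALTY,
    }

# Hypothetical: $50M/year in insulin sales over the full contract
# works out to roughly $60.5M for Genentech and $20M for City of Hope.
print(deal_payouts(50_000_000))
```

Small as the royalty percentages look, over a twenty-year exclusive contract they dwarf the upfront fee - which is why royalty structure, not the headline payment, is where these partnerships are really negotiated.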

Dig Deeper

  • An Overview of Genetic Engineering (the tech underpinning Genentech)

  • The History of Insulin - 100 Years of Innovation by Dr. Daniel Drucker

  • How Drug Prices Work by the Wall Street Journal

  • How to Value Biotech Stocks by the Biotechnology Innovation Organization

  • Wonderful Life: An Interview with Herb Boyer

tags: Biotech, Genentech, Eli Lilly, Orphan Drug Act, Bob Swanson, Paul Berg, National Academy of Science, Stan Cohen, Herb Boyer, Stanford, UCSF, Geoff Hinton, Databricks, Ali Ghodsi, UC Berkeley, Pharma, FDA, Rare Disease
categories: Non-Fiction
 

June 2021 - Letters to the Nomad Partnership 2001-2013 (Nick Sleep's and Qais Zakaria's Investor Letters)

This month we review a unique source of information - the investment letters of mysterious fund manager Nick Sleep. Sleep had an extremely successful run and identified several very interesting companies, and characteristics of those companies, that made for great investments. He was early to uncover Amazon, Costco, and others, riding their stocks into the stratosphere over the last 20 years. These letters cover the internet bubble, the 08/09 crisis, and all types of interesting businesses across the world.

The full letters can be found here

Tech Themes

  1. Scale Benefits Shared. Nick Sleep’s favored business model is what he calls scale benefits shared. The idea is straightforward and appears across industries - Geico, Amazon, and Costco all run on it. It’s simple: companies start with low prices and spend only on the most important things. As the company scales (more insured drivers, more online orders, more stores), it passes the benefits of scale back to the customer in the form of even lower prices, and the customer buys more from the low-cost provider. This has a devastating effect on competition: it forces rivals out of the industry, because competing with a firm that shares its scale benefits requires becoming hyper-efficient just to keep the business model working. “In the case of Costco scale efficiency gains are passed back to the consumer in order to drive further revenue growth. That way customers at one of the first Costco stores (outside Seattle) benefit from the firm’s expansion (into say Ohio) as they also gain from the decline in supplier prices. This keeps the old stores growing too. The point is that having shared the cost savings, the customer reciprocates, with the result that revenues per foot of retailing space at Costco exceed that at the next highest rival (WalMart’s Sam’s Club) by about fifty percent.” Jeff Bezos was also very focused on this; his 2006 annual letter highlighted as much: “Our judgment is that relentlessly returning efficiency improvements and scale economies to customers in the form of lower prices creates a virtuous cycle that leads over the long-term to a much larger dollar amount of free cash flow, and thereby to a much more valuable Amazon.com. We have made similar judgments around Free Super Saver Shipping and Amazon Prime, both of which are expensive in the short term and – we believe – important and valuable in the long term.” So which companies today are sharing scale efficiencies with customers? One recent example is Snowflake - a very expensive solution, but one at least posturing correctly in favor of this model: a recent earnings call highlighted that the company had figured out a better way to store data, resulting in a storage price decrease for customers. Fivetran’s recent cloud data warehouse comparison showed Snowflake was both cheaper and faster than competitors Redshift and BigQuery - a good spot to be in! Another example might be Cloudflare - it is lower cost than any other CDN in the market and has millions of free customers. Improvements made to the core security-plus-CDN engine, threat graph, and POP locations result in better performance for all of those free users, which leads to more free users, more observed threats and vulnerabilities, and more location/network demands - a very virtuous cycle!

  2. The Miracle of Compound Growth & Its Obviousness. While appreciated in some circles, compounding is revered by Warren Buffett and Nick Sleep - to them it’s a miracle worth celebrating every day. Sleep takes the idea one step further after discussing how the average holding period of stocks has fallen significantly over the past few decades: “The fund management industry has it that owning shares for a long time is futile as the future is unknowable and what is known is discounted. We respectfully disagree. Indeed, the evidence may suggest that investors rarely appropriately value truly great companies.” This is quite a natural phenomenon, too - when Google IPO’d in 2004 at a whopping $23Bn valuation, were investors really valuing the company appropriately? Were Visa ($18Bn raised, the largest US IPO in history) and Mastercard ($5.3Bn valuation) being valued appropriately? Even a giant like Apple, valued at $600Bn in 2016, was arguably not valued appropriately. Hindsight is obvious, but the durability of compounding in great businesses is truly a marvel to behold. That’s why Sleep and Zakaria wound down the partnership in 2014, opting to return LP money and own only Berkshire, Costco, and Amazon for the next decade (so far, that’s been a great decision!). While frequently cited as a key investing principle, compounding in technology, experiences, art, and life is rarely discussed, maybe because it is too obvious. Examples of compounding (re-investing interest/dividends and waiting) abound: Moore’s Law, Picasso’s art training, Satya Nadella’s experience running Bing and Azure before becoming CEO, and the Beatles playing clubs for years before breaking onto the scene. Compounding is a universal law that applies to so much!

  3. Information Overload. Sleep makes a very important but subtle point toward the end of his letters about the importance of reflective thinking:

    BBC Interviewer: “David Attenborough, you visited the North and South Poles, you witnessed all of life in-between from the canopies of the tropical rainforest to giant earthworms in Australia, it must be true, must it not, and it is a quite staggering thought, that you have seen more of the world than anybody else who has ever lived?”

    David Attenborough: “Well…I suppose so…but then on the other hand it is fairly salutary to remember that perhaps the greatest naturalist that ever lived and had more effect on our thinking than anybody, Charles Darwin, only spent four years travelling and the rest of the time thinking.”

    Sleep: “Oh! David Attenborough’s modesty is delightful but notice also, if you will, the model of behaviour he observed in Charles Darwin: study intensely, go away, and really think.”

    There is no doubt that the information age has ushered in a new normal for daily data flow and news. New information is constant and people have the ability to be up to date on everything, all the time. While there are benefits to an always-on world, the pace of information flow can be overwhelming and cause companies and individuals to lose sight of important strategic decisions. Bill Gates famously took a “think week” each year where he would lock himself in a cabin with no internet connection and scan over hundreds of investment proposals from Microsoft employees. A Harvard study showed that reflection can even improve job performance. Sometimes the constant data flow can be a distraction from what might be a very obvious decision given a set of circumstances. Remember to take some time to think!


Business Themes

  1. Psychological Mistakes. Sleep touches on several psychological problems and challenges within investing and business, including the role of Social Proof in decision making. Social proof occurs when individuals look to others to determine how to behave in a given situation. A classic example comes from an experiment by psychologist Stanley Milgram, in which he had groups of people stare up at the sky on a crowded street corner in New York City. When five people stood looking up (as opposed to a single person), many more passersby also stopped to look up, driven by the group behavior. This principle shows up all the time in business and is a major contributor to financial bubbles. People see others making successful investments at high valuations, and that drives them to do the same. It can also drive product and strategic decisions - companies adding dot-com names in the 90’s to drive their stock prices up, companies launching corporate venture arms in rising markets, companies today deciding they need a down-market “product-led growth” engine. As famed investor Stan Druckenmiller notes, it’s hard to sit idly by while others (who may be less informed) crush certain types of investments: “I bought $6 billion worth of tech stocks, and in six weeks I had lost $3 billion in that one play. You asked me what I learned. I didn’t learn anything. I already knew that I wasn’t supposed to do that. I was just an emotional basket case and I couldn’t help myself. So maybe I learned not to do it again, but I already knew that.”

  2. Incentives, Psychology, and Ownership Mindset. Incentives are incredibly powerful in business, and it’s surprisingly difficult to get people to do the right thing. Sleep spends a lot of time on incentives and the so-called Principal-Agent Problem. Often, a Principal (owner, boss, purchaser, etc.) employs an Agent (employee, contractor, service provider) to accomplish something, but the goals and priorities of the principal may not align with those of the agent. As an example, when your car breaks down and you take it to a local mechanic, you (the principal) want someone to fix the car as well and as cheaply as possible. The agent (the mechanic), however, may be incentivized to create the biggest bill possible to drive business for their garage. Here we see the potential for misaligned incentives. After 5 years of really strong investment results, Sleep and Zakaria noticed a misaligned incentive of their own: “Which brings me to the subject of the existing performance fee. Eagle-eyed investors will not have failed to notice the near 200 basis point difference between gross and net performance this year, reflecting the performance fee earned. We are in this position because performance for all investors is in excess of 6% per annum compounded. But given historic performance, that may be the case for a very long time. Indeed, we are so far ahead of the hurdle that if the Partnership now earned pass-book rates of return, say 5% per annum, we would continue to “earn” 20% performance fees (1% of assets) for thirty years, that is, until the hurdle caught up with actual results. During those thirty years, which would see me through to retirement, we would have added no value over the money market rates you can earn yourself, but we would still have been paid a “performance fee”. We are only in this position because we have done so well, and one could argue that contractually we have earned the right by dint of performance, but just look at the conflicts!” They could have invested in treasury bonds and collected a performance fee for years to come, but they knew that was unfair to limited partners. So the duo created a resetting fee structure that allowed LPs to claw back performance fees if Nomad did not exceed the 6% hurdle rate in a given year. This kept the pair focused on driving continued strong results through the life of the partnership.
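Sleep’s thirty-year figure can be sanity-checked with a quick simulation. The 33% head start over the hurdle is my own back-of-envelope assumption (Nomad’s actual lead isn’t stated here); the 5% return and 6% hurdle are the numbers he quotes:

```python
# Sketch of the conflict Sleep describes: a fund far ahead of its hurdle keeps
# "earning" performance fees even while returning only money-market rates.
# The 33% lead is an illustrative assumption, not Nomad's actual figure.
def years_until_hurdle_catches_up(lead: float, fund_rate: float, hurdle_rate: float) -> int:
    """Years of sub-hurdle returns before cumulative performance falls back to the hurdle."""
    years = 0
    fund_value, hurdle_value = 1.0 + lead, 1.0
    while fund_value > hurdle_value:
        fund_value *= 1 + fund_rate      # fund compounds at the lower rate
        hurdle_value *= 1 + hurdle_rate  # hurdle compounds at 6% regardless
        years += 1
    return years

# Roughly thirty years of fees for adding no value over the hurdle.
print(years_until_hurdle_catches_up(lead=0.33, fund_rate=0.05, hurdle_rate=0.06))
```

A lead of only a third over the cumulative hurdle is enough to generate about three decades of "performance" fees at passbook rates - exactly the conflict the resetting fee structure was built to remove.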

  3. Discovery & Pace. Nick Sleep and Qais Zakaria looked for interesting companies in interesting situations. Their pace is simply astounding: “When Zak and I trawled through the detritus of the stock market these last eighteen months (around a thousand annual reports read and three hundred companies interviewed)…” Sleep and Zakaria put up numbers: 55 annual reports per month (~2 per day) and 17 company interviews per month (a meeting every other day)! That is so much reading. It’s hardly surprising that after a while they could spot things in annual reports that piqued their interest. Not only did they find retrospectively obvious gems like Amazon and Costco, they also looked all around the world for mispricings and interesting opportunities. One of their successful international investments took place in Zimbabwe, where they noticed a significant mispricing involving the Harare Stock Exchange, which opened in 1896 but only began allowing foreign investment in 1993. While Nomad certainly made its name on the scale economies shared investment model, Zimbabwe gave Sleep and Zakaria a chance to deploy their second model: “We have little more than a handful of distinct investment models, which overlap to some extent, and Zimcem is a good example of a second model namely, ‘deep discount to replacement cost with latent pricing power.’” Zimcem was the country’s second-largest cement producer, and it traded at a massive discount to replacement cost due to terrible business conditions (inflation growing faster than the price of cement). Not only did Sleep find a weird, mispriced asset, he also employed a unique way of acquiring shares that further increased his margin of safety. “The official exchange rate at the time of writing is Z$9,100 to the U$1. The unofficial, street rate is around Z$17,000 to the U$1. In other words, the Central Bank values its own currency at over twice the price set by the public with the effect that money entering the country via the Central Bank buys approximately half as much as at the street rate. Fortunately, there is an alternative to the Central Bank for foreign investors, which is to purchase Old Mutual shares in Johannesburg, re-register the same shares in Harare and then sell the shares in Harare. This we have done.” By doing this, Nomad was able to purchase shares at a discounted effective exchange rate (they would also face the exchange rate on sale, so the margin of safety was not entirely increased). Weird, off-the-beaten-path investments and companies can offer rich rewards to the patient. This was the approach Warren Buffett employed early in his career, until he started focusing on “wonderful businesses” at Charlie Munger’s recommendation.
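A quick calculation with the two rates Sleep quotes shows why routing money in via dual-listed Old Mutual shares mattered so much:

```python
# The figures are the ones Sleep quotes in the letter.
official_rate = 9_100   # Zimbabwe dollars per US$1 via the Central Bank
street_rate = 17_000    # Zimbabwe dollars per US$1 at the unofficial street rate

# A dollar converted at the official rate buys only ~54% of what it buys
# at the street rate - roughly "half as much," as Sleep says.
print(f"{official_rate / street_rate:.0%}")  # → 54%
```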

Dig Deeper

  • Overview of Several Scale Economies Shared Businesses

  • Investor Masterclass Learnings from Nick Sleep

  • Warren Buffett & Berkshire’s Compounding

  • Jim Sinegal (Costco Founder / CEO) - Provost Lecture Series Spring 2017

  • Robert Cialdini - Mastering the Seven Principles of Influence and Persuasion

tags: Costco, Warren Buffett, Berkshire Hathaway, Geico, Jim Sinegal, Cloudflare, Snowflake, Visa, Mastercard, Google, Fivetran, Walmart, Apple, Azure, Bing, Satya Nadella, Beatles, Picasso, Moore's Law, David Attenborough, Nick Sleep, Qais Zakaria, Charles Darwin, Bill Gates, Microsoft, Stanley Druckenmiller, Charlie Munger, Zimbabwe, Harare
categories: Non-Fiction
 

May 2021 - Crossing the Chasm by Geoffrey Moore

This month we take a look at a classic high-tech growth marketing book. Originally published in 1991, Crossing the Chasm became a beloved book within the tech industry, although its glory seems to have faded over the years. While the book is often overly prescriptive in its suggestions, it provides several useful frameworks for addressing growth challenges, primarily early in a company’s history.

Tech Themes

  1. Technology Adoption Life Cycle. The core framework of the book describes how new technology gets adopted. It is an interesting micro-view of the broader phenomenon described in Carlota Perez’s Technological Revolutions. In Moore’s chasm-crossing world, five personas dominate adoption: innovators, early adopters, early majority, late majority, and laggards. Innovators are technologists, happy to accept rough user experiences to push the boundaries of their capabilities and knowledge. Early adopters are intuitive buyers who enjoy trying new technologies but want a slightly better experience. The early majority are “wait and see” folks who want others to battle-test the technology before trying it, but don’t typically wait too long before buying. The late majority want significant reference material and usage before buying a product. Laggards simply don’t want anything to do with new technology. It is interesting to think of this adoption pattern in concert with the big technology migrations of the past twenty years: mainframes to on-premise servers to cloud computing, home phones to cell phones to iPhone/Android, radio to CDs to downloadable music to Spotify, and cash to checks to credit/debit to mobile payments. Each of these massive migration patterns feels very aligned with this adoption model. Everyone knows someone ready to apply the latest tech, and someone who doesn’t want anything to do with it (Warren Buffett!).
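For reference, the bell curve Moore adapts comes from Everett Rogers’ diffusion-of-innovations research, with standard textbook segment shares; the snippet below just tallies them to show how little of the market sits before the chasm:

```python
# Rogers' standard diffusion-of-innovations segment shares, which Moore's
# bell curve follows (textbook figures, not derived in Moore's book).
segments = {
    "innovators": 0.025,
    "early adopters": 0.135,
    "early majority": 0.34,
    "late majority": 0.34,
    "laggards": 0.16,
}
assert abs(sum(segments.values()) - 1.0) < 1e-9  # shares cover the whole market

# Everything left of the chasm - innovators plus early adopters - is tiny.
print(f"{segments['innovators'] + segments['early adopters']:.0%}")  # → 16%
```

Only about 16% of the market sits before the chasm, which is why failing to cross it leaves the other 84% permanently out of reach.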

  2. Crossing the Chasm. If we accept the above as a general way products are adopted by society (obviously it’s much more of a mishmash in reality), we can posit that the most important step is from the early adopters to the early majority - the spot where the bell curve really opens up. This is what Geoffrey Moore calls crossing the chasm. The idea is highly reminiscent of Clay Christensen’s “not good enough” disruption pattern and Gartner’s technology hype cycle. The examples Moore uses (in 1991) are also striking: neural networking software and desktop video conferencing. Moore lamented: “With each of these exciting, functional technologies it has been possible to establish a working system and to get innovators to adopt it. But it has not as yet been possible to carry that success over to the early adopters.” Both of these technologies have clearly crossed into the mainstream with Google’s TensorFlow machine learning library and video conferencing tools like Zoom that make it super easy to speak with anyone over video instantly. So what was the great unlock that made these technologies commercially viable and successfully adopted? Since 1990 there have been major changes in several important underlying technologies: computer storage and data processing capabilities are almost limitless with cloud computing, network bandwidth has grown exponentially while costs have dropped, and software has greatly improved the ability to build great user experiences for customers. This is a version of not-good-enough technologies benefiting substantially from changes in underlying inputs. The systems you could deploy in 1990 were simply not comparable to what you can deploy today. The real question is: are there different types of adoption curves for different technologies, and do they really follow a normal distribution as Moore shows here?

  3. Making Markets & Product Alternatives. Moore positions the book as if you were a marketing executive at a high-tech company and offers several exercises to help you identify a target market, customer, and use case. Chapter six, “Define the Battle,” covers the best way to position a product within a target market. In early markets, competition comes from non-consumption, and the company has to offer a “Whole Product” that enables the user to actually derive benefit from the product. Thus, Moore recommends targeting innovators and early adopters - technologist visionaries able to see the benefit of the product. This also mirrors Clayton Christensen’s commoditization/de-commoditization framework, where new-market products must offer all of the core components of a system combined into one solution; over time, the axis of commoditization shifts toward the underlying components as companies differentiate by using faster and better sub-components. Positioning in these early-market scenarios should focus on the contrast between your product and legacy ways of performing the task (use our software instead of pen and paper, for example). In mainstream markets, companies should position their products within the established buying criteria developed by pragmatist buyers. A market alternative serves as the incumbent, well-known provider, while a product alternative is an upstart competitor harnessing the same new technology that you are clearly beating. It may seem counter-intuitive to constantly refer to competitors as alternatives to your product, but enterprise buyers obviously have alternatives they are considering, and you need to make the case that your solution is the best. Choosing a market alternative lets you tap a budget previously used for a similar solution, and the product alternative helps differentiate your technology relative to other upstarts.
Moore’s simple positioning formula has helped hundreds of companies establish their go-to-market message: “For (target customers—beachhead segment only) • Who are dissatisfied with (the current market alternative) • Our product is a (new product category) • That provides (key problem-solving capability). • Unlike (the product alternative), • We have assembled (key whole product features for your specific application).”

Business Themes

[Image: the Whole Product Concept - Philip Kotler’s five product levels]
  1. What happened to these examples? Moore offers a number of examples of crossing the chasm, but what actually happened to these companies after the book was written? Clarify Software was bought by Nortel in October 1999 for $2.1Bn (a 16x revenue multiple) and then divested by Nortel to Amdocs in October 2001 for $200M - an epic disaster of capital allocation. Documentum was acquired by EMC in 2003 for $1.7Bn in stock and was later sold to OpenText in 2017 for $1.6Bn. The 3Com Palm Pilot was a mess of acquisitions and divestitures: Palm was acquired by U.S. Robotics, which was acquired by 3Com in 1997, and Palm was then spun out in a 2000 IPO that eventually saw a 94% drop. Palm stopped making PDA devices in 2008, and in 2010 HP acquired Palm for $1.2Bn in cash. Smartcard maker Gemplus merged with competitor Axalto in a €1.8Bn deal in 2005, creating Gemalto, which was later acquired by Thales in 2019 for $8.4Bn. So my three questions are: Did these companies really cross the chasm, or were they just readily available success stories of their time? Do you need to be the company that leads the chasm crossing, or can someone else do it to your benefit? And what is the next step in the journey after the chasm is crossed - why did so many of these companies fail after a time?

  2. Whole Products. Moore leans into an idea called the Whole Product Concept, which was popularized by Theodore Levitt’s 1983 book The Marketing Imagination and Bill Davidow’s (of early VC Mohr Davidow) 1986 book Marketing High Technology. Moore explains the idea: “The concept is very straightforward: There is a gap between the marketing promise made to the customer—the compelling value proposition—and the ability of the shipped product to fulfill that promise. For that gap to be overcome, the product must be augmented by a variety of services and ancillary products to become the whole product.” There are four different perceptions of the product: “1. Generic product: This is what is shipped in the box and what is covered by the purchasing contract. 2. Expected product: This is the product that the consumer thought she was buying when she bought the generic product. It is the minimum configuration of products and services necessary to have any chance of achieving the buying objective. For example, people who are buying personal computers for the first time expect to get a monitor with their purchase—how else could you use the computer?—but in fact, in most cases, it is not part of the generic product. 3. Augmented product: This is the product fleshed out to provide the maximum chance of achieving the buying objective. In the case of a personal computer, this would include a variety of products, such as software, a hard disk drive, and a printer, as well as a variety of services, such as a customer hotline, advanced training, and readily accessible service centers. 4. Potential product: This represents the product’s room for growth as more and more ancillary products come on the market and as customer-specific enhancements to the system are made.
These are the product features, whether expected or additional, that may drive adoption.” Moore makes a subtle point that after a while, investments in generic/out-of-the-box product functionality drive less and less purchase behavior, in tandem with broader market adoption. Customers want to be wooed by the latest technology, and as products become similar, customers care less about what’s in the product today and more about what’s coming. Moore emphasizes Whole Product Planning, where you map out how to get those additional features into the product over time - but Moore was also operating in an era when product decisions and development processes ran on two-year-plus timelines, not in today’s DevOps era, where product updates are pushed daily in some cases. In the bottoms-up/DevOps era, it’s become clear that finding your niche users, driving strong adoption from them, and integrating their feature ideas as soon as possible can yield a big success.

  3. Distribution Channels. Moore walks through each of the potential ways a company can distribute its solutions: direct sales, two-tier retail, one-tier retail, internet retail, two-tier value-added reselling, national roll-ups, original equipment manufacturers (OEMs), and system integrators. As Moore puts it, “The number-one corporate objective, when crossing the chasm, is to secure a channel into the mainstream market with which the pragmatist customer will be comfortable.” These distribution types are clearly relics of technology distribution in the early 1990s. Great direct sales produced some of the best and biggest technology companies of yesterday, including IBM, Oracle, CA Technologies, SAP, and HP. What’s so fascinating about this framework is that you need just one channel to reach the pragmatist customer, and in the last 10 years that channel has become the internet for many technology products. Moore even recognizes that direct sales had produced poor customer alignment: “First, wherever vendors have been able to achieve lock-in with customers through proprietary technology, there has been the temptation to exploit the relationship through unfairly expensive maintenance agreements [Oracle did this big time] topped by charging for some new releases as if they were new products. This was one of the main forces behind the open systems rebellion that undermined so many vendors’ account control—which, in turn, decreased predictability of revenues, putting the system further in jeopardy.” So what is the strategy used by the popular open-source, bottoms-up go-to-market motions at companies like Github, Hashicorp, Redis, Confluent, and others? It’s straightforward - the internet and simple APIs (normally on Github) provide the fastest channel to reach the developer end market while they are coding.
When you look at open-source scaling, it can take years and years to cross the chasm because most early open-source adopters are technology innovators; eventually, however, solutions permeate massive enterprises and make the jump. With these new internet-driven go-to-market motions, we’ve seen large companies grow primarily from inbound marketing tactics with less direct outbound sales. The companies named above, as well as Shopify, Twilio, Monday.com, and others, have done a great job growing to massive scale on the backs of their products (product-led growth) instead of a salesforce. What’s important to realize is that distribution is an abstract term, and no single motion or strategy is right for every company. The next distribution channel will surprise everyone!

Dig Deeper

  • How the sales team behind Monday is changing the way workplaces collaborate

  • An Overview of the Technology Adoption Lifecycle

  • A Brief History of the Cloud at NDC Conference

  • Frank Slootman (Snowflake) and Geoffrey Moore Discuss Disruptive Innovations and the Future of Tech

  • Growth, Sales, and a New Era of B2B by Martin Casado (GP at Andreessen Horowitz)

  • Strata 2014: Geoffrey Moore, "Crossing the Chasm: What's New, What's Not"

tags: Crossing the Chasm, Github, Hashicorp, Redis, Monday.com, Confluent, Open Source, Snowflake, Shopify, Twilio, Geoffrey Moore, Gartner, TensorFlow, Google, Clayton Christensen, Zoom, Nortel, Amdocs, OpenText, EMC, HP, CA, IBM, Oracle, SAP, Gemalto, DevOps
categories: Non-Fiction
 

April 2021 - Innovator's Solution by Clayton Christensen and Michael Raynor

This month we take another look at disruptive innovation in the companion piece to Clayton Christensen’s Innovator’s Dilemma, our July 2020 book. The book crystallizes the types of disruptive innovation and provides frameworks for how incumbents can introduce or combat these innovations. It was a pleasure to read and will serve as a great reference in the future.

Tech Themes

  1. Integration and Outsourcing. Today, technology companies rely on a variety of software tools and open source components to build their products. When you stitch all of these components together, you get the full product architecture. A great example is seen here with Gitlab, an SMB DevOps provider. They have Postgres for a relational database, Redis for caching, NGINX for request routing, Sentry for monitoring and error tracking and so on. Each of these subsystems interacts with each other to form the powerful Gitlab project. These interaction points are called interfaces. The key product development question for companies is: “Which things do I build internally and which do I outsource?” A simple answer offered by many MBA students is “Outsource everything that is not part of your core competence.” As Clayton Christensen points out, “The problem with core-competence/not-your-core-competence categorization is that what might seem to be a non-core activity today might become an absolutely critical competence to have mastered in a proprietary way in the future, and vice versa.” A great example that we’ve discussed before is IBM’s decision to go with Microsoft DOS for its Operating System and Intel for its Microprocessor. At the time, IBM thought it was making a strategic decision to outsource things that were not within its core competence but they inadvertently gave almost all of the industry profits from personal computing to Intel and Microsoft. Other competitors copied their modular approach and the whole industry slugged it out on price. The question of whether to outsource really depends on what might be important in the future. But that is difficult to predict, so the question of integration vs. outsourcing really comes down to the state of the product and market itself: is this product “not good enough” yet? If the answer is yes, then a proprietary, integrated architecture is likely needed just to make the actual product work for customers. 
Over time, as competitors enter the market and the fully integrated platform becomes more commoditized, the individual subsystems become increasingly important competitive drivers. So the decision to outsource or build internally must be made based on the state of the product and the market it’s attacking.

  2. Commoditization within Stacks. The above point leads to the counterintuitive question of how companies fall into the commoditization trap. It happens through overshooting, where companies create products that are too good (who thought that doing your job really well would cause customers to leave!). Christensen describes this through the lens of a salesperson: “‘Why can’t they see that our product is better than the competition? They’re treating it like a commodity!’ This is evidence of overshooting…there is a performance surplus. Customers are happy to accept improved products, but unwilling to pay a premium price to get them.” At this point, the things demanded by customers flip - they become willing to pay premium prices for innovations along a new trajectory of performance, most likely speed, convenience, or customization. “The pressure of competing along this new trajectory of improvement forces a gradual evolution in product architectures, away from the interdependent, proprietary architectures that had the advantage in the not-good-enough era toward modular designs in the era of performance surplus. In a modular world, you can prosper by outsourcing or by supplying just one element.” This process of integration, to modularization, and back is super fascinating. As an example of modularization, take the streaming company Confluent, makers of the open-source software project Apache Kafka. Confluent offers a real-time communications service that allows companies to stream data (as events) rather than batching large data transfers. Its product is often a subsystem underpinning real-time applications, like providing data to traders at Citigroup. Clearly, the basis of competition in trading has pivoted over the years as more and more banking companies offer the service.
Companies prioritize a new axis - speed - to differentiate among competing services, and when speed is the basis of competition, you use Confluent and Kafka to beat out the competition. Now fast forward five years and assume all banks use Kafka and Confluent for their traders: the modular subsystem is thus commoditized. What happens? I’d posit that the axis would shift again, maybe toward convenience or customization, where traders want specific info displayed on a mobile phone or tablet. The fundamental idea is that “Disruption and commoditization can be seen as two sides of the same coin. That’s because the process of commoditization initiates a reciprocal process of de-commoditization [somewhere else in the stack].”

  3. The Disruptive Becomes the Disrupted. Disruption is a relative term. As we’ve discussed previously, disruption is often mischaracterized as startups entering markets and challenging incumbents. Disruption is really a focused and contextual concept whereby products that are “not good enough” by market standards enter a market with a simpler, more convenient, or less expensive offering. These products and markets are often dismissed by incumbents, or even ceded by market leaders as those leaders move up-market to chase ever bigger customers. It’s fascinating to watch the disruptive become the disrupted. A great example is department stores: initially, Macy’s offered a massive selection that couldn’t be found in any single store, and customers loved it. Macy’s did this by turning inventory three times per year at 40% gross margins, for a 120% return on capital invested in inventory. In the 1960s, Walmart and Kmart attacked the full-service department stores by offering a similar selection at much cheaper prices. They did this by setting up a value system whereby they could make 23% gross margins but turn inventories 5 times per year, enabling them to earn the industry-golden ~120% return on capital invested in inventory. Full-service department stores decided not to compete against these lower-gross-margin products and shifted more floor space to beauty and cosmetics, which offered even higher gross margins (55%) than the 40% they were used to. This meant they could increase their return on capital invested in inventory and their profits while avoiding a competitive threat. The process continued, with discount stores eventually pushing Macy’s out of most categories until Macy’s had nowhere to go. All of a sudden, the initially disruptive department stores had become disrupted. We see this in technology markets as well. I’m not 100% sure this qualifies, but think about Salesforce and Oracle.
Marc Benioff spent a number of years at Oracle and left to start Salesforce, which pioneered selling subscription cloud software on a per-seat revenue model - a much cheaper option compared to traditional Oracle/Siebel CRM software. Salesforce was initially adopted by smaller customers that didn’t need the feature-rich platform offered by Oracle. Oracle dismissed Salesforce as competition even as Oracle CEO Larry Ellison seeded Salesforce and sat on Salesforce’s board. Today, Salesforce is a $200Bn company and briefly passed Oracle in market cap a few months ago. But now, Salesforce has raised its prices and mostly targets large enterprise buyers to hit its ambitious growth initiatives. Down-market competitors like Hubspot have come into the market with cheaper solutions and more fully integrated marketing tools to help smaller businesses that aren’t ready for a fully featured Salesforce platform. Disruption is always contextual, and it never stops.
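The department-store arithmetic above checks out directly: return on capital invested in inventory is roughly gross margin times annual inventory turns (a simplified GMROII-style calculation, using the figures from the text):

```python
# Simplified return on capital invested in inventory:
# gross margin earned per turn, times the number of turns per year.
def return_on_inventory(gross_margin: float, turns_per_year: float) -> float:
    return gross_margin * turns_per_year

print(f"{return_on_inventory(0.40, 3):.0%}")  # Macy's:  40% margin x 3 turns → 120%
print(f"{return_on_inventory(0.23, 5):.0%}")  # Walmart: 23% margin x 5 turns → 115%
```

Two very different operating models - high margin with slow turns versus thin margin with fast turns - arrive at nearly the same return on inventory capital, which is why the discounters could attack profitably at much lower prices.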

Business Themes

  1. Low-End-Market vs. New-Market Disruption. There are two established types of disruption: low-end-market (down-market) and new-market. Low-end-market disruption seeks to establish performance that is “not good enough” along traditional lines and targets overserved customers in the low end of the mainstream market. It typically utilizes a new operating or financial approach with structurally different margins than up-market competitors. Amazon.com is a quintessential low-end-market disruptor compared to traditional bookstores, offering prices so low they angered book publishers while giving customers the unmatched convenience of buying books online. In contrast, Robinhood is a great example of a new-market disruption. Traditional discount brokerages like Charles Schwab and Fidelity had been around for a while (themselves disruptors of full-service models like Morgan Stanley Wealth Management). But Robinhood targeted a group of people who weren’t consuming in the market, namely teens and millennials, and did it in an easy-to-use app with a much better user interface than Schwab’s or Fidelity’s. Robinhood also pioneered new pricing with zero-fee trading and made revenue via a new financial approach, payment for order flow (PFOF). Robinhood makes money from market makers - basically, large market-making firms, like Citadel, pay Robinhood to execute its customers’ orders, which also helps optimize customers’ buying and selling prices. When approaching big markets, it’s important to ask: Is this targeted at a non-consumer today, or am I competing at a structurally lower margin with a new financial model and a “not quite good enough” product? The answer determines whether you are providing a low-end-market disruption or a new-market disruption.

  2. Jobs To Be Done. The jobs to be done framework was one of the most important frameworks Clayton Christensen ever introduced. Marketers typically use advertising platforms like Facebook and Google to target specific demographics with their ads. These segments are narrowly defined: "Males over 55, living in New York City, with household income above $100,000." The issue with this categorization method is that while these attributes may be correlated with a product purchase, customers do not line up neatly with how marketers expect them to behave; demographics alone rarely predict the purchase. There may be a correlation, but simply targeting certain demographics does not yield a great result. Marketers need to understand why the customer is adopting the product. This is where the Jobs to Be Done framework comes in. As Christensen describes it, "Customers - people and companies - have 'jobs' that arise regularly and need to get done. When customers become aware of a job that they need to get done in their lives, they look around for a product or service that they can 'hire' to get the job done. Their thought processes originate with an awareness of needing to get something done, and then they set out to hire something or someone to do the job as effectively, conveniently, and inexpensively as possible." Christensen zeroes in on the contextual adoption of products; it is the circumstance, not the demographics, that matters most. Christensen describes ways to view competition and feature development through the Jobs to Be Done lens using Blackberry (later disrupted by the iPhone) as an example. While the immature smartphone market was seeing feature competition from Microsoft, Motorola, and Nokia, Blackberry and its parent company RIM came out with a simple-to-use device that allowed for short productivity bursts whenever time was available.
This meant they leaned into features that competed not with other smartphone providers (like better cellular reception), but rather things that enabled these easy "productive" sessions, like email, Wall Street Journal updates, and simple games. The Blackberry was later disrupted by the iPhone, which offered more interesting applications in an easier-to-use package. Interestingly, the first iPhone shipped without an app store (as a proprietary, interdependent product) and was viewed as not good enough for work purposes, allowing the Blackberry to co-exist. RIM's management even dismissed the iPhone as a competitor initially. It wasn't long until the iPhone caught up and eventually surpassed the Blackberry as the world's leading mobile phone.

  3. Brand Strategies. Companies may choose to address customers in a number of different circumstances and address a number of Jobs to Be Done. It's important that the company establish specific ways of communicating the circumstance to the customer. Branding is powerful, something that Warren Buffett, Terry Smith, and Clayton Christensen have all recognized as a durable growth driver. As Christensen puts it: "Brands are, at the beginning, hollow words into which marketers stuff meaning. If a brand's meaning is positioned on a job to be done, then when the job arises in a customer's life, he or she will remember the brand and hire the product. Customers pay significant premiums for brands that do a job well." So what can a large corporation do when faced with a disruptive challenger to its branding turf? It's simple: add a word to its leading brand, targeted at the circumstance in which a customer might find themselves. Think about Marriott, one of the leading hotel chains. It offers a number of hotel brands: Courtyard by Marriott for business travel, Residence Inn by Marriott for a home away from home, the Ritz-Carlton for high-end luxurious stays, and Marriott Vacation Club for resort destinations. Each brand is targeted at a different Job to Be Done, and customers intuitively understand what the brands stand for based on experience or advertising. A great technology example is Amazon Web Services (AWS), the cloud computing division of Amazon.com. Amazon pioneered the public cloud, and rather than launch under the Amazon.com brand, which might have confused its normal e-commerce customers, it created a completely new brand targeted at a different set of buyers and problems, one that maintained the quality and recognition Amazon had become known for. Another great retail example is the SNKRS app released by Nike.
Nike understands that some customers are sneakerheads who want to know the latest about every Nike shoe drop, so Nike created a distinct, branded app called SNKRS that delivers news and updates on the latest, trendiest sneakers. These buyers might not be interested in logging into the main Nike app and may grow frustrated sifting through all of the different types of apparel Nike offers just to find new shoes. The SNKRS app gives this set of consumers an easy way to find what they are looking for (convenience), which benefits Nike's core business. Branding is powerful, and understanding the Job to Be Done helps focus the right brand on the right job.

Dig Deeper

  • Clayton Christensen’s Overview on Disruptive Innovation

  • Jobs to Be Done: 4 Real-World Examples

  • A Peek Inside Marriott’s Marketing Strategy & Why It Works So Well

  • The Rise and Fall of Blackberry

  • Payment for Order Flow Overview

  • How Commoditization Happens

tags: Clayton Christensen, AWS, Nike, Amazon, Marriott, Warren Buffett, Terry Smith, Blackberry, RIM, Microsoft, Motorola, iPhone, Facebook, Google, Robinhood, Citadel, Schwab, Fidelity, Morgan Stanley, Oracle, Salesforce, Walmart, Macy's, Kmart, Confluent, Kafka, Citigroup, Intel, Gitlab, Redis
categories: Non-Fiction
 

March 2021 - Payments Systems in the U.S. by Carol Coye Benson, Scott Loftesness, and Russ Jones

This month we dive into the fintech space for the first time! The authors are partners at Glenbrook Partners, a well-known payments consulting firm. This classic book describes the history and current state of the many financial systems we use every day. While the book is a bit dated and reads like a textbook, it throws in some great real-world observations and provides a great foundation for any payments novice!

Tech Themes

  1. Mapping Open-Loop and Closed-Loop Networks. The major credit and debit card providers (Visa, Mastercard, American Express, China UnionPay, and Discover) all compete for the same spots in customer wallets but have unique and differing backgrounds and mechanics. The first credit card on the scene was the BankAmericard in the late 1950s. As it took off, Bank of America started licensing the technology all across the US and created National BankAmericard Inc. (NBI) to facilitate its card program. NBI merged with its international counterpart (IBANCO) to form Visa in the mid-1970s. Another group of California banks had created the Interbank Card Association (ICA) to compete with Visa, and in 1979 it renamed itself Mastercard. Both organizations remained owned by the banks until their IPOs in 2006 (Mastercard) and 2008 (Visa). Both of these companies are known as open-loop networks; that is, they work with any bank and rely on banks to sign up customers and merchants. As the book points out, "This structure allows the two end parties to transact with each other without having direct relationships with each other's banks." This convenient feature of open-loop payment systems means that they can scale incredibly quickly: any time a bank signs up a new customer or merchant, that party immediately has access to the network of all other banks on the Mastercard/Visa network. In contrast to open-loop systems, American Express and Discover operate largely closed-loop systems, where they enroll each merchant and customer individually. Because of this onerous task of finding and signing up every single consumer/merchant, Amex and Discover cannot scale to nearly the size of Visa/Mastercard. However, there is no bank intermediation, and the networks get total access to all transaction data, making them a go-to solution for things like loyalty programs, where a merchant may want to leverage data to target specific brand benefits at a customer.
Open-loop systems like Apple Pay (it's tied to your bank account) and closed-loop systems like the Starbucks app (funds are pre-loaded and can only be redeemed at Starbucks) can be found everywhere. Even Snowflake, the data warehouse provider and subject of last month's TBOTM, is a closed-loop payments network: customers buy Snowflake credits up front, which can only be redeemed for Snowflake compute services. In contrast, AWS and other clouds are beginning to offer more open-loop-style networks, where AWS credits can be redeemed against non-AWS software. Side note - these credit systems and odd pricing structures deliberately mislead customers and obfuscate actual costs, allowing the cloud companies to better control gross margins and revenue growth. It's fascinating to view the world through this open-loop/closed-loop dynamic.
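The scaling difference between the two network types can be sketched as a toy model. This is purely illustrative (the class names, bank names, and methods are all hypothetical, not any real network's API): in an open loop, enrolling one bank transitively connects all of its customers and merchants to every other member bank, while a closed loop must enroll every party directly.

```python
class OpenLoopNetwork:
    """Toy open-loop network: banks join, and their customers/merchants transact transitively."""
    def __init__(self):
        self.member_banks = set()

    def add_bank(self, bank):
        self.member_banks.add(bank)

    def can_transact(self, customer_bank, merchant_bank):
        # Any customer of a member bank can pay any merchant of a member bank.
        return customer_bank in self.member_banks and merchant_bank in self.member_banks


class ClosedLoopNetwork:
    """Toy closed-loop network: the network enrolls every customer and merchant itself."""
    def __init__(self):
        self.customers = set()
        self.merchants = set()

    def enroll_customer(self, customer):
        self.customers.add(customer)

    def enroll_merchant(self, merchant):
        self.merchants.add(merchant)

    def can_transact(self, customer, merchant):
        return customer in self.customers and merchant in self.merchants


open_net = OpenLoopNetwork()
open_net.add_bank("Bank A")
open_net.add_bank("Bank B")
# Signing up one bank instantly connects its customers to every other member bank.
print(open_net.can_transact("Bank A", "Bank B"))  # True

closed_net = ClosedLoopNetwork()
closed_net.enroll_customer("Alice")
closed_net.enroll_merchant("Corner Store")
# The closed loop only works for parties it has enrolled one by one.
print(closed_net.can_transact("Alice", "Corner Store"))  # True
```

The open loop scales per bank signed; the closed loop scales per individual enrolled, which is exactly why Amex and Discover grow more slowly but keep all the transaction data.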

  2. New Kids on the Block - What are Stripe, Adyen, and Marqeta? Stripe recently raised at a minuscule valuation of $95B, making it the highest-valued private startup (ever?!). Marqeta, its API/card-issuing counterpart, is prepping a 2021 IPO that may value it at $10B. Adyen, a Dutch public company, is worth close to $60B (Visa is worth $440B for comparison). Stripe and Marqeta are API-based payment service providers, which allow businesses to easily accept online payments and issue debit and credit cards for a variety of use cases. Adyen is a merchant account provider, which means it actually maintains the merchant account used to run a company's business - this often comes with enormous scale benefits and reduced costs, which is why large customers like Nike have opted for Adyen. The merchant account clearing process can take quite a while, which is why Stripe is focused on SMBs - a business can sign up as a Stripe customer and almost immediately begin accepting online payments. Stripe's and Marqeta's APIs allow a seamless integration into payment checkout flows. On top of this basic but now highly simplified use case, Stripe and Marqeta (and Adyen) allow companies to issue debit and credit cards for all sorts of use cases. This is creating an absolute BOOM in fintech, as companies try new and innovative ways of issuing credit/debit cards - such as expense management, banking-as-a-service, and buy-now-pay-later. Why is this such a big thing now, when Stripe, Adyen, and Marqeta were all created before 2011? In 2016, Visa launched its first developer APIs, which allowed companies like Stripe, Adyen, and Marqeta to become licensed Visa card issuers - now any merchant could issue its own branded Visa card. That is why Andreessen Horowitz's fintech partner Angela Strange proclaimed: "Every company will be a fintech company" (though this is also clearly some VC marketing)!
Mastercard followed suit in 2019, launching its open API called the Mastercard Innovation Engine. The big networks decided to support innovation - Visa is an investor in Stripe and Marqeta, AmEx is an investor in Stripe, and Mastercard is an investor in Marqeta. Surprisingly, no network providers are investors in Adyen. Fintech history shows that upstarts can outgrow the incumbents (Visa and Mastercard are bigger than the banks, with much better business models) - will the same happen here?

  3. Building a High Availability System. Do Mastercard and Visa have the highest availability needs of any system? Obviously, people are angry when Slack or Google Cloud goes down, but think about how many people are affected when Visa or Mastercard goes down. In 2018, a UK hardware failure prompted a five-hour outage at Visa: "Disgruntled customers at supermarkets, petrol stations and abroad vented their frustrations on social media when there was little information from the financial services firm. Bank transactions were also hit." High availability is a measure of system uptime: "Availability is often expressed as a percentage indicating how much uptime is expected from a particular system or component in a given period of time, where a value of 100% would indicate that the system never fails. For instance, a system that guarantees 99% of availability in a period of one year can have up to 3.65 days of downtime (1%)." According to Statista, Visa handles ~185B transactions per year (a cool 6,000 per second), while UnionPay comes in second with 131B and Mastercard third with 108B. For the twelve months ended June 30, 2020, Visa processed $8.7T in payments volume, which means the average transaction was ~$47. At 6,000 transactions per second, Visa loses $282,000 in payment volume every second it's down. Mastercard and Visa have historically been very cagey about disclosing data center operations (the only article I could find is from 2013), though they control their own operations much like other technology giants. "One of the keys to the [Visa] network's performance, Quinlan says, is capacity. And Visa has lots of it. Its two data centers--which are mirror images of each other and can operate interchangeably--are configured to process as many as 30,000 simultaneous transactions, or nearly three times as much as they've ever been asked to handle.
Inside the pods, 376 servers, 277 switches, 85 routers, and 42 firewalls--all connected by 3,000 miles of cable--hum around the clock, enabling transactions around the globe in near real-time and keeping Visa's business running.” The data infrastructure challenges that payments systems are subjected to are massive and yet they all seem to perform very well. I’d love to learn more about how they do it!
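The back-of-the-envelope numbers above are easy to verify. All inputs come straight from the figures quoted in the passage (~185B transactions and $8.7T of volume per year):

```python
# Sanity-checking the availability and throughput math above.
transactions_per_year = 185e9   # ~185B Visa transactions per year (Statista figure)
volume_per_year = 8.7e12        # $8.7T in payments volume

seconds_per_year = 365 * 24 * 3600
tps = transactions_per_year / seconds_per_year             # ~5,866/sec, i.e. roughly 6,000
avg_transaction = volume_per_year / transactions_per_year  # ~$47 per transaction
volume_lost_per_second = 6000 * avg_transaction            # ~$282,000 of volume per second of downtime

# A 99%-available system can be down 1% of the year:
downtime_days = 0.01 * 365                                 # 3.65 days

print(round(tps), round(avg_transaction), round(volume_lost_per_second))
```

Running the numbers confirms the article's figures: roughly 6,000 transactions per second, a ~$47 average ticket, and about $282K of payment volume at risk for every second of outage.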

Business Themes

  1. What is interchange and why does it exist? BigCommerce has a great simple definition of interchange: "Interchange fees are transaction fees that the merchant's bank account must pay whenever a customer uses a credit/debit card to make a purchase from their store. The fees are paid to the card-issuing bank to cover handling costs, fraud and bad debt costs and the risk involved in approving the payment." What is crazy about interchange is that it is not the banks but the networks (Mastercard, Visa, China UnionPay) that set interchange rates. On top of that, the networks set the rates but receive no revenue from interchange itself. As the book points out: "Since the card network's issuing customers are the recipients of interchange fees, the level of interchange that a network sets is an important element in the network's competitive position. A higher level of interchange on one network's card products naturally makes that network's card products more attractive to card issuers." The incentives here are wild: the card issuers (banks) want higher interchange because they receive it from the merchant's bank in a transaction, while the card networks want more card-issuing customers, and offering higher interchange rates better positions them in competitive battles. The merchant is left worse off by higher interchange rates, as the merchant's bank almost always passes the fee on to the merchant itself ($100 received via credit card turns out to be only $97 when it reaches the merchant's bank account because of fees). Visa and Mastercard have different interchange rates for every type of transaction and acceptance method - making it a complicated nightmare to actually understand their fees. The networks and their issuers may claim that increased interchange fees allow banks to invest more in fraud protection, risk management, and handling costs, but there is no way to verify this claim.
This has caused a crazy war between merchants, the card networks, and the card issuers.
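The $100-in, $97-out example above works out roughly like this. The individual rates below are made-up round numbers for illustration (actual Visa/Mastercard rates vary by card type and acceptance method); only the ~3% all-in total matches the example in the text:

```python
# Illustrative merchant settlement: sale price minus interchange, network, and acquirer fees.
# These specific rates are hypothetical, chosen so the all-in fee is ~3%.
sale = 100.00
interchange_fee = sale * 0.021   # paid to the card-issuing bank (the largest slice)
network_fee = sale * 0.0013      # assessment paid to the card network
acquirer_markup = sale * 0.0077  # kept by the merchant's bank / payment processor

merchant_receives = sale - interchange_fee - network_fee - acquirer_markup
print(round(merchant_receives, 2))  # ~$97 actually reaches the merchant's account
```

The split also shows why the incentives are misaligned: the issuer's slice (interchange) dwarfs the network's, yet the network sets the rate.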

  2. Why is Jamie Dimon so pissed about fintechs? In a recent interview, Jamie Dimon, CEO of JP Morgan Chase, called fintechs "examples of unfair competition." Dimon is angry about the famous (or infamous) Durbin Amendment, a last-minute addition to the landmark Dodd-Frank Wall Street Reform and Consumer Protection Act of 2010. The Durbin Amendment attempted to cap the interchange amount that banks could charge and to tier interchange rates based on the assets of the bank. In theory, capping the rates would mean merchants paid less in fees, and merchants would pass these lower fees on to consumers through lower prices, thus spurring demand. The tiering would mean banks with >$10B in assets would make less in interchange fees, leveling the playing field for smaller banks and credit unions. "The regulated [bank with >$10B in assets] debit fee is 0.05% + $0.21, while the unregulated is 1.60% + $0.05. Before the Durbin Amendment the fee was 1.190% + $0.10." While this did lower debit card interchange, a few unintended consequences resulted. 1. Regulators expected that banks would make substantially less revenue; however, they failed to recognize that banks might increase other fees to offset this lost revenue stream: "Banks have cut back on offering rewards for their debit cards. Banks have also started charging more for their checking accounts or they require a larger monthly balance." In addition, many smaller banks couldn't recoup the lost revenue, leading to many bankruptcies and consolidation. 2. Because a flat fee was introduced regardless of transaction size, smaller merchants were charged more in interchange than under the prior system (which was pro-rated based on dollar amount). "One problem with the Durbin Amendment is that it didn't take small transactions into account," said Ellen Cunningham, processing expert at CardFellow.com.
"On a small transaction, 22 cents is a bigger bite than on a larger transaction. Convenience stores, coffee shops and others with smaller sales benefited from the original system, with a lower per-transaction fee even if it came with a higher percentage." These small retailers ended up raising prices in some instances to combat these additional fees - causing the law to have the opposite of its intended effect of lowering costs to consumers. Dimon is angry that this law has allowed fintech companies to charge higher prices for debit card transactions. As shown above, smaller banks earn substantially more in interchange fees. These smaller banks are moving quickly to partner with fintechs, which now power hundreds of millions of dollars in account balances, and Dimon believes they are not paying enough attention to anti-money-laundering and fraud practices. In addition, fintechs are making money in suspect ways - Chime makes 21% of its revenue through high out-of-network ATM fees, and cash advance companies like Dave, Branch, and Earnin' are offering what amount to payday loans to customers.
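The small-transaction effect Cunningham describes falls straight out of the fee formulas quoted earlier (a fee is a percentage of the sale plus a fixed per-transaction component):

```python
def debit_fee(amount, pct, fixed):
    """Debit interchange fee: percentage of the sale plus a fixed per-transaction charge."""
    return amount * pct + fixed

# Rates quoted in the passage above.
pre_durbin = lambda amt: debit_fee(amt, 0.0119, 0.10)  # 1.190% + $0.10
regulated  = lambda amt: debit_fee(amt, 0.0005, 0.21)  # 0.05% + $0.21 (banks >$10B in assets)

# A $2 coffee: the flat 21 cents is a bigger bite than the old pro-rated fee.
print(round(pre_durbin(2.00), 3))   # ~$0.124
print(round(regulated(2.00), 3))    # ~$0.211

# A $100 sale: the regulated fee is far cheaper than before.
print(round(pre_durbin(100.00), 2))  # ~$1.29
print(round(regulated(100.00), 2))   # ~$0.26
```

The crossover is the whole story: the cap helped big-ticket merchants while raising effective costs for coffee shops and convenience stores, the opposite of the law's stated goal for small sellers.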

  3. Mastercard and Visa: A history of regulation. Visa and Mastercard have been the subject of many regulatory battles over the years. The US Justice Department announced in March that it would be investigating Visa over online debit-card practices. In 1996, Visa and Mastercard were sued by merchants and settled for $3B. In 1998, the Department of Justice won a case against Visa and Mastercard for not allowing issuing banks to work with other card networks like AmEx and Discover. In 2009, Mastercard and Visa were sued by the European Union and forced to reduce debit card swipe fees by 0.2%. In 2012, Mastercard and Visa were sued for price-fixing fees and were forced to pay $6.25B in a settlement. The networks have been sued by the US, Europe, Australia, New Zealand, ATM operators, Intuit, Starbucks, Amazon, Walmart, and many more. Each time they have been forced to modify fees and practices to ensure competition. However, this has also reinforced their dominance as the biggest payment networks, which is why no new competitors have been established since the creation of the networks in the 1970s. Also, leave it to the banks to establish a revenue source so good that it is almost entirely undefeatable by legislation. When, if ever, will Visa and Mastercard not be dominant payments companies?

Dig Deeper

  • American Banker: Big banks, Big Tech face-off over swipe fees

  • Stripe Sessions 2019 | The future of payments

  • China's growth cements UnionPay as world's largest card scheme

  • THE DAY THE CREDIT CARD WAS BORN by Joe Nocera (Washington Post)

  • Mine Safety Disclosure’s 2019 Visa Investment Case

  • FineMeValue’s Payments Overview

tags: Visa, Mastercard, American Express, Discover, Bank of America, Stripe, Marqeta, Adyen, Apple, Open-loop, Closed-loop, Snowflake, AWS, Nike, BNPL, Andreessen Horowitz, Angela Strange, Slack, Google Cloud, UnionPay, BigCommerce, Jamie Dimon, Dodd-Frank, Durbin Amendment, JP Morgan Chase, Debit Cards, Credit Cards, Chime, Branch, Earnin', US Department of Justice, Intuit, Starbucks, Amazon, Walmart
categories: Non-Fiction
 

February 2021 - Rise of the Data Cloud by Frank Slootman and Steve Hamm

This month we read a new book by the CEO of Snowflake and author of our November 2020 book, Tape Sucks. The book covers Snowflake's founding, products, strategy, industry-specific solutions, and partnerships. Although the content is somewhat interesting, it reads more like a marketing book than an actually useful guide to cloud data warehousing. Nonetheless, it's a solid quick read on the state of the data infrastructure ecosystem.

Tech Themes

  1. The Data Warehouse. A data warehouse is a type of database that is optimized for analytics. These optimizations mainly revolve around complex query performance, handling multiple data types, integrating data from different applications, and running fast queries across large data sets. In contrast to a normal database (like Postgres), a data warehouse is purpose-built for efficient retrieval of large data sets, not the high-performance read/write transactions of a typical relational database. The industry began in the late 1970s and early 1980s, driven by work done by the "Father of Data Warehousing," Bill Inmon, and early competitor Ralph Kimball, a former Xerox PARC designer. Kimball launched Red Brick Systems in 1986, and Inmon launched Prism Solutions in 1991, with its leading product the Prism Warehouse Manager. Prism went public in 1995 and was acquired by Ardent Software in 1998 for $42M, while Red Brick was acquired by Informix for ~$35M in 1998. In the background, a company called Teradata, formed in the late 1970s by researchers at Cal and employees from Citibank, was going through its own journey to the data warehouse. Teradata would IPO in 1987 and get acquired by NCR in 1991; NCR itself would get acquired by AT&T in 1991; NCR would then spin out of AT&T in 1997, and Teradata would spin out of NCR through an IPO in 2007. What a whirlwind of corporate acquisitions! Around that time, other new data warehouses were popping up on the scene, including Netezza (launched in 1999) and Vertica (2005). Netezza, Vertica, and Teradata were great solutions, but they ran highly efficient data warehouses on physical, on-premise hardware. The issue was that as data grew, it became really difficult to add more hardware boxes and to know how to manage queries optimally across the disparate hardware.
Snowflake wanted to leverage the unlimited storage and computing power of the cloud to allow for infinitely scalable data warehouses. This was an absolute game-changer as early customer Accordant Media described, “In the first five minutes, I was sold. Cloud-based. Storage separate from compute. Virtual warehouses that can go up and down. I said, ‘That’s what we want!’”
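One standard technique behind the "optimized for analytics" claim above is columnar storage; that's my illustration rather than something the book spells out, but it shows why warehouses and transactional databases are built differently. A toy sketch of the same table in both layouts:

```python
# Toy illustration: the same table stored row-wise (OLTP-style) vs. column-wise (warehouse-style).
rows = [
    {"order_id": 1, "region": "EU", "amount": 120.0},
    {"order_id": 2, "region": "US", "amount": 80.0},
    {"order_id": 3, "region": "EU", "amount": 45.0},
]

# Column-oriented layout: each column's values are stored contiguously.
columns = {
    "order_id": [1, 2, 3],
    "region":   ["EU", "US", "EU"],
    "amount":   [120.0, 80.0, 45.0],
}

# A transactional point lookup favors the row layout: fetch one record, all fields together.
order_2 = next(r for r in rows if r["order_id"] == 2)

# An analytic aggregate favors the column layout: scan only the one column you need,
# instead of reading every field of every row.
total = sum(columns["amount"])

print(order_2["region"], total)  # US 245.0
```

On three rows the difference is invisible; on billions of rows, scanning one contiguous column instead of whole records is what makes warehouse queries fast.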

  2. Storage + Compute. Snowflake was launched in 2012 by Benoit Dageville (Oracle), Thierry Cruanes (Oracle), and Marcin Żukowski (Vectorwise). Mike Speiser and Sutter Hill Ventures provided the initial capital to fund the formation of the company. After numerous whiteboarding sessions, the technical founders decided to try something crazy: separating data storage from compute (processing power). This allowed Snowflake's product to scale storage independently (i.e., add more boxes) and put tons of computing power behind very complex queries. What may have been limited by Vertica hardware was now possible with Snowflake. At this point, the cloud had only been around for about five years, and unlike today, there were only a few services offered by the main providers. The team took a huge risk to 1) bet on the long-term success of the public cloud providers and 2) try something that had never been successfully accomplished before. When they got it to work, it felt like magic. "One of the early customers was using a $20 million system to do behavioral analysis of online advertising results. Typically, one big analytics job would take about thirty days to complete. When they tried the same job on an early version of Snowflake's data warehouse, it took just six minutes. After Mike learned about this, he said to himself: 'Holy shit, we need to hire a lot of sales people. This product will sell itself.'" This idea was so crazy that not even Amazon (where Snowflake runs) thought of unbundling storage and compute when it built its cloud-native data warehouse, Redshift, in 2013. Funny enough, Amazon also sought to attract people away from Oracle, hence the name Red-Shift. It would take Amazon almost seven years to re-design its data warehouse to separate storage and compute in Redshift RA3, which launched in 2019.
On top of these functional benefits, there is a massive gap in the cost of storage and the cost of compute and separating the two made Snowflake a significantly more cost-competitive solution than traditional hardware systems.
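The separation the founders bet on can be modeled as two independent resource pools. The classes below are hypothetical toy stand-ins, not Snowflake's actual API: a shared storage layer that any number of differently sized compute clusters can read, with neither side constraining the other.

```python
class ObjectStore:
    """Shared, effectively unlimited storage layer (conceptually like S3 underneath Snowflake)."""
    def __init__(self):
        self.tables = {}

    def put(self, name, data):
        self.tables[name] = data


class VirtualWarehouse:
    """A compute cluster that can be resized or spun up/down without touching the data."""
    def __init__(self, store, size):
        self.store = store  # every warehouse reads the same shared storage
        self.size = size    # compute scales independently of how much data exists

    def query_sum(self, table):
        return sum(self.store.tables[table])


store = ObjectStore()
store.put("sales", [120.0, 80.0, 45.0])

# Two differently sized warehouses query the same data concurrently;
# storage can grow and compute can scale with no coupling between the two.
small = VirtualWarehouse(store, size="XS")
large = VirtualWarehouse(store, size="3XL")
print(small.query_sum("sales"), large.query_sum("sales"))  # 245.0 245.0
```

In the coupled hardware model (Vertica, Netezza, Teradata), adding data meant adding boxes that carried both storage and compute together; decoupling them is what let Snowflake throw arbitrary compute at a query, and it is also why the cost structures diverge so sharply.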

  3. The Battle for Data Pipelines. A typical data pipeline (shown below) consists of pulling data from many sources, performing ETL/ELT (extract, transform, load - or extract, load, then transform), centralizing the data in a data warehouse or data lake, and connecting that data to visualization tools like Tableau or Looker. All parts of this data stack face intense competition. On the ETL/ELT side, you have companies like Fivetran and Matillion, and on the data warehouse/data lake side you have Snowflake and Databricks. Fivetran focuses on the extract and load portions: a data integration tool that connects to all of your operational systems (Salesforce, Zendesk, Workday, etc.) and pulls them together in Snowflake for comprehensive analysis, leveraging dbt (data build tool) for transformations. Matillion is similar, except it connects to your systems, imports raw data into Snowflake, and then transforms it there (checking for NULLs, ensuring matching records, removing blanks) - so Matillion focuses on the load and transform steps. The data warehouse vs. data lake debate is a complex and highly technical discussion, but it mainly comes down to Databricks vs. Snowflake. Databricks is primarily a machine learning platform that allows you to run Apache Spark (an open-source ML framework) at scale. Databricks's main product, Delta Lake, allows you to store all data types - structured and unstructured - for real-time and complex analytical processes. As Datagrom points out here, the platforms come down to three differences: data structure, data ownership, and use case versatility. Snowflake requires structured or semi-structured data prior to running a query, while Databricks does not.
Similarly, while Snowflake decouples data storage from compute, it does not decouple data ownership: Snowflake maintains all of your data, whereas you can run Databricks on top of any data source you have, whether on-premise or in the cloud. Lastly, Databricks acts more as a processing layer (able to function in code like Python as well as SQL), while Snowflake acts as a query and storage layer (mainly driven by SQL). Snowflake performs best with business intelligence querying, while Databricks performs best with data science and machine learning. Both platforms can be used by the same organizations, and I expect both to be massive companies (Databricks recently raised at a $28B valuation!). All of these tools are blending together and competing against each other - Databricks just launched a new Lakehouse (data lake + data warehouse - I know the name is hilarious), and Snowflake is leaning heavily into its data lake. We will see who wins!

An interesting data platform battle is brewing that will play out over the next 5-10 years: The Data Warehouse vs the Data Lakehouse, and the race to create the data cloud

Who's the biggest threat to @snowflake? I think it's @databricks, not AWS Redshift https://t.co/R2b77XPXB7

— Jamin Ball (@jaminball) January 26, 2021

Business Themes

  1. Marketing Customers. This book, at its core, is a marketing document. Sure, it gives a nice story of how the company was built, the insights of its founding team, and some obstacles they overcame. But the majority of the book is just an "imagine what you could do with data" exploration across a variety of industries and use cases. That's not good or bad, but it's an interesting way of marketing - that's for sure. It's annoying that they spent so little time on the technology and actual company building. Our May 2019 book, The Everything Store, about Jeff Bezos and Amazon, was perfect because it covered all of the decision-making and challenging moments required to build a long-term company. This book just talks about customer and partner use cases over and over. Slootman's section is only about 20 pages, and five of them cover case studies from Square, Walmart, Capital One, Fair, and Blackboard. I suspect this may be due to the controversial ousting of long-time CEO Bob Muglia in favor of Frank Slootman, co-author of this book. As this Forbes article noted: "Just one problem: No one told Muglia until the day the company announced the coup. Speaking publicly about his departure for the first time, Muglia tells Forbes that it took him months to get over the shock." One day we will hear the actual unfiltered story of Snowflake, and it will make for an interesting comparison to this book.

  2. Timing & Building. We often forget how important timing is in startups. Being the right investor or company at the right time can do a lot to drive unbelievable returns. Consider Don Valentine at Sequoia in the early 1970s. We know that venture capital fund performance persists, in part due to incredible branding at firms like Sequoia that has built up over years and years (obviously reinforced by top-notch talents like Mike Moritz and Doug Leone). Don was a great investor and took significant risks on unproven individuals like Steve Jobs (Apple), Nolan Bushnell (Atari), and Trip Hawkins (EA). But he also had unfettered access to the birth of an entirely new ecosystem and knowledge of how that ecosystem would change business, built up from his years at Fairchild Semiconductor. Don was a unique person who capitalized on that incredible knowledge base, veritably creating the VC industry. Sequoia is a top firm because he was in the right place at the right time with the right knowledge. Now let's cover some companies that weren't: Cloudera, Hortonworks, and MapR. In 2005, Yahoo engineers Doug Cutting and Mike Cafarella, inspired by the Google File System paper, created Hadoop, a distributed file system for storing and accessing data like never before. Hadoop spawned companies like Cloudera, Hortonworks, and MapR that were built to commercialize the open-source project. All of them came out of the gate fast with big funding - Cloudera raised $1B at a $4B valuation prior to its 2017 IPO, Hortonworks raised $260M at a $1B valuation prior to its 2014 IPO, and MapR raised $300M before it was acquired by HPE in 2019. The companies all had one problem in common, however: they were on-premise and built before the cloud gained traction. That meant it took significant internal expertise and resources to run Cloudera, Hortonworks, and MapR software.
In 2018, Cloudera and Hortonworks merged (at a combined $5B valuation) because competitive pressure from the cloud was eroding both of their businesses. MapR was quietly acquired for less than it raised. Today Cloudera trades at a $5B valuation, meaning no shareholder return since the merger, and the business has only recently become slightly profitable at its current low growth rate. This cautionary case study shows how important timing is and how difficult it is to build a lasting company in the data infrastructure world. As the new analytics stack is built with Fivetran, Matillion, dbt, Snowflake, and Databricks, it will be interesting to see which companies exist 10 years from now. It’s probable that some new technology will come along and hurt every company in the stack, but for now the coast is clear - which is precisely the scariest time for any of these companies.

  3. Burn Baby Burn. Snowflake burns A LOT of money. In the nine months ended October 31, 2020, Snowflake burned $343M, including $169M in its third quarter alone. Why would Snowflake burn so much money? Because it is growing efficiently! What does efficient growth mean? As we discussed with the last Frank Slootman book, sales and marketing efficiency is a key hallmark for understanding the quality of growth a company is experiencing. According to its filings, Snowflake added ~$230M of revenue while spending $325M on sales and marketing. This is actually not terribly efficient - it implies a dollar invested in sales and marketing yielded $0.70 of incremental revenue. While you would like this number to be closer to 1x (i.e., $1 in S&M yields $1 in revenue - hence a repeatable go-to-market motion), it is not terrible. ServiceNow (Slootman’s old company) actually operates less efficiently - for every dollar it invests in sales and marketing, it generates only $0.55 of subscription revenue. Crowdstrike, on the other hand, operates a partner-driven go-to-market, which enables it to generate more while spending less - it created $0.90 for every dollar invested in sales and marketing over the last nine months. However, there is a key thing that distinguishes the data warehouse from these other products, and Ben Thompson at Stratechery nails it here: “Think about this in the context of Snowflake’s business: the entire concept of a data warehouse is that it contains nearly all of a company’s data, which (1) it has to be sold to the highest levels of the company, because you will only get the full benefit if everyone in the company is contributing their data and (2) once the data is in the data warehouse it will be exceptionally difficult and expensive to move it somewhere else. Both of these suggest that Snowflake should spend more on sales and marketing, not less. Selling to the executive suite is inherently more expensive than a bottoms-up approach.
Data warehouses have inherently large lifetime values given the fact that the data, once imported, isn’t going anywhere.” I hope Snowflake burns more money in the future and builds a sustainable long-term business.
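The efficiency arithmetic above is simple enough to sketch. A minimal calculation using the approximate nine-month figures cited in this section (the ServiceNow and Crowdstrike per-dollar figures are quoted from the text rather than derived from filings):

```python
def sm_efficiency(incremental_revenue, sm_spend):
    """Dollars of incremental revenue generated per $1 of sales & marketing spend."""
    return incremental_revenue / sm_spend

# Snowflake: ~$230M of new revenue on $325M of S&M spend over nine months
snowflake = sm_efficiency(230, 325)
print(f"Snowflake: ${snowflake:.2f} of revenue per $1 of S&M")  # ~$0.71

# A repeatable go-to-market motion would push this ratio toward 1.0x;
# ServiceNow (~$0.55) sits below Snowflake, Crowdstrike (~$0.90) above.
```

The same ratio applied to Data Domain's 2008 numbers ($150M of new revenue on $115M of S&M) yields the 1.3x figure discussed in the Tape Sucks review below.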

Dig Deeper

  • Early Youtube Videos Describing Snowflake’s Architecture and Re-inventing the Data Warehouse

  • NCR’s spinoff of Teradata in 2007

  • Fraser Harris of Fivetran and Tristan Handy of dbt speak at the Modern Data Stack Conference

  • Don Valentine, Sequoia Capital: "Target Big Markets" - A discussion at Stanford

  • The Mike Speiser Incubation Playbook (an essay by Kevin Kwok)

tags: Snowflake, Data Warehouse, Oracle, Vertica, Netezza, IBM, Databricks, Apache Spark, Open Source, Fivetran, Matillion, dbt, Data Lake, Sequoia, ServiceNow, Crowdstrike, Cloudera, Hortonworks, MapR, BigQuery, Frank Slootman, Teradata, Xerox, Informix, NCR, AT&T, Benoit Dageville, Mike Speiser, Sutter Hill Ventures, Redshift, Amazon, ETL, Hadoop, SQL
categories: Non-Fiction
 

January 2021 - Technological Revolutions and Financial Capital: The Dynamics of Bubbles and Golden Ages by Carlota Perez

This month we read Carlota Perez’s understudied book covering the history of technological breakthroughs and revolutions. The book marries the roles of finance and technological breakthrough seamlessly in an easy-to-digest narrative style.

Tech Themes

  1. The 5 Technology Revolutions. Perez identifies five major technological revolutions: The Industrial Revolution (1771-1829), The Age of Steam and Railways (1829-1873), The Age of Steel, Electricity and Heavy Engineering (1875-1918), The Age of Oil, the Automobile and Mass Production (1908-1974), and The Age of Information and Telecommunications (1971-Today). Looking back at these individual revolutions, one can recognize how powerful it is to view the world and technology in these incredibly long waves. Many of these periods lasted over fifty years while their geographic dispersion and economic effects fully came to fruition. These new technologies fundamentally alter society - when it becomes clear that a revolution is happening, many people jump on the bandwagon. As Perez puts it, “The great clusters of talent come forth after the revolution is visible and because it is visible.” Each revolution produces a myriad of changes in society: the industrial revolution popularized factory production, railways created national markets, steel and electricity made modern buildings possible, oil and cars created mass markets and assembly lines, and the microprocessor and the internet created amazing companies like Amazon and Airbnb.

  2. The Phases of Technology Revolution. After a decently long gestation period, during which the old revolution has permeated across the world, the new revolution normally starts with a big bang - some discovery or breakthrough (like the transistor or the steam engine) that fundamentally pushes society into a new wave of innovation. Coupled with these big bangs is repurposed infrastructure from the prior eras - for example, telegraph and telephone wires were strung along the initial railways, which offered long stretches of uninterrupted space to build on. Another example is electricity: initially, homes were wired to serve lightbulbs, and only many years later did great home appliances come into use. This initial period of application discovery is called the Irruption phase. Increasing interest in forming businesses then causes a Frenzy period like the Railway Mania or the Dot-com Boom, where everyone thinks they can get rich quick by starting a business around the new revolution. As the first 20-30 years of a revolution play out, a strong divide grows between those who were part of the revolution and those who were not; there is an economic, social, and regulatory mismatch between the old guard and the new revolution. After an uprising (like the populism we have seen recently) and a bubble collapse (check your crystal ball), regulatory changes typically foster a harmonious future for the technology. Following these changes, we enter the Synergy phase, where technology can fully flourish under accommodating and clear regulation. This Synergy phase propagates outward across all countries until even the lagging adopters have started the adoption process. At this point the cycle enters Maturity, waiting for the next big advance to start the whole process over again.

  3. Where are we in the cycle today? We tweeted at Carlota Perez to answer this question AND SHE RESPONDED! My question to Perez was: with recent waves of massive, transformational innovation like the public cloud providers and the iPhone, are we still in the Age of Information? These technological waves often last 50-60 years, and yet we’ve arguably been in the same age for quite a while. This wave started in 1971, exactly 50 years ago, with Intel and the creation of the microprocessor. Are we in the Frenzy phase, with record amounts of investment capital, enormous demand for early-stage companies, and new financial innovations like Affirm’s debt securitizations? Or have we not gotten to the Frenzy phase yet? Are the public cloud and the iPhone the start of a new big bang, giving us overlapping revolutions for the first time ever? Obviously, identifying the truly breakthrough moments in technology history is far easier after the fact, so maybe we are too close to know what really is a seminal moment. Perez’s answer, though only a few words, fully addresses the question. Perez suggests we are still in the installation phase (Irruption and Frenzy) of the new technology, and that makes a lot of sense. Sure, internet usage is incredibly high in the US (96%), but not in other large countries. China (the world’s largest country by population) has only 63% of its people using the internet, and India (the world’s second-largest country) has only 55%. Ethiopia, with a population of over 100M people, has only 18%. There is still a lot of runway left for the internet to bloom! In addition, only recently have people been equipped with a powerful computing device that fits in their pocket - and low-priced phones are now making their way to all parts of the world, led by firms like Chinese giant Transsion.
On top of the fact that this revolution is not fully installed, there is the rise of populism, a political movement that seeks to mobilize ordinary people who feel disregarded by the elite. Populism has reared its ugly head in many nations, including the US (Donald Trump), the UK (Brexit), and Brazil (Bolsonaro). The rise of populism is fueled by the growing dichotomy between the elites who have benefitted socially and monetarily from the revolution and those who have not. In the 1890s, anti-railroad sentiment drove the creation of the Populist Party. More recently, people have become angry at the tech giants (Facebook, Google, Amazon, Apple, Twitter) over unfair labor practices, psychological manipulation, and monopolistic tendencies. The recent movie The Social Dilemma, which calls for a more humane and regulation-focused approach to social media, speaks to the need to regulate these massive companies. It is also incredibly ironic to watch a movie about how social media manipulates its users via a recommendation on Netflix, a company that has popularized incessant binge-watching through UX manipulation not dissimilar to Facebook’s and Google’s tactics. I expect these companies to get regulated soon - and I hope that once that happens, we enter the Synergy phase, with growth and value accruing to all people.

Yes, I do. I will find the time to reply to you properly. But just quickly, I think installation was prolonged by QE &casino finance; we are at the turning point (the successful rise of populism is a sign) and maybe post-Covid we'll go into synergy.

— Carlota Perez (@CarlotaPrzPerez) January 17, 2021

Business Themes

  1. The role of Financial Capital in Revolutions. As new technology revolutions play out, financial capital appears right alongside technology developments, ready to mold the revolution into the phases Perez describes. In the Irruption phase, as new technology is taking hold, financial capital that had been sitting on the sidelines through the Maturity phase of the previous revolution plows into new company formation and ideas. The financial sector tries to adopt the new technology as soon as possible (we are already seeing this with quantum computing), so it can then espouse the benefits to everyone it talks to, setting the stage for increasing financing opportunities. Eventually, demand for financing company creation goes crazy, and you enter a Frenzy phase. During this phase, there is a discrepancy between the value of financial capital and production capital, the money used by companies to create actual products and services. Financial capital believes in unrealistic returns on investment, funding projects that don’t make any sense. Perez notes: “In relation to the canal Mania of the 1790s, disorder and lack of coordination prevailed in investment decisions. Canals were built ‘with different widths and depths and much inefficient routing.’ According to Dan Roberts at the Financial Times, in 2001 it was estimated that only 1 to 2 percent of the fiber optic cable buried under Europe and the United States had so far been turned on.” These Frenzy phases create bubbles and further ingrain regulatory mismatch and political divide. Could we be in one now, with deals getting priced at 125x revenue for tiny companies? After the institutional reckoning, the technology revolution enters the Synergy phase, where production capital earns really strong returns on investment - the path of the technology is somewhat known, and real gains are to be made by continuing investment (especially at more reasonable asset prices). Production capital continues to go to good use until the technology revolution fully plays itself out, entering the Maturity phase.

  2. Casino Finance and Prolonging Bubbles. One point Perez makes in her tweet is that the current bubble has been prolonged by QE and casino finance. Quantitative easing is a monetary policy in which the Federal Reserve (the US central bank) buys government bonds and other assets to inject money into the financial system, giving it more liquidity. This process is used to create low interest rates, which push individuals and corporations to invest their money because the rate of interest on savings accounts becomes very low. Following the financial crisis, and more recently COVID-19, the Federal Reserve lowered interest rates and restarted quantitative easing to help the hurting economy. In Perez’s view, these actions have prolonged the Irruption and Frenzy phases because they force more money into investment opportunities. On top of quantitative easing, governments have allowed so-called casino capitalism - letting free-market ideals shape government policy (like Reagan’s economic plan). Uninterrupted free markets are in theory economically efficient but can give rise to bad actors - like Enron’s manipulation of California’s energy markets after deregulation. With continual quantitative easing and deregulation, speculative markets - like the market for collateralized debt obligations before the financial crisis - are allowed to grow. This creates a risk-taking environment that can only end in a frenzy and a bubble.

  3. Synergy Phase and Productive Capital Allocation. Capital allocation has been called the most important part of being a great investor and business leader. Think about being the CEO of Coca-Cola for a second - you have thousands of competing projects vying for budget - how do you determine which ones get the most money? In the investing world, capital allocation is measured by conviction. As George Soros’s famous quote goes: “It's not whether you're right or wrong, but how much money you make when you're right and how much you lose when you're wrong.” Clayton Christensen applied the ideas of capital allocation to life investments, concluding: “Investments in relationships with friends and family need to be made long, long before you’ll see any sign that they are paying off. If you defer investing your time and energy until you see that you need to, chances are it will already be too late.” Capital and time allocation are underappreciated concepts because they often seem abstract amid the everyday humdrum of life. It is interesting to think about capital allocation within Perez’s long-term framework. The obvious approach would be to identify the stage (Irruption, Frenzy, Synergy, Maturity) and make the appropriate time/money decisions - deploy capital in the Irruption phase, pull money out at the height of the Frenzy, buy as many companies as possible at the crash/turning point, hold through most of the Synergy, and sell at Maturity to identify the next Irruption phase. Although that would be fruitful, identifying market bottoms and tops is a fool’s errand. However, according to Perez, the best returns on capital investment typically happen during the Synergy phase, when production capital (money employed by firms through investment in R&D) reigns supreme. During this time, the revolutionary applications of the recently frenzied technology finally start to bear fruit.
They are typically poised to succeed by an accommodating regulatory and social environment. Unsurprisingly, after the diabolical grifting financiers of the Frenzy phase are exposed (see Worldcom, the Great Financial Crisis, and Theranos), social pressure on regulators typically forces an agreement to fix the loopholes that allowed these manipulators to take advantage of the system. After Enron, the Sarbanes-Oxley Act increased disclosure requirements and oversight of auditors. After the GFC, the Dodd-Frank Act mandated bank stress tests and introduced financial stability oversight. With the problems of the Frenzy phase “fixed” for the time being, the social attitude toward innovation turns positive once again, and the returns to production capital start to outweigh those of financial capital, which is now reined in under the new rules. Suffice it to say, we are probably in the Frenzy phase in the technology world, with a flood of capital chasing a limited set of venture opportunities and driving massive valuation increases for early-stage companies. This will change eventually, and as Warren Buffett says: “It’s only when the tide goes out that you learn who’s been swimming naked.” When the bubble does burst, regulation of big technology companies will usher in the best returns period for investors and companies alike.

Dig Deeper

  • The Financial Instability Hypothesis: Capitalist Processes and the Behavior of the Economy

  • Bubbles, Golden Ages, and Tech Revolutions - a Podcast with Carlota Perez

  • Jeff Bezos: The electricity metaphor (2007)

  • Where Does Growth Come From? Clayton Christensen | Talks at Google

  • A Spectral Analysis of World GDP Dynamics: Kondratieff Waves, Kuznets Swings, Juglar and Kitchin Cycles in Global Economic Development, and the 2008–2009 Economic Crisis

tags: Telegraph, Steam Engine, Steel, Transistor, Intel, Railway Mania, Dot-com Boom, Carlota Perez, Affirm, Irruption, Frenzy, Synergy, Maturity, iPhone, Apple, China, Ethiopia, Theranos, Populism, Twitter, Netflix, Warren Buffett, George Soros, Quantum Computing, QE, Reagan, Enron, Clayton Christensen, Worldcom
categories: Non-Fiction
 

December 2020 - Do Androids Dream of Electric Sheep? (Blade Runner) by Philip K. Dick

This month we read the classic sci-fi novel, Do Androids Dream of Electric Sheep? The book follows Rick Deckard, a bounty hunter seeking out androids who are passing as human beings. Along the journey, the reader is asked to consider: what does it mean to be alive? Philip K. Dick was a wildly imaginative sci-fi writer, producing many books and stories that became famous, like The Man in the High Castle, Minority Report, and Total Recall. Although his writing career was prolific, Dick was a troubled individual: he was a heavy drug user, he married five times, he experienced drug-induced “paranormal activities,” and he was physically abusive to at least two of his wives.

Tech Themes

The common, modern depiction of a Turing Test

  1. Are you an android? In 1950, British computer scientist Alan Turing conceived of the Turing Test, a hypothetical test to determine whether a machine can display intelligent behavior. Turing asked the question, “Can machines think?” and attempted to define a test whereby a human might be tricked into believing a machine was human. The test design is fairly complex but involves a human asking written questions of a machine in another room. If the machine can convince the interrogator that it’s human, then machines can “think.” This Turing Test is mirrored in the Voigt-Kampff test used throughout the book. It’s unclear whether the test works, and Rick Deckard nearly misdiagnoses Rachael in the book's early parts. At the end of the book, the test is turned on its head, with Rick impersonating John Isidore (another human), trying to convince machines (in another room) to let him in. This role reversal and the questioning of who is an android recur throughout the novel - at times, Rick, Phil Resch, and Harry Bryant might all be androids. These questions are the centerpiece of sci-fi lore. They are also explored in a similar style in the famous movie Ghost in the Shell, where people now have some organs and limbs replaced by electric parts. When a cyber-attacker named the Puppet Master takes over the network of technological parts, it’s unclear who is human, who is an android, and who is possessed by the Puppet Master. In the video game world, this idea has recently been explored in Detroit: Become Human. In the game, which is set up in a choose-your-own-adventure style, players can play as humans or androids and choose whether they stay in character or break out of their controlled android state. The idea of an interrogator or bounty hunter sniffing out rogue machines has been explored across books, film, and video games.
As technology has become more prevalent in our lives, the cultural mediums may have changed, but the classic philosophical question - what does it mean to be alive? - remains.

  2. Predicting the future. The Blade Runner movie is famously set in Los Angeles in 2019, while the book is set in San Francisco in 1992. The book was written in 1968, and the movie Blade Runner debuted 14 years later, in 1982. In 2019, Blade Runner experienced an ironic resurgence, as its dark, bleak futuristic society of flying cars, fully intelligent artificial beings, and international space travel never happened. Today, predictions of computing and artificial intelligence abound. In his original imitation game paper, Alan Turing made one of the most famous AI predictions: “I believe that in about fifty years’ time it will be possible to programme computers, with a storage capacity of about 10^9, to make them play the imitation game so well that an average interrogator will not have more than 70 per cent chance of making the right identification after five minutes of questioning.” It’s tough to know if this prediction came true (other than the 10^9 part, since that is only about 1 GB), with some teams claiming to have built chatbots that beat the Turing Test. Interestingly, one common theme emerges in these computing predictions - both experts and non-experts typically predict about 15-25 years out. In The Innovators, Walter Isaacson posited that this was enough time to allow people to engage in imaginative thinking. Roy Amara, co-founder of the Institute for the Future, probably put it best: “We tend to overestimate the effect of a technology in the short run and underestimate the effect in the long run.” How long is the long run, though? As John Maynard Keynes proclaimed: “In the long run we are all dead. Economists set themselves too easy, too useless a task if, in tempestuous seasons, they can only tell us that when the storm is long past the ocean is flat again.” It is seriously hard to estimate the combination of changing technologies and infrastructures that unlock completely new and cost-effective ways of building things.
Will we have self-driving cars in 20 years? Will we have Artificial General Intelligence? Will we have quantum computing? I have no idea.

  3. Technology and nature. One theme repeatedly explored throughout the novel is the balance, or tension, between technology and nature. World War Terminus has caused a layer of radioactive dust to fall over the world, killing animal life and changing the environment. Mechanical animals are the norm, and Rick dreams of procuring a real horse, ostrich, or goat one day. He regularly checks his Sidney’s Animal & Fowl Catalogue like a stockbroker checking the latest price change. A real animal is significantly more expensive than a mechanical version, despite it being nearly impossible to tell whether an animal is real or fake. This mirrors the book's whole premise - a real human is deemed more important and valuable than an android despite increasingly small differences between androids and humans. Rick realizes this at the end of the book: “The spider Mercer gave the chickenhead, Isidore; it probably was artificial, too. But it doesn't matter. The electric things have their lives, too. Paltry as those lives are." Technology and nature trade off in today’s world as well. Cloud computing is certainly energy-intensive, but according to the companies that run those clouds (like Google Cloud or Microsoft Azure), it is significantly less intensive than having companies run their own data centers. Beyond the environmental impact, the behavior of nature is something to consider when operating a data center. A few years ago, Facebook data centers went down when a snake chewed through a switchboard and took down all services. In 2014, a shark bit through an underwater Google fiber cable, and in 2012 a squirrel took down a Yahoo data center. Animals, technology, and nature are constantly interacting, sometimes in unexpected ways.

Business Themes

  1. Status seeking and the growth of e-commerce. In the battle to achieve status, real animals are a highly sought-after status symbol. Early in the book, Rick engages in a jealous conversation about his neighbor’s real horse: “‘Ever thought of selling your horse?’ Rick asked. He wished to god he had a horse, in fact any animal.” After Rick reveals that his sheep is electric, his neighbor kindly remarks that he won’t tell the other people in the apartment complex, suggesting that if people knew Rick had an electric sheep (rather than a real one), they would look down on him. While this interaction seems strange, it parallels so many interactions people have today. Vance Packard offered a description of “status seekers” in 1959: “People who are continually straining to surround themselves with visible evidence of the superior rank they are claiming.” As general consumption and wealth rose after World War II in the US, luxury goods became attainable for more classes, and the globalization of supply chains accelerated the trend. When commerce moved online, new shopping styles and behaviors emerged. E-commerce purchases can frequently replace feelings, and there is even a psychological disorder caused by excessive purchasing: “Buying-shopping disorder (BSD) is characterized by extreme preoccupations with and craving for buying/shopping and by irresistible and identity-seeking urges to possess consumer goods. Patients with BSD buy more consumer goods than they can afford, and those are neither needed nor frequently used. The excessive purchasing is primarily used to regulate emotions, e.g. to get pleasure, relief from negative feelings, or coping with self-discrepancy.” Dick may be signaling that humans seek status and importance relative to their reference groups, regardless of setting or what indicates that status to others, whether it be an expensive handbag or a goat.

  2. Buy goat now, pay later. 2020 saw the emergence of buy-now, pay-later (BNPL) vendors like Affirm, Klarna, and Afterpay. These companies typically offer zero-interest loans to consumers and get paid a ~5% merchant fee for increasing purchases at e-commerce stores. The stores (like Peloton, for example) increase sales, and consumers benefit from not having to make a significant upfront payment. The other way these companies make money is by charging interest on specific types of purchases (likely where the merchant doesn’t want to give away a fee). These interest rates can be very high, averaging around 10-30% depending on the purchase. This is not a new concept - payday loans at predatory interest rates have been around for over 30 years. Luckily, the purchases these BNPL providers finance tend not to be high-value products, but it is still concerning that some people are buying things without understanding the true amount they will have to pay in interest. When Rick purchases a real goat after killing three androids, he finances it, paying $3,000 upfront and entering into a three-year payment contract. Rick’s wife Iran is outraged at the cost of the goat: “‘What are the monthly payments on the goat?’ She held out her hand; reflexively he got out the contract which he had signed, passed it to her. ‘That much,’ she said in a thin voice. ‘The interest; good god — the interest alone. And you did this because you were depressed. Not as a surprise for me, as you originally said.’” With BNPL providers now securitizing these consumer loans and selling them off to banks, I wonder if we will see new regulation come to bear for the benefit of consumers. If people are not careful, they could be locked into long contracts with significant interest over time.
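The book never discloses the goat's financed balance or interest rate, but the amortizing-loan math behind Rick's contract (and behind interest-bearing BNPL plans) is easy to sketch. The $2,000 balance and 25% APR below are purely hypothetical, picked from the 10-30% range mentioned above:

```python
def monthly_payment(principal, annual_rate, months):
    """Standard amortizing-loan payment: P * r / (1 - (1 + r)^-n), with monthly rate r."""
    r = annual_rate / 12
    return principal * r / (1 - (1 + r) ** -months)

balance = 2_000.0   # hypothetical financed balance after the $3,000 down payment
apr = 0.25          # hypothetical 25% APR
pay = monthly_payment(balance, apr, 36)
total_interest = pay * 36 - balance
print(f"Monthly payment: ${pay:.2f}")                  # ~$79.52
print(f"Total interest over 3 years: ${total_interest:.2f}")  # ~$862
```

At these assumed terms, interest alone adds over 40% to the financed balance - Iran's outrage, quantified.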

  3. Two case studies in electric animals. Electric animals have actually been invented, and while they may not be the equivalent of Goddard from Jimmy Neutron yet, they make for funny and interesting case studies. Sony released the AIBO dog in 1999 after many years of research. The original robot dog cost $2,100 (~$3,500 in today’s dollars) and sold about 65,000 units. The programmable software allowed the dogs to be used in a variety of situations, including an AI soccer world cup. The initial popularity of the dogs waned, and price wars with new rivals caused sales to decline; in 2006, the AIBO dog was discontinued. In 2018, it made a resurgence as a flexible, barking model that you can pet, play games with, and feed. Another tale of odd mechanical animals is Boston Dynamics. The company, which spun out of MIT in 1992, produced massive quadrupeds, including one called BigDog that was capable of balancing, walking uphill, and carrying significant amounts of equipment. The company had trouble selling products, though, and was acquired by Google in 2013 for an undisclosed sum. This came at a time when Google was pushing heavily into robotics alongside Google Glass and what would become Waymo - it literally titled this effort Project Replicant (the name used for androids in the Blade Runner film). After some more years of underperformance, Google sold Boston Dynamics to Softbank in 2017. After years of development, the company finally released a product to consumers for a whopping $75,000. The dog is still pretty creepy and comes without a real face, unlike the AIBO. In 2020, it was announced that Hyundai had acquired an 80% stake in the business at a $1.1B valuation. We are still years away from having electric animals that mimic real-life animals - and that may be a good thing.

Dig Deeper

  • Blade Runner: How Its Problems Made It a Better Movie

  • Does Buy Now, Pay Later Threaten Credit Card Issuers?

  • Predicting a Future Where the Future Is Routinely Predicted

  • An Overview of the latest Affirm Consumer Loan Securitization

  • Snakes in a Facebook Data Center

tags: Alan Turing, Ghost in the Shell, Blade Runner, Philip K. Dick, Sony, AI, AGI, Google, Microsoft, Yahoo, BNPL, Affirm, Klarna, Afterpay, e-Commerce, Securitization, Jimmy Neutron, AIBO, Boston Dynamics, Softbank, Hyundai, Facebook, Waymo, Rick Deckard, Detroit: Become Human, Los Angeles, San Francisco
categories: Fiction
 

November 2020 - Tape Sucks: Inside Data Domain, A Silicon Valley Growth Story by Frank Slootman

This month we read a short, under-discussed book by current Snowflake and former ServiceNow and Data Domain CEO Frank Slootman. The book is just like Frank - direct and unafraid. Frank has succeeded several times in the startup world, and the story of Data Domain provides a great case study in entrepreneurship. Data Domain was a data deduplication company, offering roughly a 20:1 reduction in backup data by using disk-based deduplication in place of tape cassettes.
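Deduplication works by fingerprinting chunks of the backup stream and storing each unique chunk only once; since nightly backups are highly repetitive, the savings are enormous. A toy sketch of the idea (fixed-size chunks and SHA-256 fingerprints are my simplifications here - Data Domain's actual system used variable-length, content-defined chunking):

```python
import hashlib

def dedupe(data: bytes, chunk_size: int = 4096):
    """Split data into fixed-size chunks and keep one copy of each unique chunk."""
    store = {}    # fingerprint -> unique chunk bytes
    recipe = []   # ordered fingerprints needed to reconstruct the stream
    for i in range(0, len(data), chunk_size):
        chunk = data[i:i + chunk_size]
        fp = hashlib.sha256(chunk).hexdigest()
        store.setdefault(fp, chunk)
        recipe.append(fp)
    return store, recipe

# A backup stream of 20 identical chunks deduplicates to a single stored chunk.
backup = b"x" * 4096 * 20
store, recipe = dedupe(backup)
ratio = len(backup) / sum(len(c) for c in store.values())
print(f"Dedup ratio: {ratio:.0f}:1")  # prints "Dedup ratio: 20:1"
```

Restoring a backup is just replaying the recipe against the chunk store, which is why the approach can replace tape entirely.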

Tech Themes

Data Domain’s 2008 10-K prior to being acquired


  1. First time CEO at a Company with No Revenue. Frank is an immigrant to the US, coming from the Netherlands shortly after graduating from the University of Rotterdam. After being rejected by IBM 10+ times, he joined Burroughs Corporation, an early mainframe provider that subsequently merged with its direct competitor Sperry for $4.8B in 1986. Frank then spent some time at Compuware and moved back to the Netherlands to help it integrate the acquisition of Uniface, an early customizable report-building software. After spending time there, he went to Borland Software in 1997, working his way up the product management ranks while growing frustrated by time spent lobbying internally rather than building. Frank joined Data Domain in the spring of 2003 - when it had no customers, no revenue, and was burning cash. The initial team and VCs were impressive - Kai Li, a computer science professor on sabbatical from Princeton, Ben Zhu, an EIR at USVP, and Brian Biles, a product leader with experience at VA Linux and Sun Microsystems. The company was financed by top-tier VCs New Enterprise Associates and Greylock Partners, with Aneel Bhusri (founder and current CEO of Workday) serving as initial CEO and then board chairman. This was a stacked team and Slootman knew it: “I’d bring down the average IQ of the company by joining, which felt right to me.” The Company had been around for 18 months and had already burned through a significant amount of money when Frank joined. He knew he needed to raise money relatively soon after joining, and he put the Company’s chances bluntly: “Would this idea really come together and captivate customers? Nobody knew. We, the people on the ground floor, were perhaps, the most surprised by the extraordinary success we enjoyed.”

  2. Playing to his Strengths: Capital Efficiency. One of the big takeaways from The Innovators by Walter Isaacson was that individuals or teams at the nexus of disciplines - primarily where the sciences meet the humanities - often achieved breakthrough success. The classic case study for this is Apple - Steve Jobs had an intense love of art, music, and design, and Steve Wozniak was an amazing technologist. Frank has cultivated a cross-discipline strength at the intersection of sales and technology. This might be driven by Slootman’s background in economics; the book has several references to economic terms, which clearly have had an impact on Frank’s thinking. Data Domain espoused capital efficiency: “We traveled alone, made few many-legged sales calls, and booked cheap flights and hotels: everybody tried to save a dime for the company.” The results showed - the business went from $800K of revenue in 2004 to $275 million by 2008, generating $75M in cash flow from operations. Frank’s capital efficiency was interesting and broke from traditional thinking - most people think to raise a round and build something. Frank took a different approach: “When you are not yet generating revenue, conservation of resource is the dominant theme.” Over time, “when your sales activity is solidly paying for itself,” the spending should shift from conservative to aggressive (as Snowflake is doing now). The concept of sales efficiency is somewhat talked about but, given the recent fundraising environment, is often dismissed. Sales efficiency can be thought of as: “How much revenue do I generate for every $1 spent in sales and marketing?” Looking at the P&L below, we see Data Domain was highly efficient in its sales and marketing activity - the company increased revenue by $150M in 2008 despite spending $115M in sales and marketing (a ratio of 1.3x). Contrast this with a company like Slack, which spent $403M to acquire $230M of new revenue (a ratio of 0.6x).
It gets harder to acquire customers at scale, so this efficiency tends to come down over time, but best-in-class companies stay above 1x. Frank clearly understands when to step on the gas with investing, as both ServiceNow and Snowflake have remained fairly efficient (from a sales perspective, at least) while growing to significant scale.
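The sales-efficiency math above can be sketched in a few lines. The figures are the ones quoted in this section (net-new revenue and sales and marketing spend for Data Domain in 2008 and for Slack):

```python
# Sales efficiency: net-new revenue generated per $1 of sales & marketing spend.
# Figures ($M) are the ones quoted above; above 1.0x is best-in-class.
def sales_efficiency(new_revenue, sandm_spend):
    """Revenue added per dollar of S&M spend."""
    return new_revenue / sandm_spend

data_domain = sales_efficiency(new_revenue=150, sandm_spend=115)
slack = sales_efficiency(new_revenue=230, sandm_spend=403)

print(f"Data Domain: {data_domain:.1f}x")  # ~1.3x
print(f"Slack:       {slack:.1f}x")        # ~0.6x
```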

  3. Technology for Technology’s Sake. “Many technologies are conceived without a clear, precise notion of the intended use.” Slootman hits on a key point and one that the tech industry has struggled to grasp throughout its history. So many products and companies are established around budding technology with no use case. We’ve discussed Magic Leap’s fundraising money-pit (it still might find its way), and Iridium Communications, the massive satellite telephone that required people to carry a suitcase around to use it. Gartner, the leading IT research publication (which is heavily influenced by marketing spend from companies), established the Technology Hype Cycle, complete with the “Peak of Inflated Expectations” and the “Trough of Disillusionment,” for categorizing technologies that fail to live up to their promise. There have been several waves that have come and gone: AR/VR, Blockchain, and most recently, Serverless. It’s not so much that these technologies were wrong or not useful; it’s rather that they were initially described as a panacea for several or all known technology hindrances, and few technologies ever live up to that hype. It’s common that new innovations spur tons of development but also lots of failure, and this is Slootman’s caution to entrepreneurs. Data Domain was attacking a problem that existed already (tape storage), and the company provided what Clayton Christensen would call a sustaining innovation (something that Slootman points out). Whenever things go into a “winter state”, like the internet after the dot-com bubble, or the recent Crypto Winter which is thawing as I write, it is time to pay attention and understand the relevance of the innovation.

Business Themes

  1. Importance of Owning Sales. Slootman spends a considerable amount of this small book discussing sales tactics and decision making, particularly with respect to direct sales and OEM relationships. OEM deals are partnerships with other companies whereby one company will re-sell the software, hardware, or service of another company. Crowdstrike is a popular product with many OEM relationships. The Company drives a significant amount of its sales through its partner model, with partners re-selling on Crowdstrike’s behalf. OEM partnerships with big companies present many challenges: “First of all, you get divorced from your customer because the OEM is now between you and them, making customer intimacy challenging. Plus, as the OEM becomes a large part of your business, for all intents and purposes they basically own you without paying for the privilege…Never forget that nobody wants to sell your product more than you do.” The challenges don’t end there. Slootman points out that EMC discarded its previous OEM vendor in the data deduplication space right after acquiring Data Domain. On top of that, the typical reseller relationship happens at a 10-20% margin, degrading gross margins and hurting the ability to invest. It is somewhat similar to the challenges open-source companies like MongoDB and Elastic have run into with their core software being…free. Amazon can just OEM their offering and cut them out as a partner, something it does frequently. Partner models can be sustainable, but the give and take from the big company is a tough balance to strike. Investors like organic adoption, especially recently with the rise of freemium SaaS models percolating in startups. Slootman’s point is that, at some point, enterprise-focused businesses must own direct sales (and relationships) with their customers to drive real efficiency.
Once low-cost-to-acquire freemium adopters buy the product, the executive team must pivot to traditional top-down enterprise sales to build a successful and enduring relationship with the customer.
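To see why a reseller cut hurts, here is a back-of-the-envelope sketch. The 20% cut is the top of the 10-20% range mentioned above; the $100 list price and $15 cost of delivery are hypothetical numbers chosen purely for illustration:

```python
# Hypothetical unit economics: selling direct vs. through an OEM/reseller.
# The 20% reseller cut matches the 10-20% range described above; the
# $100 list price and $15 cost of revenue are assumptions for illustration.
list_price = 100.0
cost_of_revenue = 15.0
reseller_cut = 0.20

direct_margin = (list_price - cost_of_revenue) / list_price
oem_net = list_price * (1 - reseller_cut)           # vendor keeps only $80
oem_margin = (oem_net - cost_of_revenue) / oem_net

print(f"Direct gross margin: {direct_margin:.0%}")  # 85%
print(f"OEM gross margin:    {oem_margin:.0%}")     # ~81%
```

Every sale through the partner gives up margin the vendor could otherwise reinvest, on top of the lost customer relationship Slootman emphasizes.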

  2. In the Thick of Things. Slootman has some very concise advice for CEOs: be a fighter, show some humanity, and check your ego at the door. “Running a startup reduces you to your most elementary instincts, and survival is on your mind most of the time…The CEO is the ‘Chief Combatant,’ warrior number one.” Slootman views the role of CEO as a fighter, ready to be the first to jump into the action at all times. And this can be incredibly productive for the business as well. Tony Xu, the founder and CEO of Doordash, takes time out every month to do deliveries for his own company in order to remain close to the customer and the problems of the company. Jeff Bezos famously still reads and responds to customer emails at jeff@amazon.com. Being CEO also requires a willingness to put yourself out there and show your true personality. As Slootman puts it: “People can instantly finger a phony. Let them know who you really are, warts and all.” As CEO you are tasked with managing so many people and being involved in so many aspects of the business that it is easy to become rigid and unemotional in everyday interactions. Harvard Business School professor Frances Frei, a former leader at Uber, distills it down to a simple phrase: “Begin With Trust.” All CEOs have some amount of ego, driving them to want to be at the top of their organization. Slootman encourages CEOs to be introspective and to try to recognize blind spots, so ego doesn’t drive day-to-day interactions with employees. One way to do that is simple: use the pronoun “we” when discussing the company you are leading. Though Slootman doesn’t explicitly call it out, all of these suggestions (fighting, showing empathy, getting rid of ego) are meant to build trust with employees.

  3. R-E-C-I-P-E for a Great Culture. The last fifth of the book is all focused on building culture at companies. It is the only topic Slootman stays on for more than a few chapters, so you know it’s important! RECIPE was an acronym created by the employees at Data Domain to describe the company’s values: Respect, Excellence, Customer, Integrity, Performance, Execution. It’s interesting how simple and focused these values are. Technology has pushed its cultural delusions of grandeur to an extreme in recent years. The WeWork S-1 hilariously started with: “We are a community company committed to maximum global impact. Our mission is to elevate the world’s consciousness.” But none of Data Domain’s values were about changing the world to be a better place - they were about doing excellent, honest work for customers. Slootman is laser-focused on culture and specifically views culture as an asset, calling it: “The only enduring, sustainable form of differentiation. These days, we don’t have a monopoly for very long on talent, technology, capital, or any other asset; the one thing that is unique to us is how we choose to come together as a group of people, day in and day out. How many organizations are there that make more than a halfhearted attempt at this?” Technology companies have taken different routes in establishing culture: Google and Facebook have tried to create culture by showering employees with unbelievable benefits, Netflix has focused on pure execution and transparency, and Microsoft has revamped its culture by adopting a Growth Mindset (has it really though?). Google originally promoted “Don’t be evil” as part of its Code of Conduct but dropped the motto in 2018.
Employees want to work for mission-driven organizations, but not all companies are really changing the world with their products, and Frank did not try to sugarcoat Data Domain’s data deduplication technology as a way to “elevate the world’s consciousness.” He created a culture driven by performance and execution - providing a useful product to businesses that needed it. The culture was so revered that post-acquisition, EMC instituted Data Domain’s performance management system. Data Domain employees were looked at strangely by longtime EMC executives, who had spent years in a big and stale company. Culture is a hard thing to replicate and a hard thing to change, as we saw with the Innovator’s Dilemma. Might as well use it to help the company succeed!

Dig Deeper

  • How Data Domain Evolved in the Cloud World

  • Former Data Domain CEO Frank Slootman Gets His Old Band Back Together at ServiceNow

  • The Contentious Take-over Battle for Data Domain: Netapp vs. EMC

  • 2009 Interview with Frank Slootman After the Acquisition of Data Domain

tags: Snowflake, DoorDash, ServiceNow, WeWork, Data Domain, EMC, Netapp, Frank Slootman, Borland, IBM, Burroughs, Sperry, NEA, Greylock, Workday, Aneel Bhusri, Sun Microsystems, USVP, Uber, Netflix, Facebook, Google, Microsoft, Amazon, Jeff Bezos, Tony Xu, MongoDB, Elastic, Crowdstrike, Crypto, Gartner, Hype Cycle, Slack, Apple, Steve Jobs, Steve Wozniak, Magic Leap, batch2
categories: Non-Fiction
 

October 2020 - Working in Public: The Making and Maintenance of Open Source Software by Nadia Eghbal

This month we covered Nadia Eghbal’s instant classic about open-source software. Open-source software has been around since the late seventies, but only recently has it gained significant public and business attention.

Tech Themes

The four types of open source communities described in Working in Public


  1. Misunderstood Communities. Open source is frequently viewed as an overwhelmingly positive force for good - taking software and making it free for everyone to use. Many think of open source as community-driven, where everyone participates and contributes to making the software better. The theory is that many eyeballs and contributors improve the software’s security, improve its reliability, and increase its distribution. In reality, open-source communities take the shape of the “90-9-1” rule and act more like social media than you might think. According to Wikipedia, the "90–9–1” rule states that for websites where users can both create and edit content, 1% of people create content, 9% edit or modify that content, and 90% view the content without contributing. To show how this applies to open source communities, Eghbal cites a study by North Carolina State researchers: “One study found that in more than 85% of open source projects the research examined on Github, less than 5% of developers were responsible for 95% of code and social interactions.” These creators, contributors, and maintainers are developer influencers: “Each of these developers commands a large audience of people who follow them personally; they have the attention of thousands of developers.” Unlike Instagram and Twitch influencers, who often actively try to build their audiences, open-source developer influencers sometimes find the attention off-putting - they simply published something to help others and suddenly found themselves with actual influence. The challenging truth of open source is that core contributors and maintainers give significant amounts of their time and attention to their communities - often spending hours at a time responding to pull requests (requests for changes / new features) on GitHub. Evan Czaplicki’s insightful talk entitled “The Hard Parts of Open Source” speaks to this challenging dynamic.
Evan created the open-source project Elm, a functional programming language that compiles to JavaScript, because he wanted to make functional programming more accessible to developers. As one of its core maintainers, he has repeatedly been hit with requests of “Why don’t you just…” from non-contributing developers angrily asking why a feature wasn’t included in the latest release. As fastlane creator Felix Krause put it, “The bigger your project becomes, the harder it is to keep the innovation you had in the beginning of your project. Suddenly you have to consider hundreds of different use cases…Once you pass a few thousand active users, you’ll notice that helping your users takes more time than actually working on your project. People submit all kinds of issues, most of them aren’t actually issues, but feature requests or questions.” When you use open-source software, remember who is contributing and maintaining it - and the days and years poured into the project for the sole goal of increasing its utility for the masses.

  2. Git it? Git was created by Linus Torvalds in 2005. We talked about Torvalds last month; he also created the most famous open-source operating system, Linux. Git was born in response to a skirmish with Larry McVoy, head of the company behind the proprietary tool BitKeeper, over the potential misuse of his product. Torvalds went on vacation for a week and hammered out the most dominant version control system today - git. Version control systems allow developers to work simultaneously on projects, committing changes to a shared branch of code. They also allow any changes to be rolled back to earlier versions, which can be enormously helpful if a bug is found in the main branch. Git ushered in a new wave of version control, but the open-source version was somewhat difficult to use for the untrained developer. Enter GitHub and GitLab - two companies built around the idea of making the git version control system easier for developers to use. GitHub came first, in 2007, offering a platform to host and share projects. The GitHub platform was free, but not open source - developers couldn’t build onto the hosting platform, only use it. GitLab started in 2014 to offer an alternative, fully open-source platform that allowed individuals to self-host a GitHub-like platform, providing improved security and control. Because of GitHub’s first-mover advantage, however, it has become the dominant platform upon which developers build: “Github is still by far the dominant market player: while it’s hard to find public numbers on GitLab’s adoption, its website claims more than 100,000 organizations use its product, whereas GitHub claims more than 2.9 million organizations.” Developers find GitHub incredibly easy to use, creating an enormous wave of open source projects and code-sharing. The company added 10 million new users in 2019 alone - bringing the total to over 40 million worldwide. This growth prompted Microsoft to buy GitHub in 2018 for $7.5B.
We are in the early stages of this development explosion, and it will be interesting to see how increased code accessibility changes the world over the next ten years.
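The core ideas above (commits as snapshots of the code, rolling back to any earlier version when a bug slips in) can be illustrated with a toy model. To be clear, this is a deliberately simplified conceptual sketch and not how git actually stores data - git uses a content-addressed object database and a full history graph:

```python
# Toy illustration of version-control ideas: each commit snapshots the
# content, and any earlier snapshot can be checked out (rolled back to).
# This is NOT how git works internally; it is a conceptual sketch only.
class TinyVCS:
    def __init__(self):
        self._commits = []                    # list of (message, content)

    def commit(self, message, content):
        self._commits.append((message, content))
        return len(self._commits) - 1         # commit id

    def checkout(self, commit_id):
        return self._commits[commit_id][1]    # recover any earlier version

repo = TinyVCS()
good = repo.commit("initial release", "stable code")
bad = repo.commit("risky change", "buggy code")
# A bug is found in the latest commit: roll back to the earlier snapshot.
print(repo.checkout(good))  # prints "stable code"
```

Real version control adds the pieces that make team development possible: branches, merges of simultaneous work, and diffs between snapshots.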

  3. Developing and Maintaining an Ecosystem Forever. Open source communities are unique and complex - with different user and contributor dynamics. Eghbal segments the different types of open source communities into four buckets - federations, clubs, stadiums, and toys - characterized in the two-by-two matrix above, based on contributor growth and user growth. Federations are the pinnacle of open source software development - many contributors and many users, creating a vibrant ecosystem of innovative development. Clubs represent more niche and focused communities, including vertical-specific tools like the astronomy package Astropy. Stadiums are highly centralized but large communities - typically only a few contributors but a significant user base. It is up to these core contributors to lead the ecosystem, as opposed to decentralized federations, which have so many contributors they can go in all directions. Lastly, there are toys, which have low user growth and low contributor growth but may actually be very useful projects. Interestingly, projects can shift in and out of these community types as they become more or less relevant. For example, developers from Yahoo open-sourced their Hadoop project based on Google’s File System and MapReduce papers. The initial project slowly became huge, moving from a stadium to a federation, and formed subprojects around it, like Apache Spark. What’s interesting is that projects mature and change, and code can remain in production for a number of years after the project’s day in the spotlight is gone. According to Eghbal, “Some of the oldest code ever written is still running in production today. Fortran, which was first developed in 1957 at IBM, is still widely used in aerospace, weather forecasting, and other computational industries.” These ecosystems can exist forever, but their costs (creation, distribution, and maintenance) are often hidden, especially the maintenance aspect.
The cost of creation and distribution has dropped significantly in the past ten years - with many of the world’s developers all working in the same ecosystem on GitHub - but that shift has also increased the total cost of maintenance, and that maintenance cost can be significant. Bootstrap co-creator Jacob Thornton likens maintenance costs to caring for an old dog: “I’ve created endlessly more and more projects that have now turned [from puppies] into dogs. Almost every project I release will get 2,000, 3,000 watchers, which is enough to have this guilt, which is essentially like ‘I need to maintain this, I need to take care of this dog.’” Communities change from toys to clubs to stadiums to federations, but they may also change back as new tools are developed. Old projects still need to be maintained, and that code and maintenance come down to committed developers.

Business Themes

1_c7udbm7fJtdkZEE6tl1mWQ.png
  1. Revenue Model Matching. One of the earliest code-hosting platforms was SourceForge, a company founded in 1999. The Company pioneered the idea of code-hosting - letting developers publish their code for easy download. It became famous for letting open-source developers use the platform free of charge. SourceForge was created by VA Software, an internet bubble darling that saw its stock price decimated when the bubble finally burst. The challenge with scaling SourceForge was a revenue model mismatch - VA Software made money with paid advertising, which allowed it to offer its tools to developers for free but meant its revenue was highly variable. When the company went public, it was still a small and unproven business, posting $17M in revenue and $31M in costs. The revenue model mismatch is starting to rear its head again, with traditional software as a service (SaaS) recurring subscription models catching some heat. Many cloud service and API companies are pricing by usage rather than a fixed, high-margin subscription fee. This is the classic electric utility model - you only pay for what you use. Snowflake CEO Frank Slootman (who formerly ran SaaS pioneer ServiceNow) commented: “I also did not like SaaS that much as a business model, felt it not equitable for customers.” Snowflake instead charges based on credits which pay for usage. The issue with usage-based billing has traditionally been price transparency, which can be obfuscated by customer credit systems and incalculable pricing, as with Amazon Web Services. This revenue model mismatch was just one problem for SourceForge. As git became the dominant version control system, SourceForge was reluctant to support it - opting for its traditional tools instead. Pricing norms change and new technology comes out every day; it’s imperative that businesses have a strong grasp of the value they provide to their customers and align their revenue model with it, so a fair trade-off is created.
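The two pricing models discussed above can be contrasted with a small sketch. All of the prices here (the monthly fee, the per-credit rate, the usage levels) are invented for illustration and are not Snowflake's actual rates:

```python
# Sketch of subscription vs. usage-based pricing with invented numbers.
# Neither the fee nor the credit price reflects any real vendor's rates.
def subscription_cost(months, monthly_fee=1_000.0):
    return months * monthly_fee               # fixed, regardless of usage

def usage_cost(credits_used, price_per_credit=2.5):
    return credits_used * price_per_credit    # scales with consumption

# A light user over a year: usage-based billing is far cheaper...
print(usage_cost(1_200), "vs", subscription_cost(12))   # 3000.0 vs 12000.0
# ...but a heavy user pays more, and the bill is less predictable.
print(usage_cost(8_000))                                # 20000.0
```

This is the transparency trade-off the paragraph describes: usage pricing is more equitable, but customers can no longer compute their bill in advance without modeling their own consumption.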

  2. Open Core Model. There has been enormous growth in open source businesses in the past few years, which typically operate on an open core model. The open core model means the Company offers a free, normally feature-limited, version of its software alongside a proprietary, enterprise version with additional features. Developers might adopt the free version but hit usage limits or feature constraints, causing them to purchase the paid version. The open-source “core” is often just that - freely available for anyone to download and modify; the core’s actual source code is normally published on GitHub, and developers can fork the project or do whatever they wish with that open core. The commercial product is normally closed source and not available for modification, providing the business a product to sell. Joseph Jacks, who runs Open Source Software (OSS) Capital, an investment firm focused on open source, outlines four types of open core business models (pictured above). The business models differ based on how much of the software is open source. GitHub, interestingly, employs the “thick” model of being mostly proprietary, with only 10% of its software truly open-sourced. It’s funny that the site that hosts and facilitates the most open source development is proprietary. Jacks nails the most important question in the open core model: “How much stays open vs. How much stays closed?” The consequences can be dire to a business - open source too much and all of a sudden other companies can quickly recreate your tool. Many DevOps tools have experienced the perils of open source, with some companies losing control of the projects they were supposed to facilitate. On the flip side, keeping more of the software closed source goes against the open-source ethos, which can be viewed as organizations selling out.
The continuous delivery pipeline project Jenkins has struggled to satiate its growing user base, leading the CEO of the Jenkins company, CloudBees, to publish a blog post entitled “Shifting Gears”: “But at the same time, the incremental, autonomous nature of our community made us demonstrably unable to solve certain kinds of problems. And after 10+ years, these unsolved problems are getting more pronounced, and they are taking a toll — segments of users correctly feel that the community doesn’t get them, because we have shown an inability to address some of their greatest difficulties in using Jenkins. And I know some of those problems, such as service instability, matter to all of us.” Striking this balance is incredibly tough, especially in a world of competing projects and finite development time and money in a commercial setting. Furthermore, large companies like AWS are taking open core tools like Elastic and MongoDB and recreating them in proprietary fashion (Elasticsearch Service and DocumentDB), prompting those companies’ CEOs to appropriately lash out. Commercializing open source software is a never-ending battle against proprietary players and yourself.

  3. Compensation for Open Source. Eghbal characterizes two types of funders of open source - institutions (companies, governments, universities) and individuals (usually developers who are direct users). Companies like to fund improved code quality, influence, and access to core projects. The largest groups of contributors to open source projects are mainly corporations like Microsoft, Google, Red Hat, IBM, and Intel. These corporations are big enough and profitable enough to hire individuals and allow them to strike a comfortable balance between time spent on commercial software and time spent on open source. This also functions as a marketing expense for the big corporations; big companies like having influencer developers on the payroll to get the company’s name out into the ecosystem. Evan You, who authored Vue.js, a JavaScript framework, described company-backed open-source projects: “The thing about company-backed open-source projects is that in a lot of cases… they want to make it sort of an open standard for a certain industry, or sometimes they simply open-source it to serve as some sort of publicity improvement to help with recruiting… If this project no longer serves that purpose, then most companies will probably just cut it, or (in other terms) just give it to the community and let the community drive it.” In contrast to company-funded projects, developer-funded projects are often donation-based. With the rise of online tools for encouraging payments like Stripe and Patreon, more and more funding is being directed to individual open source developers. Unfortunately, though, it is still hard for many open source developers to sustain their work on individual contributions, especially if they work on multiple projects at the same time.
Open source developer Sindre Sorhus explains: “It’s a lot harder to attract company sponsors when you maintain a lot of projects of varying sizes instead of just one large popular project like Babel, even if many of those projects are the backbone of the Node.js ecosystem.” Whether working in a company or as an individual developer, building and maintaining open source software takes significant time and effort and rarely leads to significant monetary compensation.

Dig Deeper

  • List of Commercial Open Source Software Businesses by OSS Capital

  • How to Build an Open Source Business by Peter Levine (General Partner at Andreessen Horowitz)

  • The Mind Behind Linux (a talk by Linus Torvalds)

  • What is open source - a blog post by Red Hat

  • Why Open Source is Hard by PHP Developer Jose Diaz Gonzalez

  • The Complicated Economy of Open Source

tags: Github, Gitlab, Google, Twitch, Instagram, Elm, Javascript, Open Source, Git, Linus Torvalds, Linux, Microsoft, MapReduce, IBM, Fortran, Node, Vue, SourceForge, VA Software, Snowflake, Frank Slootman, ServiceNow, SaaS, AWS, DevOps, CloudBees, Jenkins, Intel, Red Hat, batch2
categories: Non-Fiction
 

September 2020 - Women of Color in Tech by Susanne Tedrick

This month we dove into Susanne Tedrick’s new book, Women of Color in Tech. Tedrick provides an excellent overview of the challenges many women of color face when trying to enter into and stay in the technology industry. The mix of real-world advice, personal experience, and industry stories combine to form a comprehensive resource for anyone in technology or looking to enter the field.

Tech Themes

  1. The Current State. Tedrick starts the book with uncomfortable statistics. Only 26% of computing roles are held by women; Black women hold 3% and Hispanic women hold 2% of computing roles. In addition, the trends aren’t positive - 26% represents a 9-percentage-point decrease since 1990. According to the Ascend Foundation, a Pan-Asian organization for business professionals, from 2007 to 2015, Black women experienced a 13% decrease in professional roles in technology. While distressing, there are some green shoots: a 2012 paper by Heather Gonzalez and Jeffrey Kuenzi pointed out that science and engineering graduate program enrollments grew 65%, 55%, and 50% for Hispanic/Latino, American Indian/Alaska Native, and African American students, respectively. So why is this? Tedrick acknowledges that there is no single answer; instead, it’s a combination of circumstances starting in early adolescence. Tedrick introduces the idea of “STEM Deserts,” or areas where STEM education is not offered. These deserts disproportionately affect high-poverty schools (schools where 75% or more of the students are eligible for free lunch and breakfast). Almost half of these schools contain large Black and Hispanic populations. Once women of color arrive at college, it gets harder: “Coupling [student debt] with professor’s biases, a lack of meaningful support at home or within their community, and few to no peers with whom they can identify in their academic programs, many young women of color struggle to get through their programs.” For the few that conquer all of these challenges, the workplace introduces a whole new set of issues.
Tedrick cites the Kapor Center’s Tech Leavers Study: “Thirty percent of women of color respondents claimed that they were passed over for promotions and 24% report being stereotyped.” According to a Harvard Business Review article written by feminist legal scholar Joan Williams, “77% of black women report having to prove themselves over and over; their success discounted and their expertise questioned.” When you compile all of these challenges across a lifetime, it becomes clear how difficult the journey is for Black women in tech.

  2. Technical Roles and the Building Blocks of the Internet. Tedrick introduces many key organizational roles in technology, including business analysis, consulting, data science, information security, product management, project management, software development, technical sales, technical support, user experience design, and web design. After introducing each one, she provides a prescriptive guide for individuals looking to learn more, hitting on key skills, educational requirements, and the latest trends. While I can’t cover every role here, one underappreciated position / sub-segment of technology Tedrick discusses is computer networking. Ultimately, networking was the breakthrough that unlocked the internet for the masses. Protocols like TCP/IP, VoIP, and HTTP are crucial to a functioning internet; they offer ways for computers to communicate with one another in a consistent manner. IP (Internet Protocol) provides basic addressing for computers, and TCP provides ordered, reliable delivery of bytes from one computer to another, carried in units called packets (a packet is a pre-defined standard format for sending data). VoIP (Voice over IP) applies the same idea to telephony, encoding voice audio into packets carried over IP networks. HTTP is the way you request the data found at a location: http://techbookofthemonth.com tells the browser to fetch the website at that URL. A lot of basic networking functionality is baked into the operating system, which for the vast majority of servers today (and, via Android, many phones) is Linux. Linux is an open-source operating system that handles all of the things that make your computer run: memory, CPU, connected devices, graphics, the desktop environment, and the ability to run applications. However, Linux skills are still not commonly taught. Tedrick quotes Tameika Reed, a senior infrastructure engineer and founder of Women in Linux: “We have people who are getting degrees and PhDs and so on. . . . 
When it comes down to Linux, which runs in 90 percent of most companies, and it’s time to troubleshoot something, they don’t know how to troubleshoot the basics of the foundation. I look at Linux as the foundations of getting into tech.” Red Hat, which IBM acquired for $34 billion in 2019, offers an enterprise version of Linux that comes with support, guaranteed versioning, and additional security. Computer networking may not be a flashy industry, but it underpins so much of modern computing that it remains fascinating.
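To make the TCP/HTTP relationship above concrete, here is a minimal sketch in Python (mine, not from the book): a throwaway server on localhost stands in for a web server, and the client opens a TCP connection and sends a hand-written HTTP request. The point is that HTTP is just structured text carried over TCP’s reliable, ordered byte stream.

```python
import socket
import threading

def tiny_server(server_sock):
    # Accept one connection and return a minimal hard-coded HTTP response.
    conn, _ = server_sock.accept()
    request = conn.recv(1024)  # TCP delivers the request bytes reliably, in order
    if request.startswith(b"GET /"):
        conn.sendall(b"HTTP/1.1 200 OK\r\nContent-Length: 5\r\n\r\nhello")
    conn.close()

# Bind a throwaway server to localhost (port 0 lets the OS pick a free port),
# so the example needs no internet access.
server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
server.bind(("127.0.0.1", 0))
server.listen(1)
port = server.getsockname()[1]
threading.Thread(target=tiny_server, args=(server,), daemon=True).start()

# The "browser" side: connect over TCP (an IP address plus a port) and speak HTTP.
client = socket.create_connection(("127.0.0.1", port))
client.sendall(b"GET / HTTP/1.1\r\nHost: localhost\r\n\r\n")
chunks = []
while True:  # read until the server closes the connection
    data = client.recv(1024)
    if not data:
        break
    chunks.append(data)
client.close()

response = b"".join(chunks).decode()
print(response.split("\r\n")[0])  # the HTTP status line
```

A real browser does the same dance, just with DNS lookup, TLS, and far more headers layered on top.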

  3. Technology Skills. Chapter six lays out a great way to assess your own skills and understand where you need improvement. Building these skills can require additional schooling via college, trade schools, or massive open online courses (MOOCs) like Coursera, but other ways to complement this learning include hackathons, conferences, networking, and volunteering. Tedrick wanted to improve her own skills, so she volunteered to help set up a conference: “To improve my web design, WordPress, and conference organization skills, I volunteered my services for a leadership conference being held by IEEE Women in Engineering for four months in 2016. I helped to build and maintain the event website using WordPress, as well as helped people with registration and refunds. This experience greatly improved my understanding of web design, search engine optimization (SEO), event promotion, and collaborating with remote teams (I was based in Chicago, while much of the event team and registrants were based in and around Detroit, Michigan). In the process, I learned more about the different fields of engineering and broadened my network with incredible engineering students and professionals.” The book is incredibly helpful for skill-building: it gives you the exact things you need to learn to be successful in specific positions, and it even clears up some myths about the technology industry. One common myth is that “Tech Careers Require Constant, Hands-On Programming.” As evidenced by the myriad of roles listed above, the technology industry involves so much more than programming. In addition, tech careers exist beyond big-name companies like Microsoft, Google, Facebook, Amazon, and Netflix, and exist at non-tech companies too. One critical skill that Tedrick highlights for a number of different technical roles is communication. 
Communication is not often mentioned when discussing software engineering, but Tedrick picks up on its huge importance and the necessary ability to communicate with technical and non-technical audiences. On top of sharing with non-technical audiences, engineers need to know how to communicate accurate deadlines to managers and ask for help when unsure how to implement a challenging new feature. Communication is not just speaking; it’s also listening and empathetically understanding where others are coming from, to establish common ground and grow mutual understanding.

Business Themes

  1. Tedrick’s Story and Grit. Susanne’s personal stories appear throughout the book and perfectly complement the substantial amount of how-to information and advice. Chapter nine talks about the daily challenges of many women of color in tech and their lack of support to solve those challenges. Susanne’s own story is one of incredible determination and perseverance: “My mother had been diagnosed with a brain tumor when I was very young. This initial tumor led to more health issues for her over the years, including a decline into dementia, a loss of some of her short-term memory, and impacted mobility. The latter half of her life was spent in and out of hospitals, having numerous operations and medical incidents. My father was left to care for me and my sister, while also supporting several other family members in one house. Between work and caring for my mom, he couldn’t be around much, and fortunately, some nearby relatives and family friends helped to raise and care for us. As there was only one income (already too high to qualify for most public assistance programs) and my mother needed many medications, there were times where a choice had to be made between eating, having phone service, making critical house repairs, or having the lights stay on. This went on for nearly two decades, up until my mother’s death. It wasn’t until well into my adult life that I realized I was living in ‘survival mode’ and just trying to exist. I was spending most of my time trying to find happiness in my life; having a meaningful and engaging career was not an immediate goal or one I thought was achievable for me.” After working in administrative roles and taking on a couple of different jobs, she managed to attend Northwestern while continuing to work. “I used much of my vacation and holiday time from work not only to study but to attend conferences, interviews, boot camps, and the like. 
I did homework during lunch breaks or before the start of a full workday, only to go to class for several hours in the same evening.” Tedrick has risen to be an award-winning public speaker, author, and technologist at IBM (oh and she’s also run a couple of marathons). Her story is truly inspirational!

  2. Culture, Intersectionality, and Bias. We’ve discussed Clayton Christensen’s Resources-Processes-Values framework before and how it shapes the discovery of emerging technologies. Often the processes create a culture and set of habitual routines that can be difficult to change. The culture of big technology has been anti-women for a long time. As Tedrick points out, women of color not only have to deal with this challenge but also with repeated racial abuse, microaggressions, and tokenism. Kimberlé Crenshaw coined a term for this, intersectionality: the idea that a person’s social identities (e.g., gender, caste, sex, race, class, sexuality, religion, disability, physical appearance, height, etc.) combine to create unique modes of discrimination and privilege. Tedrick points out an example of this with Sheryl Sandberg’s famous book, Lean In. The book became a bestseller and made Sheryl Sandberg a household name (to those who didn’t already know her as COO of Facebook). However, as Tedrick points out: “The central problem with the book, which Sandberg herself later acknowledged, is that it assumed that the reader had certain privileges that many women of color do not have: completely supportive households that don’t require much of their time and attention, work cultures that allow expression of their thoughts without fear of being fired or held back, and access to career mentors to help them become stronger leaders. This lack of understanding of where the reader may be coming from and experiencing caused much of Sandberg’s advice to ring hollow for women of color.” The book ignores the structural challenges that many women of color face. Michelle Obama put it bluntly: “It’s not always enough to lean in, because that shit doesn’t work all the time.” When building culture at an organization, it’s super important to think about how that culture addresses each social identity at the company. 
Furthermore, it’s not the responsibility of diverse individuals to build that culture. Tedrick sums it up well: “Addressing tokenism, much like addressing bias, unfortunately, is not something that you alone can address. It is also not our responsibility to address this. It is up to organizations and their leaders to correct and address tokenism so that women of color are fully engaged.”

  3. Negotiating Compensation. Understanding pay and compensation is critical to evaluating any job offer. Frequently, job candidates are hesitant to ask for additional compensation because they fear retribution, like the offer being pulled and given to someone else, or worry about sounding greedy before even joining a new company. As Susanne found out after receiving her first traditional job offer, this can lead to lower salaries, especially when adjusting for location. In addition, Susanne points out the enormous gender pay gap at organizations: “It’s no secret that women—and specifically, women of color—are underpaid in about every industry, not just tech. While it is on companies to fix their approaches to compensation, it is our right and duty to demand fair compensation for our work.” A study of the technology industry by the job search marketplace Hired shows that Black women were paid $0.89 on the dollar compared to white men. This is the lowest across White, Asian, Black, and Hispanic men and women in the technology sector. For LGBTQI+ individuals, the wage gap is $0.90 per $1 of compensation earned by non-LGBTQI+ individuals. While pay gap detail for the Black LGBTQI+ community is under-studied, according to The National LGBTQ Task Force’s 2011 report, 48% of trans and gender non-conforming Black individuals experienced discrimination in the hiring process. Outside of the technology industry, the pay gap is even more stark, with Black women earning $0.62 for every dollar earned by a white man. To address many of these challenges, and ensure that candidates get as close to a fair offer as possible, Tedrick lays out a framework for considering a new job, from pay to benefits to location. She advises individuals to first research local salaries for the role they are taking on. Armed with data, candidates should be confident, respectful, and flexible in all discussions and emphasize the unique value they bring to the organization.

Dig Deeper

  • Work Smart & Start Smart: Salary Negotiation for Women of Color

  • Anita Borg and the history of one of the largest professional organizations for women in technology

  • How the world’s most prevalent operating system was built by a 21-year-old in Finland

  • Black Girls Code: Empowering Young Black Women to Become Innovators

  • Tedrick’s Twitter, website, and talk with the Women’s National Book Association

tags: TCP/IP, VoIP, HTTP, Computer Networking, Linux, Red Hat, IBM, Susanne Tedrick, Coursera, IEEE Women in Engineering, Grit, Culture, Diversity, Women in Tech, Intersectionality, Facebook, Sheryl Sandberg, Michelle Obama, Gender Pay Gap, batch2
categories: Non-Fiction
 

August 2020 - Venture Deals by Brad Feld and Jason Mendelson

This month we checked out an excellent book for founders, investors, and those interested in private company financings. The book hits on a lot of the key business and legal terms that aren’t discussed in typical startup books, making it useful no matter what stage of the entrepreneurial journey you are on.

Tech Themes

  1. The Rise of Founder-Friendly VC. Writing on his blog, Feld Thoughts, which was the original genesis for Venture Deals, Brad Feld noted: “From 2010 forward, the entire VC market shifted into a mode that many describe as ‘founder friendly.’ Investor reputation mattered at both the angel and VC level.” In the ’80s and ’90s, because there was so little competition among venture capital firms, it was common for firms to dictate terms to company founders. The VC firms were the ones with the cash, and the founders didn’t have many options to choose from. If you wanted to build a big, profitable, public company, the only way to get there was by taking venture capital money. This dynamic started to unwind during the internet bubble, when founders began to retain more and more ownership of their businesses before the IPO. In fact, as this Harvard Business Review article points out, it was once common to fire the founder/CEO prior to a public offering in favor of more seasoned leaders. That pattern was bucked by Netscape, which eschewed traditional wisdom, going public less than a year and a half after its founding with an unprofitable business. The Netscape IPO was a royal coming-together of technology history. Tracing it all the way back: George Winthrop Fairchild became the first chairman, in 1911, of the company that would later be renamed IBM; in the late ’50s, Arthur Rock convinced Fairchild’s son, Sherman, to fund the “traitorous eight” (eight employees who left Shockley Semiconductor) to start Fairchild Semiconductor; Eugene Kleiner (one of the traitorous eight) later started Kleiner Perkins, a venture capital firm that eventually invested in Netscape. Kleiner Perkins would also invest in Google (frequently regarded as one of the best and riskiest startup investments ever). Google was among the first internet companies to go public with a dual-class share structure, under which the founders own a disproportionate amount of the company’s voting rights. 
Marc Andreessen, the founder of Netscape, loved this idea and eventually launched his own venture capital firm, Andreessen Horowitz, which ushered in a new generation of founder-friendly investing. At one point Andreessen was even quoted as saying: “It is unsafe to go public today without a dual-class share structure.” Notable companies with dual-class shares include several companies associated with Andreessen, such as Facebook, Zynga, Box, and Lyft. Recently, some have questioned whether founder-friendly terms have been pushed too far, pointing to major flameouts at companies with the structure, including Theranos, WeWork, and Uber.

  2. How to Raise Money. Feld has several important recommendations for fundraising: have a target round size, a demo, financial projections, and a plan for the VC syndicate. Feld contends that CEOs who offer VCs a range of possible round sizes don’t really understand their business goals and use of proceeds. A concrete round size shows that the CEO knows roughly how much money it will take to get to the next milestone, or, said another way, the runway (in months) needed to build that new product or feature. It shows command of the financing and the vision of the business. Feld encourages founders to provide a demo because, “while never required, many investors respond to things we can play with, so even if you are an early stage company, a prototype or demo is desirable.” Beyond the explicit point here, the demo shows confidence in the product and at least some ability to sell, which is obviously a key aspect of eventually scaling the business. Another aspect of scaling the business is the financial model, but as Feld states, “the only thing that can be known about a pre-revenue company’s financial projections is that they are wrong.” While the numbers are meaningless for really early-stage companies, for those with a few customers the model can give a sense of long-term gross margins and the aspects of the company you hope to invest in and/or change over time. Lastly, Feld gives advice for building a VC syndicate, or group of VC investors. Frequently, a lead investor will commit a certain dollar amount of the round, and it will be up to the founder/CEO to find a way to fill out the rest. This can be incredibly challenging, as detailed by Moz founder Rand Fishkin, who thought he had a deal in hand only to see it taken away. 
There are multiple bids in the VC fundraising process: an indication of interest, which is non-binding and normally provides a range on valuation; a letter of intent, which is slightly more detailed and may include legal terms of the deal such as board representation, liquidation preference, and governance terms; and then final legal documentation. Often, the early bids can be withdrawn based on poor market feedback or when a company misses its financial projections (as Moz did in its process). Understanding the process and the materials needed to complete the deal helps founders set expectations.

  3. Warrants, SPACs, and IPOs. With SPAC mania in full swing, we wanted to dive into SPACs and see how they work. We’ve discussed SPACs before, with regards to Chamath Palihapitiya’s Social Capital merger with Virgin Galactic. But how do traditional SPAC financings work, and why is there a rush of famous people, such as LinkedIn founder Reid Hoffman, to raise them? A SPAC, or Special Purpose Acquisition Company, is a blank-check company that goes public with the goal of acquiring a business, thereby taking it public. SPACs can be focused on an industry or a size of company, and they are most frequently led by operational leaders and/or private equity firms. SPACs have been gaining popularity because public market investors are seeking more risk, and a few high-profile SPAC deals, namely DraftKings and Nikola, have traded better than expected. Most companies going public today are older, more mature businesses, and the public markets have been generally favorable to somewhat suspect ventures (Nikola is an electric truck company that has never produced a single truck but is worth $14B on hype alone). VC firms and companies see the ability to get outsized returns on their investments because so many people are clamoring for returns above the near-0% offered by Treasury bonds. The S&P 500 P/E ratio is now around 26x, compared to a historical average of around 16x, meaning the market appears overvalued relative to prior periods. SPACs typically come with an odd structure. A unit in a SPAC normally consists of one common share of stock and a warrant (or a fraction of one), which conveys the right to purchase an additional share at a fixed strike price (typically $11.50) after the SPAC merges with its target company. The founders of the SPAC also receive founder shares, normally 20% of the SPAC’s post-IPO equity, for a nominal purchase price. Once the target is found, SPACs will often coordinate a PIPE (Private Investment in Public Equity), where a large private investor invests mainly primary capital (cash to the balance sheet) into the business. 
This has emerged as a hip new alternative to traditional IPOs, in keeping with the theme of innovation in public offerings like direct listings. However, it’s unclear that this really benefits the company going public. Often the merged companies suffer substantial dilution from the SPAC sponsors and PIPE investors, lowering the overall equity stake management retains. Still, given the high valuations companies are receiving in the public markets (Zoom at 80x+ LTM revenue, Shopify at 59x LTM revenue), it may be worth the dilution.
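To see why sponsor and PIPE dilution matters, the arithmetic below works through a hypothetical SPAC in Python. Every number here is invented for illustration (not taken from the book or any real deal): a $200M trust, the customary 20% founder promote, a $100M PIPE at the standard $10 unit price, and an assumed share roll for the target’s owners.

```python
# Hypothetical SPAC capitalization (illustrative numbers only).
trust = 200_000_000                 # cash raised from public investors at the SPAC IPO
share_price = 10                    # customary $10.00 SPAC unit price
public_shares = trust / share_price          # 20M public shares

# Founder shares (the "promote") are typically 20% of post-IPO SPAC shares,
# which works out to 25% of the public share count.
founder_shares = public_shares * 0.25        # 5M shares, bought for a nominal sum

pipe = 100_000_000
pipe_shares = pipe / share_price             # 10M PIPE shares at $10

# Assume the target's owners roll their equity into 50M shares of the
# combined company.
target_shares = 50_000_000

total = public_shares + founder_shares + pipe_shares + target_shares
print(f"target owners keep {target_shares / total:.1%} of the combined company")
print(f"the sponsors' promote alone is {founder_shares / total:.1%}")
```

Under these assumptions the sponsors walk away with roughly 6% of the combined company for almost no invested capital, which is exactly the dilution the paragraph above describes.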

Business Themes

graph-6.jpg
screen-shot-2016-12-21-at-13-57-20_orig.png
screen-shot-2016-12-21-at-13-59-11_orig.png
  1. How VCs Make Money. In VC, the typical fund structure includes a general partner (GP) and limited partners (LPs). The GP comprises the investors at the VC firm, while the limited partners are the institutional investors who provide the money for the firm to invest. A typical structure involves the GP investing 1% of the fund’s capital (99% comes from LPs) and then getting paid a 2% annual management fee as well as 20% carried interest, a share of the profit made from investments. Using the example from the book: “Start with the $100 million fund. Assume that it's a successful fund and returns 3× the capital, or $300 million. In this case, the first $100 million goes back to the LPs, and the remaining profit, or $200 million, is split 80 percent to the LPs and 20 percent to the GPs. The VC firm gets $40 million in carried interest and the LPs get the remaining $160 million. And yes, in this case everyone is very happy.” Understanding how investors make money can help the entrepreneur better understand why VCs pressure companies. As Feld points out, sometimes VCs are trying to raise a new fund or have already invested the majority of the current fund, and thus do not care as much about some investments.
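The book’s carry example can be reproduced in a few lines of Python. This is a sketch of the standard split only; the function name is mine, and it deliberately ignores management fees and the GP’s own 1% commitment.

```python
def split_proceeds(fund_size, return_multiple, carry=0.20):
    """Split exit proceeds between LPs and the GP under a standard 20% carry.
    Capital is returned to the LPs first; only the profit is shared."""
    proceeds = fund_size * return_multiple
    profit = proceeds - fund_size
    gp_carry = profit * carry
    lp_total = fund_size + profit * (1 - carry)
    return gp_carry, lp_total

# The book's example: a $100M fund that returns 3x, i.e. $300M in proceeds.
gp_carry, lp_total = split_proceeds(100_000_000, 3)
print(gp_carry)   # the GP's $40M of carried interest
print(lp_total)   # the LPs' $100M of capital back plus $160M of profit
```

Note how the GP’s upside is entirely in the profit share: a fund that merely returns its capital pays no carry at all, which is one reason GPs push portfolio companies so hard for outsized outcomes.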

  2. Growth at All Costs. There has been a concerted focus in VC on the get-big-quick motto. Nobody exemplifies this better than Masayoshi Son and the $100B fund his firm SoftBank raised a few years ago. With notable big bets on current losers like WeWork and Oyo, which are struggling during this pandemic, it’s unclear whether the motto holds up. Eric Paley, a Managing Partner at Founder Collective, expertly quantifies the potential downsides of a risk-it-all strategy: “Investors today have overstuffed venture funds, and lots of capital is sloshing around the startup ecosystem. As a result, young startups with strong teams, compelling products and limited traction can find themselves with tens of millions of dollars, but without much real validation of their businesses. We see venture investors eagerly investing $20 million into a promising company, valuing it at $100 million, even if the startup only has a few million in net revenue. Now the investors and the founders have to make a decision — what should determine the speed at which this hypothetical company, let’s call it “Fuego,” invests its treasure chest of money in the amazing opportunity that motivated the investors? The investors’ goal over the next roughly 24 months is for the company to become worth at least three times the post-money valuation — so $300 million would be the new target pre-money valuation for Fuego’s next financing. Imagine being a company with only a few million in sales, with a success hurdle for your next round of $300 million pre-money. Whether the startup’s model is working or not, the mantra becomes ‘go big or go home.’” This issue is key when negotiating term sheets with investors and understanding board dynamics. As Feld calls out: “The voting control issues in the early stage deals are only amplified as you wrestle with how to keep control of your board when each lead investor per round wants a board seat. 
Either you can increase your board size to seven, nine, or more people (which usually effectively kills a well-functioning board), or more likely the board will be dominated by investors.” As an entrepreneur, you need to be cognizant of the pressure VC firms will put on founders to grow at high rates, and this pressure is frequently applied through the board. Late-stage startups often have 10+ people on their boards. UiPath, a private venture-backed startup that has raised over $1B and is valued at $10B, has 12 people on its board. With each firm pursuing its own goals, boards can become ineffective. Whenever startups consider fundraising, it’s important to realize that the person you raise from will be an ongoing member of the company and a voice on the board, and will most likely push for growth.
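The arithmetic behind Paley’s hypothetical “Fuego” is simple enough to sketch in Python. The investment and valuation figures come from his quote; the $3M revenue figure is my assumption for his “a few million in net revenue.”

```python
investment = 20_000_000
post_money = 100_000_000      # Paley's "valuing it at $100 million"
target_multiple = 3           # investors want roughly 3x within ~24 months

next_round_pre = post_money * target_multiple   # the $300M pre-money hurdle
revenue = 3_000_000           # assumed "few million in net revenue"

print(next_round_pre)              # 300000000
print(next_round_pre / revenue)    # the implied revenue multiple to clear the bar
```

At an assumed $3M of revenue, clearing the next-round hurdle requires being valued at roughly 100x revenue, which is why "go big or go home" becomes the only viable strategy.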

  3. Liquidation Preference. One of the least talked about terms in venture capital among startup circles is liquidation preference. Feld describes liquidation preference as: “a certain multiple of the original investment per share is returned to the investor before the common stock receives any consideration.” Startup culture has tended to view fundraises as stamps of approval and success, but that’s not always the case. As the book discusses, preference can lead to very negative outcomes for founders and employees. For example, let’s say a company at $10M in revenue raises $100 million with a 1x liquidation preference at a $400 million pre-money valuation ($500M post-money). The company is pressured by its VCs to grow quickly, but it has issues with product-market fit and go-to-market; five years go by, and the company is at $15M in revenue. At this point the VCs are not interested in investing more, and the board decides to try to sell the company. A buyer offers $80 million, and the board accepts. All $80M goes back to the investors holding the 1x liquidation preference; the common stockholders, including the founders, get nothing. It’s not the desired outcome by any means, but it’s important to understand. Some companies have not heeded this advice and continued to raise at massive valuations, including Notion, which raised $10M at an $800 million valuation despite being rumored to be at around $15M in revenue. The company then raised at a $1.6B valuation (an even 2x) after being rumored to be at $30M in revenue. While avoiding dilution is nice for a founder, it also sets up a massive hurdle for the company and seriously cramps returns. A 3x return (which is low for VC investors) means selling the company for $4.8B, which is no small feat.
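The downside scenario above can be sketched as a tiny preference waterfall in Python. This assumes a simple 1x non-participating preference (the function name and simplifications are mine; it ignores conversion to common, which only matters in upside cases).

```python
def waterfall(sale_price, invested, pref_multiple=1.0):
    """Distribute sale proceeds under a non-participating liquidation preference:
    preferred investors are paid up to their preference before common sees a cent."""
    preference = invested * pref_multiple
    investor_take = min(sale_price, preference)
    common_take = sale_price - investor_take
    return investor_take, common_take

# The example above: $100M invested with a 1x preference, company sold for $80M.
investors, common = waterfall(80_000_000, 100_000_000)
print(investors)  # the entire $80M sale price goes to the preferred investors
print(common)     # founders and employees get nothing
```

Run the same function with a $500M sale instead and common would keep $400M, which is why preference only bites in disappointing exits, and why a big preference stack quietly raises the bar for everyone holding common stock.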

Dig Deeper

  • Feld Thoughts: Brad Feld’s Blog

  • The Ultimate Guide to Liquidation Preferences

  • Startup Boards: A deep dive by Mark Suster, VC at Upfront Ventures

  • The meeting that showed me the truth about VCs on TechCrunch

  • SPOTAK: The Six Traits Marc Lore Looks for When Hiring

tags: Uber, WeWork, Theranos, Fairchild Semiconductor, Netscape, Marc Andreessen, SPAC, Chamath Palihapitiya, Zynga, Box, Facebook, Brad Feld, Nikola, Draftkings, Zoom, Shopify, Warrants, Liquidation Preference, VC, Founder Collective, Oyo, UiPath, Notion, Softbank, batch2
categories: Non-Fiction
 

July 2020 - Innovator's Dilemma by Clayton Christensen

This month we review the technology classic, the Innovator’s Dilemma, by Clayton Christensen. The book attempts to answer the age-old question: why do dominant companies eventually fail?

Tech Themes

  1. The Actual Definition of Disruptive Technology. Disruption is a term that is frequently thrown around in Silicon Valley circles. Every startup thinks its technology is disruptive, meaning it changes how the customer currently performs a task or service. The actual definition, discussed in detail throughout the book, is relatively specific. Christensen re-emphasizes this distinction in a 2015 Harvard Business Review article: "Specifically, as incumbents focus on improving their products and services for their most demanding (and usually most profitable) customers, they exceed the needs of some segments and ignore the needs of others. Entrants that prove disruptive begin by successfully targeting those overlooked segments, gaining a foothold by delivering more-suitable functionality—frequently at a lower price. Incumbents, chasing higher profitability in more-demanding segments, tend not to respond vigorously. Entrants then move upmarket, delivering the performance that incumbents' mainstream customers require, while preserving the advantages that drove their early success. When mainstream customers start adopting the entrants' offerings in volume, disruption has occurred." The book posits that there are generally two types of innovation: sustaining and disruptive. While disruptive innovation focuses on low-end or new, small-market entry, sustaining innovation merely continues markets along their already determined axes. For example, in the book, Christensen discusses the disk drive industry, mapping out the jumps that pack more memory and power into each subsequent product release. For each disruptive jump, there is a slew of sustaining jumps that improve product performance for existing customers but don't necessarily turn non-customers into customers. It is only when new use cases emerge, like rugged portable usage and the arrival of PCs, that disruption occurs. 
Understanding the specific definition can help companies and individuals better navigate muddled tech messaging; Uber, for example, is shown to be a sustaining technology because its market already existed, and the company didn't offer lower prices or a new business model. Grasping these intricacies can also help incumbents spot disruptive competitors.

  2. Value Networks. Value networks are an underappreciated and somewhat confusing topic covered in The Innovator's Dilemma's early chapters. A value network is defined as "The context within which a firm identifies and responds to customers' needs, solves problems, procures input, reacts to competitors, and strives for profit." A value network seems all-encompassing on the surface. In reality, a value network serves to simplify the lens through which an organization must make complex decisions every day. Shown as a nested product architecture, a value network attempts to show where a company interacts with other products. By distilling the product down to its most atomic components (literally computer hardware), we can see all of the considerations that impact a business. Once we have this holistic view, we can consider the decisions and tradeoffs that face an organization every day. The takeaway here is that organizations care about different levels of performance for different products. For example, when looking at cloud computing services at AWS, Azure, or GCP, we see Amazon EC2 instances, Azure VMs, and Google Cloud VMs with different operating systems, different purposes (general, compute, memory), and different sizes. General-purpose might be fine for basic enterprise applications, while gaming applications might need compute-optimized, and real-time big data analytics may need a memory-optimized VM. While it gets somewhat forgotten throughout the book, this point means that organizations focused on producing only compute-intensive machines may not be the best for memory-intensive, because the customers of the organization may not have a use for them. In the book's example, some customers (of bigger memory providers) looked at smaller memory applications and said there was no need. In reality, there was massive demand in the rugged, portable market for smaller memory disks. 
When approaching disruptive innovation, it's essential to recognize your organization's current value network so that you don't target new technologies at customers who don't need them.

  3. Product Commoditization. Christensen spends a lot of time describing the dynamics of the disk drive industry, where companies continually supplied increasingly smaller drives with better performance. Christensen's description of commoditization is very interesting: "A product becomes a commodity within a specific market segment when the repeated changes in the basis of competition, completely play themselves out, that is, when market needs on each attribute or dimension of performance have been fully satisfied by more than one available product." At this point, products begin competing primarily on price. In the disk drive industry, companies first competed on capacity, then on size, then on reliability, and finally on price. This price war is reminiscent of the current state of the Continuous Integration / Continuous Deployment (CI/CD) market, a subsegment of DevOps software. Companies in the space, including GitHub, CircleCI, and GitLab, are now competing primarily on price to win new business. Each of the cloud providers has similar technologies native to its public cloud offering (AWS CodePipeline and CloudFormation, GitHub Actions, Google Cloud Build), and they are giving it away for free because of their scale. The building block of CI/CD software is git, the open-source version control system created by Linux creator Linus Torvalds. With all the providers leveraging a massive open-source project, there is little room for true differentiation. Christensen even says: "It may, in fact, be the case that the product offerings of competitors in a market continue to be differentiated from each other. But differentiation loses its meaning when the features and functionality have exceeded what the market demands." Only time will tell whether these companies can pivot into burgeoning, highly differentiated technologies.

Business Themes

  1. Resources-Processes-Value (RPV) Framework. The RPV framework is a powerful lens for understanding the challenges that large businesses face. Companies have resources (people, assets, technology, product designs, brands, information, cash, relationships with customers, etc.) that can be transformed into greater-value products and services. The way an organization goes about converting these resources is the organization's processes. These processes can be formal (documented sales strategies, for example) or informal (culture and habitual routines). Processes are a big reason organizations struggle to deal with emerging technologies: because culture and habit are ingrained in the organization, the same process used to launch products into a mature, slow-growing market may be applied to a fast-growing, dynamic sector. Christensen puts it best: "This means the very mechanisms through which organizations create value are intrinsically inimical to change." Lastly, companies have values, or "the standards by which employees make prioritization decisions." When there is a mismatch between the resources, processes, and values of an organization and the product or market that the organization is chasing, it's rare that the business can compete successfully in the disruptive market. To see this misalignment in action, Christensen describes a meeting with a CEO who had identified the disruptive change happening in the disk-drive market and had gotten a product to market to meet the growing demand. In response to a publication showing the fast growth of the market, the CEO lamented to Christensen: "I know that's what they think, but they're wrong. There isn't a market. We've had that drive in our catalog for 18 months. Everyone knows we've got it, but nobody wants it." The issue was not the product or market demand, but the organization's values.
As Christensen continues, "But among the employees, there was nothing about an $80 million, low-end market that solved the growth and profit problems of a multi-billion dollar company – especially when capable competitors were doing all they could to steal away the customers providing those billions. And way at the other end of the company there was nothing about supplying prototype quantities of 1.8-inch drives to an automaker that solved the problem of meeting the 1994 quotas of salespeople whose contacts and expertise were based so solidly in the computer industry." The CEO cared about the product, but his team did not. The RPV framework helps evaluate large companies and the challenges they face in launching new products.

  2. How to manage through technological change. Christensen points out three primary ways of managing through disruptive technology change: 1. "Acquire a different organization whose processes and values are a close match with the new task." 2. "Try to change the processes and values of the current organization." 3. "Separate out an independent organization and develop within it the new processes and values that are required to solve the new problem." Acquisitions are a way to get out ahead of disruptive change. There are many examples, but two recent ones come to mind: Microsoft's acquisition of Github and Facebook's acquisition of Instagram. Microsoft paid a whopping $7.5B for Github in 2018, when Github was rumored to be at roughly $200M in revenue (a 37.5x revenue multiple!). Github was undoubtedly a mature business with a great product, but it didn't have a ton of enterprise adoption. Diane Greene at Google Cloud tried to get Sundar Pichai to pay more, but he said no. Github has changed Azure's position within the market and continued Microsoft's anti-Amazon strategy of pushing open-source technology. In contrast to the Github acquisition, Instagram had only 13 employees when it was acquired for $1B. Zuckerberg saw the threat the social network represented to Facebook, and today the acquisition is regularly touted as one of the best ever. Instagram was developing a social network based solely on photographs, right at the time every person suddenly had an excellent smartphone camera in their pocket. The acquisition occurred right as the market was ballooning, and Facebook capitalized on that growth. The second way of managing technological change is changing cultural norms. This is rarely successful because you are fighting against all of the processes and values deeply embedded in the organization.
Indra Nooyi cited a desire to move faster on culture as one of her biggest regrets as a young executive: "I’d say I was a little too respectful of the heritage and culture [of PepsiCo]. You’ve got to make a break with the past. I was more patient than I should’ve been. When you know you have to make a change, at some point you have to say enough is enough. The people who have been in the company for 20-30 years pull you down. If I had to do it all over again, I might have hastened the pace of change even more." Lastly, Christensen prescribes creating an independent organization matched to the resources, processes, and values that the new market requires. Three examples, each a different flavor of spin-out or spin-in, come to mind. First, Cisco developed a spin-in practice whereby it would take members of its organization and fund a new company built to develop a new product. The spin-ins worked for a time but caused major cultural issues. Second, as we've discussed, one of the key reasons AWS was born was that Chris Pinkham was in South Africa, thousands of miles away from Amazon corporate in Seattle; that distance and his team's focus allowed it to come up with a major advance in computing. Lastly, Mastercard started Mastercard Labs a few years ago. CEO Ajay Banga told his team: "I need two commercial products in three years." He doesn't tell his CFO the lab's budget, and he is the only person from his executive team who interacts with the business. This separation of resources, processes, and values allows those smaller organizations to be more nimble in finding emerging technology products and markets.

  3. Discovering Emerging Markets. The resources-processes-values framework can also show us why established firms fail to address emerging markets. Established companies rely on formal budgeting and forecasting processes whereby resources are allocated based on market estimates and revenue forecasts. Christensen highlights several important factors for tackling emerging markets, including a focus on ideas, failure, and learning. Underpinning all of these ideas is the impossibility of predicting the scale and growth rate of disruptive technologies: "Experts' forecasts will always be wrong. It is simply impossible to predict with any useful degree of precision how disruptive products will be used or how large their markets will be." Because of this challenge, relying too heavily on such estimates to underpin financial projections can cause businesses to view initial market development as a failure, or as not worthy of the company's time. When HP launched a new 1.3-inch disk drive, which could be embedded in PDAs, the company mandated that its revenues scale up to $150M within three years, in line with market estimates. That market never materialized, and the initiative was abandoned as a failed investment. Christensen argues that because disruptive technologies are threats, planning has to come after action, and thus strategic and financial planning must be discovery-based rather than execution-based. Companies should focus on learning their customers' needs and the right business model to attack the problem, rather than planning to execute their initial vision. As he puts it: "Research has shown, in fact, that the vast majority of successful new business ventures abandoned their original business strategies when they began implementing their initial plans and learned what would and would not work." One big fan of Christensen's work is Jeff Bezos, and it's easy to see why, given Amazon's focus on releasing new products in this discovery-driven manner.
The pace of product releases is simply staggering (almost one per day). Bezos even talked about this exact issue in his 2016 shareholder letter: "The senior team at Amazon is determined to keep our decision-making velocity high. Speed matters in business – plus a high-velocity decision making environment is more fun too. We don't know all the answers, but here are some thoughts. First, never use a one-size-fits-all decision-making process. Many decisions are reversible, two-way doors. Those decisions can use a light-weight process. For those, so what if you're wrong? I wrote about this in more detail in last year's letter. Second, most decisions should probably be made with somewhere around 70% of the information you wish you had. If you wait for 90%, in most cases, you're probably being slow." Amazon is one of the first large organizations to truly embrace this decision-making style, and clearly, the results speak for themselves.

Dig Deeper

  • What Jeff Bezos Tells His Executives To Read

  • Github Cuts Subscription Price by More Than Half

  • Ajay Banga Opening Address at MasterCard Innovation Forum 2014

  • Clayton Christensen Describing Disruptive Innovation

  • Why Cisco’s Spin-Ins Never Caught On

tags: Amazon, Google Cloud, Microsoft, Azure, Github, Gitlab, CircleCI, Pepsi, Jeff Bezos, Indra Nooyi, Mastercard, Ajay Banga, HP, Uber, RPV, Facebook, Instagram, Cisco, batch2
categories: Non-Fiction
 

June 2020 - Bad Blood by John Carreyrou

This month we review John Carreyrou’s chilling story of the epic meltdown of a company, Theranos. We explore bad decision-making, the limits of technology, and the importance of strong corporate governance. The saddest thing, and the reason Bad Blood hits so hard, is that Theranos was a startup that seemed to have everything: a breakthrough blood analyzer, tons of funding, excellent board representation, and a smart, visionary female CEO. But underneath, it was a twisted cult of distrust with an evil leader.

Tech Themes

  1. The limits of technology. Sometimes technology sounds too good to be true. Theranos’ Edison and miniLab blood analyzers were supposed to tell you everything you could ever want to know about your blood. But they didn’t work and never had a shot at working. Stanford professor Phyllis Gardner even told Elizabeth Holmes (Theranos’ founder/CEO) early on that an early patch-like design of the product would never work: “[Holmes] just kind of blinked and nodded and left. It was just a 19-year-old talking who’d taken one course in microfluidics, and she thought she was gonna make something of it.” The technology was debunked by almost every scientist as wild fantasy even prior to its commercial use and subsequent fall from grace. There is something so human about wanting to believe there are no limits to technology. In today’s age of fake technology marketing, it’s easy for messaging to slowly take over a company if left unchecked. Think about Snap’s famous declaration, “Snap Inc. is a camera company.” or Dropbox’s S-1 mission statement: “Unleash the world’s creative energy by designing a more enlightened way of working.” These statements ignore what these businesses fundamentally do - advertising and storage. Sometimes there are massive leaps forward, like the transistor, networked computing, and the internet, but even these took many, many years to come to fruition. When humans hear a compelling pitch, it is natural to want to remove those limits of technology because the result is so astounding, but we have to remain skeptical or risk another Theranos.

  2. The reality distortion field. Elizabeth Holmes was obsessed with Steve Jobs. Mired in this deep fixation, she also managed to adopt one of Jobs’ most interesting habits: the reality-distortion field. While we’ve discussed the reality distortion field before in relation to Jobs, Holmes seemed to take it to a new level. Jobs would demand something incredible be done, and often his amazing team could come up with the solution. Holmes believed the same but failed to consider two things: fundamental biology and her team. Biology, at its core, is just not as flexible as the hardware and software that Apple was building. Jobs demanded an excellent product; Holmes demanded a biological impossibility. Beyond chasing a biological impossibility, which, to be frank, can become possible after years of research (see CRISPR), Holmes operated the Theranos cult as a dictator, ruthlessly seeking out dissenters and punishing or firing them. While Jobs challenged his team repeatedly while being a huge asshole, the team, for the most part, stayed intact (Phil Schiller, Tony Fadell, Jony Ive, Scott Forstall, and Eddy Cue). There were certainly those who got fired or left, but Holmes’s active rooting out of non-believers severely limited the chances of success at the company. The levels of secrecy were extreme even for a stealth technology startup. Startup founders need to drink the kool-aid sometimes - it comes with being visionary - but getting so drunk on power and image can only lead to personal and business demise, as was the case with Theranos.

  3. When startups turn bad. Tons of startups fail, but only a few turn truly malicious. Theranos was one of those few. The company tested people’s blood and gave individuals fake, untested medical results, including indicators of cancer diagnoses! Even when reviewing other major business failures and frauds - Jeff Skilling at Enron and Bernie Madoff’s Ponzi scheme - nothing compares to Theranos. While it could be argued that Enron and Madoff’s schemes did more and broader financial hurt to society, at least they were never physically endangering individuals. The only comparisons that may be warranted are Boeing and the Fyre Festival. The brainchild of famous clown Billy McFarland, the Fyre Festival certainly endangered people by marooning them on an island with little food. Boeing’s incoherent internal review process, which knowingly let a faulty airline software system into production, also endangered people - two flights crashed because of that system. Did Elizabeth Holmes set out to build a dangerous device, knowingly defraud investors, and endanger the public? Probably not. It was one decision after another. It was firing CFO Henry Mosley, who called out fake projections; it was hiring Boies Schiller to pressure former employees; it was enlisting Sunny Balwani to “run” the company. It was what Clayton Christensen calls marginal thinking - the trap of weighing only the marginal cost of each next decision rather than the full cost of the path those decisions create. Firing the CFO who wouldn’t make fake numbers was simply easier than facing the difficult reality that the product sucked and that they had taken too much investor money to start again. When things turn bad, at startups or other businesses, a trail of marginal decision-making can normally be found.

Business Themes

[Images: Elizabeth Holmes; Alteryx (AYX) and Pluralsight (PS) stock charts]
  1. The Pressure to Succeed. Stress seems to be a part of business, but the pressure can sometimes get too big to handle. Public companies, in particular, face growth targets from Wall Street analysts and investors. One earnings miss, or even a more modest beat than expected, can completely derail a stock (see the Pluralsight and Alteryx graphs to the right). Public company CEOs and CFOs can be fired or have compensation withheld for poor stock performance. So when a hot young biotechnology startup wanted to launch a partnership with Walgreens, Dr. J and the Walgreens team were more than ready to fast-track the potential partnership. Despite not being allowed to see the lab, see more than a partial demo of the product, or even use the bathroom, Walgreens pushed through a deal so that longtime competitor CVS wouldn’t get it. As the then-head of the Theranos/Walgreens pilot said, "We can’t not pursue this. We can’t risk a scenario where CVS has a deal with them in six months and it ends up being real.” When the partnership was announced, even the press release sounded oddly formulaic: “Theranos’ proprietary laboratory infrastructure minimizes human error through extensive automation to produce high quality results.” There was no demo. There was no product. There was only pressure at Walgreens to beat CVS and pressure at Theranos to make something of a fake device.

  2. The Importance of Corporate Governance. Corporate governance has historically rarely been discussed outside of academic settings, but it has come into sharper focus over the past few years. Some have recently tried to bring prominent corporate governance issues, such as board member compensation and option grants for executives, to the forefront. Warren Buffett even commented on boards in his 2019 annual shareholder letter: “Director compensation has now soared to a level that inevitably makes pay a subconscious factor affecting the behavior of many non-wealthy members. Think, for a moment, of the director earning $250,000-300,000 for board meetings consuming a pleasant couple of days six or so times a year. And job security now? It’s fabulous. Board members may get politely ignored, but they seldom get fired. Instead, generous age limits – usually 70 or higher – act as the standard method for the genteel ejection of directors.” Boards are meant to help guide the company through strategic challenges, ensure the business is focused on the right things, and evaluate the CEO. Theranos’ Board of Directors was a laughable hodgepodge of old white men: George P. Shultz (former U.S. Secretary of State), William Perry (former U.S. Secretary of Defense), Henry Kissinger (former U.S. Secretary of State), Sam Nunn (former U.S. Senator), Bill Frist (former U.S. Senator and heart-transplant surgeon), Gary Roughead (Admiral, USN, retired), James Mattis (General, USMC), Richard Kovacevich (former Wells Fargo Chairman and CEO), and Riley Bechtel (former Bechtel Group Chairman and CEO). The average age of the directors in 2012 was ~72 years old, and few of these men could offer real strategic guidance on pursuing novel biotechnology. On top of that, as Carreyrou points out, “In December 2013, [Holmes] forced through a resolution that assigned one hundred votes to every share she owned, giving her 99.7% of the voting rights.” George Shultz even said later in a deposition, “We never took any votes at Theranos.
It was pointless. Elizabeth was going to decide whatever she decided.” The episode brings more clarity to those CEOs and companies who hide behind their Boards of Directors, promising governance for investors but rarely delivering anything beyond pandering to the CEO’s whims. In another ludicrous example, Apple and Steve Jobs have also been accused of shoddy corporate governance: Apple famously backdated options granted to Jobs, allowing him an instant paper profit, and did not even bother to report that it had issued the options. The best companies are not immune, and investors and employees should be aware of the qualifications and monetary interests of a company’s board members.

  3. Search and Destroy. Only the Paranoid Survive, right? Wrong. There is such a thing as too much paranoia. When you combine that paranoia with a manipulative persona, you get Elizabeth Holmes. It’s hard to believe that any startup or founder would need the level of security and secrecy that dominated the culture at Theranos. The list of weird security practices and legal gray areas includes: personal security for Holmes, laboratory-developed tests (instead of FDA-approved tests), copious and vigorously enforced NDAs, siloed teams with no communication, and false representation in the media. Secrecy itself isn't unusual - many startups operate in stealth so as not to give away details to competitors, and some larger companies launch new divisions in separate locations from their offices, like Amazon's A9. But Theranos went much further. The company hired private investigators (through its powerful law firm, Boies Schiller) to threaten and track former employees, including Erika Cheung and Tyler Shultz. Tyler Shultz, grandson of board member George Shultz, was one of the key informants to author John Carreyrou. After he accused Elizabeth and Sunny of lying and potentially harming patients, he resigned and tried to convince his grandfather that it was all a sham. His grandfather agreed to speak with him one-on-one, and at the end of the conversation surprised Tyler with two attorneys from Boies Schiller, who all but forced Tyler to sign a confidentiality agreement. Tyler refused, which eventually led to the publication of Carreyrou’s first article. As early board member Avie Tevanian put it, “I had seen so many things that were bad go on. I would never expect anyone would behave the way that she behaved as a CEO. And believe me, I worked for Steve Jobs. I saw some crazy things. But Elizabeth took it to a new level.” Again, sadly, while Theranos may be the pinnacle of secrecy, paranoia, and threatening behavior, eBay recently fired six employees for threatening online reviewers.
On top of sending live spiders to the reviewers’ home, eBay team members would knock on their door day and night to scare them. How could these employees think this was ok? How could Elizabeth partake in this threatening and manipulative behavior? As organizational behavior professor Roderick Kramer reminds us: “‘Reality’ is not a fixed entity but rather a tissue of facts, impressions, and interpretations that can be manipulated and perverted by clever and devious businesses and governments.” Theranos’ fake Edison tests are reminiscent of Enron’s fake trading floor, where 70 low-level employees once pretended to be busy to impress Wall Street analysts. Paranoia and secrecy are powerful weapons when left unchecked, and Theranos clearly wielded those weapons to the fullest extent.

Dig Deeper

  • HBO Documentary: “The Inventor: Out for Blood in Silicon Valley” has many interviews and deep analysis on Theranos

  • When Paranoia Makes Sense by Organizational Behavior Professor Roderick Kramer

  • Theranos criminal trial set to begin March 9, 2021

  • Ex-Theranos CEO Elizabeth Holmes says 'I don't know' 600-plus times in never-before-broadcast deposition tapes

  • Holmes’ famous Mad Money Interview: “First they think you're crazy, then they fight you, and then all of a sudden you change the world.”

  • Theranos’ still active Twitter account

tags: Theranos, Elizabeth Holmes, Sunny Balwani, Apple, Steve Jobs, Snap, Dropbox, Stanford, Reality distortion field, Fyre Festival, Boeing, Billy McFarland, Jeff Skilling, Enron, Boies Schiller, Clayton Christensen, Walgreens, CVS, Warren Buffett, George Shultz, batch2
categories: Non-Fiction
 

May 2020 - Hitchhiker's Guide to the Galaxy by Douglas Adams

We want to recognize the craziness of the world today and the saddening police brutality and systemic racism that continues to occur in the US. This month we opted for a fiction book that may provide a minor break from that current, depressing reality. We want to acknowledge that our reality is messed up, and as a book club we are committed to reading more books about diversity in tech and more books written by a diverse set of authors.

Tech Themes

  1. The Computer knows the answer. There is an overwhelming feeling in society today that the computer should be able to tell us the answer. Predictive models are everywhere, from personalized AI workflows to sports gambling. Society has become accustomed to the idea that computers will solve problems for us. Interestingly, the novel portrays technology in the opposite light. Marvin, the robot on Zaphod Beeblebrox’s ship, is so knowledgeable that even the most complex task seems meaninglessly easy. As a result, Marvin is constantly depressed. Deep Thought, the most powerful computer in history, takes seven and a half million years to come up with an answer to the question of what life is all about. The simplistic answer - forty-two - prompts the crowd to ask what the question was to which forty-two is the answer. The computer suggests that Earth will provide that question. These examples reverse the reader’s expectations of technology. We normally think of technology as providing the answers, simplifying our lives, and dehumanizing us. At the end of the story, it is not Marvin’s heroism that saves the crew from being killed by the Blagulon Kappa cops who are after the Heart of Gold; it is his depression. When Marvin seizes control of the cops’ computer and explains his life-view, they commit suicide. In these instances, the role of technology is reversed - it is emotion and human nature that can help save the world and provide the answers to the universe.

  2. Not so obvious, Space Travel and Towels. “A towel, it says, is about the most massively useful thing an interstellar hitchhiker can have.” Something as simple as a towel - which seems relatively unimportant in everyday life - is an absolute necessity for space travel and hitchhiking through the galaxy. Frequently throughout technological history, the simple and seemingly unimportant things are overlooked in favor of more complex problems and solutions. The largest data breach in history occurred when Equifax overlooked an expired certificate. During early development of the ENIAC, one of the first computing machines, software was seen as unimportant and was relegated to the early female programmers; little did the sexist hardware engineers realize that software would become the most important aspect of computing. When the first iPhone was released, Microsoft CEO Steve Ballmer laughed at the device, saying it was too expensive and unable to cater to business customers because it didn’t have a keyboard. The tragic failure of the space shuttle Challenger was due to cold temperatures causing its rubber O-rings to become too stiff to seal properly. Sometimes the value of a technology, or a towel, is not inherently obvious.

  3. The Guide, the Whole Earth Catalog and the Internet. “The reason why it was published in the form of a micro sub meson electronic component is that if it were printed in normal book form, an interstellar hitchhiker would require several inconveniently large buildings to carry it around in.” The Hitchhiker’s Guide to The Galaxy is a massive electronic guide to help hitchhikers move throughout space. This interestingly mirrors the current state of the internet, which didn’t exist when Douglas Adams wrote The Hitchhiker’s Guide to the Galaxy in the late 1970s. Prior to the internet, this type of alternative information could be found in the Whole Earth Catalog, a famous magazine that Steve Jobs once called “Google in paperback form, thirty-five years before Google came along.” The Whole Earth Catalog was created by Stewart Brand, a famous writer and technologist, who actually participated with Douglas Engelbart in the Mother of All Demos, which featured the introduction of the mouse and video conferencing. Brand wanted a way to publish material that wouldn’t be found in traditional textbooks, including product reviews of the latest technology. As the internet was starting to launch, Brand created The WELL (Whole Earth ‘Lectronic Link) to continue to provide interesting alternative articles and essays. The WELL, originally accessed as a dial-up bulletin board system, is credited with being one of the first internet forums. The internet today very much mirrors the Hitchhiker’s Guide to the Galaxy: its content is enormous, it isn’t necessarily factual (the Guide is not completely factual either, but based on experience), and its content spans all possible information needed to survive. On top of that, the Guide’s packaging is described as suspiciously similar to modern smartphones: “He also had a device which looked rather like a largish electronic calculator.
This had about a hundred tiny flat press buttons and a screen about four inches square on which any one of a million ‘pages’ could be summoned at a moment's notice.” The internet and mobile computing have come a long way in 50 years; it will be great to watch what happens in the next 50!

Business Themes

[Images: Virgin Galactic income statement; Yole Développement semiconductor industry chart]
  1. The Business of Space: SpaceX / Virgin Galactic. Elon Musk and Chamath Palihapitiya are outspoken, visionary billionaires. Elon has an incredible track record of under-delivering on his own timelines but still exceeding most people’s wildest expectations. Chamath was an early employee at Facebook and is now a part owner of the Golden State Warriors. He is CEO of a VC-firm-turned-“technological holding company” and the creator of three public SPACs, one of which now represents Virgin Galactic. A SPAC, or Special Purpose Acquisition Company, is a blank-check company with no commercial operations. A SPAC is normally led by experts in a specific space, like software or real estate, and these executives raise money to acquire a company. The money raised in the IPO sits in an interest-bearing trust account until the blank-check company has found a company to acquire. If no deal is completed after two years, the SPAC gives the money back to its investors. Chamath purchased 49% of Richard Branson's Virgin Galactic space company in 2019. Space is impossibly big, and it's natural to think that someone who can develop the technology to unlock that vastness to humans would also unlock a fortune. As the Guide puts it: “‘Space,’ it says, ‘is big. Really big. You just won’t believe how vastly, hugely, mindbogglingly big it is. I mean, you may think it’s a long way down the road to the chemist’s, but that’s just peanuts to space.’” But the business of space is in its earliest days. SpaceX relies almost completely on government-contracted work, which means the company needs an incredible amount of funding to survive because of the capital investment and the uncertain, non-recurring nature of these space contracts.
Interestingly, the development of early commercial air travel in the 1920s had a similar funding issue, and it was up to the Guggenheim family, rich from mining profits, to set up a fund that contributed to the development of Western Air Express, one of the world’s first commercial airlines. Virgin Galactic is taking a page out of Tesla’s playbook by selling future space rides ahead of any commercial launch. Public market investors, including Reddit’s wallstreetbets community, are piling into Virgin Galactic at the literal moonshot risk of it becoming the space company (income statement above). Space has always been a billionaire passion; the question remains - can it be a business?
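The trust mechanics described above are simple enough to sketch in a few lines. The figures below are hypothetical illustrations, not the actual terms of Chamath's SPACs or any real offering: IPO proceeds compound in an interest-bearing trust, and if no acquisition closes before the deadline, each share redeems for its pro-rata slice of the trust.

```python
# Rough sketch of SPAC trust mechanics (all figures are hypothetical).

def trust_value(ipo_proceeds: float, annual_rate: float, years: float) -> float:
    """Trust balance after compounding the IPO proceeds at annual_rate for years."""
    return ipo_proceeds * (1 + annual_rate) ** years

def redemption_per_share(trust_balance: float, shares_outstanding: int) -> float:
    """If no deal closes by the deadline, each share redeems its pro-rata slice."""
    return trust_balance / shares_outstanding

# A hypothetical $250M IPO at $10/share, earning 2% a year over the
# two-year acquisition window:
balance = trust_value(250_000_000, annual_rate=0.02, years=2)
print(round(redemption_per_share(balance, 25_000_000), 2))  # slightly above $10
```

This is why SPAC shares rarely trade far below the IPO price before a deal is announced: the redemption right puts a rough floor under the stock at the trust's per-share value.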

  2. Moore’s Law and Murphy’s Law. Murphy’s law states: “Anything that can go wrong, will go wrong.” Hitchhiker’s Guide to the Galaxy explores this notion repeatedly as Arthur continually finds himself in unbelievably bad circumstances; his house is demolished, his planet is destroyed, he is captured by Vogons, and sure-death missles approach the ship as the crew descends on Magarathea. Arthur continues to survive these dangers with the help of the improbability drive, which the book states is a “a wonderful new method of crossing interstellar distances in a few seconds; without all that tedious mucking about in hyperspace. As the Improbability Drive reaches infinite improbability, it passes through every conceivable point in every conceivable universe almost simultaneously. In other words, you're never sure where you'll end up or even what species you'll be when you get there. It's therefore important to dress accordingly.” In comparison to Murphy’s law, Moore’s Law is the idea that computing power doubles every 18 months. A 2006 Economist article explained Moore’s Law as the opposite of Murphy’s Law: “But his law seems safe for at least another decade—or two to three chip generations—which is as far as he has ever dared to look into the future. As things are made at scales approaching individual atoms, he says, there will surely be limitations. Then again, the law has often met obstacles that appeared insurmountable, before soon surmounting them. In that sense, Mr Moore says, he now sees his law as more beautiful than he had realised. “Moore's Law is a violation of Murphy's Law. Everything gets better and better.” While Moore’s Law has surely reached its current limitations, the question remains where do chips go from here? Some have posited that chips will push towards function specific hardware or purpose built for specific computing tasks like NVIDIA’s graphics cards. 
The space is large and complex, with companies like Apple licensing ARM technology to build their famous A13 chip, while others, like TSMC, focus on specific parts of the value chain. A big question that remains is how cloud companies will scale hardware to meet continuing customer demand. Arthur Dent, like Elon Musk, continues to benefit from infinite improbability - maybe quantum computing is the only way to know whether Elon will succeed and what happens next in chip design.
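The doubling dynamic described above compounds quickly. As a minimal sketch (the function name and starting figures are illustrative, not from the book), Moore's Law can be written as an exponential with an 18-month doubling period:

```python
def moores_law(initial_transistors: float, months: float, doubling_period: float = 18.0) -> float:
    """Project computing capacity (e.g., transistor count) after a number of
    months, assuming it doubles every `doubling_period` months."""
    return initial_transistors * 2 ** (months / doubling_period)

# After exactly one doubling period, capacity doubles.
print(moores_law(1000, 18))   # 2000.0

# Over a decade (120 months), roughly 2^6.67, or about a 100x increase.
print(round(moores_law(1000, 120)))
```

The striking part is the decade-scale result: under this assumption, ten years of steady 18-month doublings yields roughly a hundredfold improvement, which is why the question of what happens when the doubling stops matters so much to chip designers and cloud providers alike.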

  3. Mentorship. Slartibartfast is a wise, old planet designer who is plopped into the story to provide Arthur with answers to so many incredible questions. Slartibartfast explains the creation of Earth and its interaction with Deep Thought. The interactions between Arthur and Slartibartfast are somewhat akin to traditional business mentorship: when you have none of the answers, or you have preconceived ideas of how everything came to be, a mentor can quickly dispel your misconceptions and provide deep answers. Mentorship has long been popular in Silicon Valley, with Bill Campbell mentoring Steve Jobs and several others. Bill was also instrumental in several decisions Ben Horowitz contemplated as he took Opsware through its spinout and the sale of its managed services division. Mentors help change perspective and provide guidance.

Dig Deeper

  • Discussion of how the Whole Earth Catalog pushed 1960s counterculture

  • List of the Latest OpenAI models for predictive image generation and interaction prediction

  • Chamath says “Let Them Get Wiped Out!” when talking about hedge funds during the coronavirus downturn

  • The resurgence of a business model formerly considered fraud - SPACs

  • Apple releases A13 bionic chip and it works incredibly fast

tags: Equifax, Microsoft, Steve Ballmer, Elon Musk, Steve Jobs, WELL, Stewart Brand, Chamath Palihapitiya, Facebook, Virgin Galactic, SPAC, Moore's Law, TSMC, ARM, NVIDIA, Ben Horowitz, Bill Campbell, batch2
categories: Fiction