Mashing up disciplines in new ways can transform industries
It’s hard to believe today, but phones didn’t always have touch screens.
In fact, some of us grew up in houses where phones were wired to the wall. We found out where we were by reading street signs and a map. We played games by taking a bat and ball to the middle of the street and forming teams with our friends. We listened to music by putting a record on a record player. Some of us, if we were lucky, owned 20 or 30 records. I had two, total, growing up.
And then came the iPhone. Sure, we’d seen convergence before the iPhone, but the preeminence of the smartphone over all other technologies was far from assured. At first, the mashup of an iPod and a phone was the big draw. But once Jobs introduced the App Store to go with the iPhone 3G, there was no stopping the revolution.
Walt Disney, too, mixed disciplines. Back in the 1930s, there were cartoons and there were movies. There were not movies made from cartoons. Movies told stories. Cartoons played fun gags. Movies not only were moving images, they moved audiences to laugh – and to cry. Cartoons were worthy of a chuckle or two.
But Walt wanted more. He loved animation and saw a future far beyond the slapstick antics of a simple cartoon. He saw the ability to tell fully realized stories far beyond anything audiences had ever seen before.
Remember that back in the 1930s, most movies were black and white. Computer animation didn’t exist. And while special effects had been a part of movie making since the dawn of movies, they always required some physical representation. In other words, you had to have a physical trick or gimmick to make the effect work.
Animation didn’t require that. If you wanted to suck something into a spinning vortex, you didn’t have to somehow simulate it with water and dye. You drew it, frame by laborious frame. It took a tremendous amount of work with an unprecedented number of animators, but if you could imagine it, you could draw it, and audiences could watch it.
For years, Snow White and the Seven Dwarfs was thought of as Disney’s folly. It was incredibly expensive, far over budget, and behind schedule. Besides, it was a cartoon. Who would sit through 83 minutes of a cartoon?
But Disney saw the confluence of movies and cartoons, and saw how far he could go once he mixed them together. He created a film classic, a cartoon that could make people laugh and cry. It was a film that actually scared kids, and yet made it okay to be scared. He created an entirely new art form as the outgrowth of two previous art forms.
Disney also decided to use old fairy tales and remake them in his own image. So with Snow White, he launched an empire of cartoons mixed with feature length movies mixed with rewritten classic stories. And, of course, the rest is history.
So, as we close out this discussion of two of the greatest innovators of the last 100 years, consider this: what disciplines – not just technologies, but ideas, areas of study, and forms of art – can you mix together to create something new, wonderful, compelling, and world-changing?
Betting on new technologies ahead of the curve can be a strong differentiator
Jobs often brought existing bleeding-edge technologies to consumers as a way to differentiate his products. While work on WiFi began as early as 1988, it wasn’t until Apple incorporated WiFi into its clamshell-style iBook in 1999 that the standard started gaining traction. Apple (which used the term “AirPort” rather than 802.11 or WiFi) adopted the technology to overcome one key limitation of previous laptop computers: the need for a wire to connect to a network.

Today that capability may seem like nothing special, but back in 1999, the ability to use a laptop on battery power with no network cables was incredibly freeing. It gave the iBook serious legs, even though the machine suffered from some other design limitations.
Walt Disney had been moderately successful with Oswald the Lucky Rabbit, but he didn’t own the character’s IP (intellectual property). As a result, Charles Mintz of Winkler Pictures was able to perform what was essentially a hostile takeover. Mintz made offers to many of Walt’s animators.
Since Disney was often as abusive to his animators as Steve was to his engineers, many of the animators walked, defecting to Mintz. Disney wound up with no character and no animators (except Ub Iwerks).
In this way, Disney, like Jobs, had his creation taken away from him. Disney, like Jobs, also planned a comeback through the development of a superior product. In Disney’s case, it was to be Mickey Mouse in the seven-minute-and-42-second film Steamboat Willie.
Disney was concerned, though, about trying to compete against his former animation team and the popular character of Oswald (as well as other popular characters of the time, like Felix the Cat).
Disney’s technological advance was the use of synchronized sound. Background sound and music had long been used in cartoons. Sounds that synchronized to actions in the animation were an innovation. The steamboat horns, for example, compressed and expanded in time with the steam sounds.
This was a big risk technologically and financially, but it paid off. Steamboat Willie was a hit – leading not only to ticket sales, but the first Disney merchandising of Mickey himself.
Neither Apple nor Disney Studios invented the technologies they popularized. Instead, what they did was find technologies that fit, augmented their offerings, and gave them a differentiated advantage.
There are lessons here for you as well. First, don’t be afraid to adopt technologies you haven’t invented. Second, look at whether they add real value to the offering you’re providing.
In the case of WiFi, laptops became truly portable and wireless for the first time. In the case of Steamboat Willie, the new technology transformed a character into a beloved international icon.
What product are you developing that could benefit from an innovative, ahead-of-the-competition use of a burgeoning new technology?
Perfectionism, if you can survive it, can create deep customer loyalty
Then, of course, there was Steve Jobs. Jobs was a brutal perfectionist. He produced incredible products, and chewed up and spit out employees with nearly reckless abandon. With employees of other companies, Jobs was more polite, but no less a perfectionist.
Vic Gundotra, the former Google Senior VP who headed up Google+, tells the story of the Sunday afternoon when Jobs called up because the shade of yellow in the Google logo on the iPhone wasn’t quite right. Other Jobs perfectionist stories abound.
Disney, too, was a perfectionist. During the three years it took to produce the 1937 masterpiece Snow White and the Seven Dwarfs, Walt went over budget numerous times in his quest for perfection of both image and motion.
He left it to his brother Roy to cajole backers at Bank of America into continuing to loan the company money. Walt wound up mortgaging his home to get the film finished. But at a time when no one had seen a feature-length animated film, Snow White was perfect. To this day, it is watchable and entertaining.
In the case of both Disney and Jobs, the perfectionism inherent in their products catalyzed the appeal of their offerings to eager throngs of consumers. That perfectionism has driven an almost cult-like level of customer loyalty, while simultaneously creating a barrier to entry their competitors haven’t been able to equal.
The lesson here isn’t quite as clear-cut as the previous two, because most of us are not now, and never will be, Steve Jobs or Walt Disney. Perfection can be costly and difficult to attain, so exercise great caution before you bet your company or your house on that perfect shade of yellow or that ideal motion capture.
There are times it’s worth throwing caution to the wind for a dream you strongly believe in – but you damned well better be able to produce.
Finding the right creative partner can be a force multiplier
No, I’m not talking about Steve Wozniak, John Sculley or Mike Markkula – or Roy Disney for that matter. Instead, I’m talking about Jony Ive and Ub Iwerks (whose name I desperately want to spell as iWerks).
You’re most likely familiar with Apple’s Chief Design Officer Jony Ive, the man whose deft touch led the design of the iPod, the original iMac, and the iPhone. It’s said that Ive became Jobs’s alter ego. He manifested the same level of design simplicity Jobs favored, but also had the hands-on designer chops to make those designs real.
Walt Disney, too, had a design touch. In fact, he started off with an ability to draw cartoonish pictures, and made his early living as a cartoonist. In one of these first jobs, he met Ub Iwerks, the man who took Disney’s visions into the stratosphere.
Walt Disney came up with the character of Mortimer Mouse (which was Mickey’s original name until Disney’s wife Lillian suggested a friendlier name). But it was Ub who brought Mickey to life, and it was Ub who headed up many of the animations that made Walt Disney into Walt Disney.
Partnering can be a force multiplier. The classic partnership for a young, innovative entrepreneur is with someone who has business skills, and both Jobs and Disney formed such partnerships as well.
But don’t forget that innovation takes effort all on its own, and sometimes it’s necessary to bring in a person who can focus solely on the creative side – someone who can go beyond your own skills and abilities, take your vision, and lift it to a new level. For Disney and Jobs, those partners were Iwerks and Ive.
And no, your creative partner’s last name doesn’t need to begin with “I,” although that might help.
Don't give up
Most of us are familiar with the Steve Jobs resurrection story. He lost control of the Mac team under John Sculley, wandered in the “wilderness” of NeXT (as Sculley describes it), and then returned to Apple – building it into one of the most amazing companies in history.
Walt Disney, too, had his share of failure. Well before the Walt Disney Company (which was originally called Disney Brothers in partnership with Roy Disney), Walt founded Laugh-O-Gram studios.
Laugh-O-Gram produced animation shorts, including a series of break-even cartoons for a local theater. There were a number of business errors. Then, when a major New York City theater chain failed to pay for a series of expensive animations, Laugh-O-Gram went bankrupt. All Walt Disney had left was enough to buy a train ticket to LA.
As we well know, Jobs and Disney went on to create towering enterprises that have become integral parts of our culture. The lesson here is that even if you fail, that doesn’t make you a failure. Keep trying. Even when things are at their darkest, there may be light in your future.
In February 1999, Google moved out of its garage office and into its first Mountain View office, with just eight employees – a staff incomparable to Google’s size today.

In 1997, Yahoo rejected an offer to buy Google for $1 million. Today Yahoo is worth $20 billion, while Google has grown to roughly $200 billion – perhaps one of the costliest missed opportunities in the history of the IT industry.
The Birth of Google
It began with an argument. When he first met Larry Page in the summer of 1995, Sergey Brin was a second-year grad student in the computer science department at Stanford University. Gregarious by nature, Brin had volunteered as a guide of sorts for potential first-years – students who had been admitted, but were still deciding whether to attend. His duties included showing recruits the campus and leading a tour of nearby San Francisco. Page, an engineering major from the University of Michigan, ended up in Brin’s group.
It was hardly love at first sight. Walking up and down the city’s hills that day, the two clashed incessantly, debating, among other things, the value of various approaches to urban planning. “Sergey is pretty social; he likes meeting people,” Page recalls, contrasting that quality with his own reticence. “I thought he was pretty obnoxious. He had really strong opinions about things, and I guess I did, too.”
“We both found each other obnoxious,” Brin counters when I tell him of Page’s response. “But we say it a little bit jokingly. Obviously we spent a lot of time talking to each other, so there was something there. We had a kind of bantering thing going.” Page and Brin may have clashed, but they were clearly drawn together – two swords sharpening one another.
When Page showed up at Stanford a few months later, he selected human-computer interaction pioneer Terry Winograd as his adviser. Soon thereafter he began searching for a topic for his doctoral thesis. It was an important decision. As Page had learned from his father, a computer science professor at Michigan State, a dissertation can frame one’s entire academic career. He kicked around 10 or so intriguing ideas, but found himself attracted to the burgeoning World Wide Web.
Page didn’t start out looking for a better way to search the Web. Despite the fact that Stanford alumni were getting rich founding Internet companies, Page found the Web interesting primarily for its mathematical characteristics. Each computer was a node, and each link on a Web page was a connection between nodes – a classic graph structure. “Computer scientists love graphs,” Page tells me. The World Wide Web, Page theorized, may have been the largest graph ever created, and it was growing at a breakneck pace. Many useful insights lurked in its vertices, awaiting discovery by inquiring graduate students. Winograd agreed, and Page set about pondering the link structure of the Web.
Citations and Back Rubs

It proved a productive course of study. Page noticed that while it was trivial to follow links from one page to another, it was nontrivial to discover links back. In other words, when you looked at a Web page, you had no idea what pages were linking back to it. This bothered Page. He thought it would be very useful to know who was linking to whom.
Why? To fully understand the answer to that question, a minor detour into the world of academic publishing is in order. For professors – particularly those in the hard sciences like mathematics and chemistry – nothing is as important as getting published. Except, perhaps, being cited.
Academics build their papers on a carefully constructed foundation of citation: Each paper reaches a conclusion by citing previously published papers as proof points that advance the author’s argument. Papers are judged not only on their original thinking, but also on the number of papers they cite, the number of papers that subsequently cite them back, and the perceived importance of each citation. Citations are so important that there’s even a branch of science devoted to their study: bibliometrics.
Fair enough. So what’s the point? Well, it was Tim Berners-Lee’s desire to improve this system that led him to create the World Wide Web. And it was Larry Page and Sergey Brin’s attempts to reverse engineer Berners-Lee’s World Wide Web that led to Google. The needle that threads these efforts together is citation – the practice of pointing to other people’s work in order to build up your own.
Which brings us back to the original research Page did on such backlinks, a project he came to call BackRub.
He reasoned that the entire Web was loosely based on the premise of citation – after all, what is a link but a citation? If he could divine a method to count and qualify each backlink on the Web, as Page puts it “the Web would become a more valuable place.”
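To make that asymmetry concrete, here is a minimal Python sketch of the inversion BackRub had to perform at Web scale. Everything in it – the page names and the links – is invented for illustration; it simply shows why forward links can be read straight off a page while backlinks require turning the whole crawled graph inside out.

```python
# Hypothetical illustration: a crawler sees forward links on each page,
# but "who links to me?" only emerges after inverting the entire graph.
from collections import defaultdict

# Forward links: page -> pages it links to (directly visible in the HTML).
forward_links = {
    "page_a": ["page_b", "page_c"],
    "page_b": ["page_c"],
    "page_c": ["page_a"],
}

# Invert the graph: page -> pages that link to it (the backlinks).
backlinks = defaultdict(list)
for source, targets in forward_links.items():
    for target in targets:
        backlinks[target].append(source)

print(dict(backlinks))
# {'page_b': ['page_a'], 'page_c': ['page_a', 'page_b'], 'page_a': ['page_c']}
```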
At the time Page conceived of BackRub, the Web comprised an estimated 10 million documents, with an untold number of links between them. The computing resources required to crawl such a beast were well beyond the usual bounds of a student project. Unaware of exactly what he was getting into, Page began building out his crawler.
The idea’s complexity and scale lured Brin to the job. A polymath who had jumped from project to project without settling on a thesis topic, he found the premise behind BackRub fascinating. “I talked to lots of research groups” around the school, Brin recalls, “and this was the most exciting project, both because it tackled the Web, which represents human knowledge, and because I liked Larry.”
The Audacity of Rank

In March 1996, Page pointed his crawler at just one page – his homepage at Stanford – and let it loose. The crawler worked outward from there.
Crawling the entire Web to discover the sum of its links is a major undertaking, but simple crawling was not where BackRub’s true innovation lay. Page was naturally aware of the concept of ranking in academic publishing, and he theorized that the structure of the Web’s graph would reveal not just who was linking to whom, but more critically, the importance of who linked to whom, based on various attributes of the site that was doing the linking. Inspired by citation analysis, Page realized that a raw count of links to a page would be a useful guide to that page’s rank. He also saw that each link needed its own ranking, based on the link count of its originating page. But such an approach creates a difficult and recursive mathematical challenge – you not only have to count a particular page’s links, you also have to count the links attached to the links. The math gets complicated rather quickly.
Fortunately, Page was now working with Brin, whose prodigious gifts in mathematics could be applied to the problem. Brin, the Russian-born son of a NASA scientist and a University of Maryland math professor, emigrated to the US with his family at the age of 6. By the time he was a middle schooler, Brin was a recognized math prodigy. He left high school a year early to go to UM. When he graduated, he immediately enrolled at Stanford, where his talents allowed him to goof off. The weather was so good, he told me, that he loaded up on nonacademic classes – sailing, swimming, scuba diving. He focused his intellectual energies on interesting projects rather than actual course work.
Together, Page and Brin created a ranking system that rewarded links that came from sources that were important and penalized those that did not. For example, many sites link to IBM.com. Those links might range from a business partner in the technology industry to a teenage programmer in suburban Illinois who just got a ThinkPad for Christmas. To a human observer, the business partner is a more important link in terms of IBM’s place in the world. But how might an algorithm understand that fact?
Page and Brin’s breakthrough was to create an algorithm – dubbed PageRank after Page – that manages to take into account both the number of links into a particular site and the number of links into each of the linking sites. This mirrored the rough approach of academic citation-counting. It worked. In the example above, let’s assume that only a few sites linked to the teenager’s site. Let’s further assume the sites that link to the teenager’s are similarly bereft of links. By contrast, thousands of sites link to Intel (the business partner in our example), and those sites, on average, also have thousands of sites linking to them. PageRank would rank the teen’s site as less important than Intel’s – at least in relation to IBM.
This is a simplified view, to be sure, and Page and Brin had to correct for any number of mathematical culs-de-sac, but the long and the short of it was this: More popular sites rose to the top of their annotation list, and less popular sites fell toward the bottom.
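To see how that recursive definition can actually be computed, here is a toy power-iteration sketch in Python. It follows the textbook form of PageRank later described in Page and Brin’s published paper, not Google’s actual implementation; the damping factor and the three-page graph (echoing the IBM example above) are assumptions made for the illustration.

```python
# Toy power-iteration PageRank: a page's score depends on the scores of
# the pages linking to it, so we iterate until the scores settle down.
# This is a textbook sketch with an invented graph, not Google's code.

def pagerank(links, damping=0.85, iterations=50):
    """links maps each page to the list of pages it links to."""
    pages = list(links)
    n = len(pages)
    ranks = {page: 1.0 / n for page in pages}  # start everyone equal
    for _ in range(iterations):
        new_ranks = {page: (1.0 - damping) / n for page in pages}
        for source, targets in links.items():
            if targets:
                # A page passes its rank, in equal shares, to its link targets.
                share = damping * ranks[source] / len(targets)
                for target in targets:
                    new_ranks[target] += share
            else:
                # A page with no outbound links spreads its rank everywhere.
                for page in pages:
                    new_ranks[page] += damping * ranks[source] / n
        ranks = new_ranks
    return ranks

toy_web = {
    "ibm": ["partner"],    # IBM links back to its business partner
    "partner": ["ibm"],    # the well-connected partner links to IBM
    "teen_site": ["ibm"],  # the teen links to IBM; nobody links to him
}
print(pagerank(toy_web))
# "ibm" scores highest: it is linked to by a page that is itself well
# linked, while "teen_site" has no backlinks and sinks to the bottom.
```

The iteration embodies exactly the recursion described above: you cannot score a page until you have scored the pages that link to it, so you refine all the scores together until they stop moving.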
As they fiddled with the results, Brin and Page realized their data might have implications for Internet search. In fact, the idea of applying BackRub’s ranked page results to search was so natural that it didn’t even occur to them that they had made the leap. As it was, BackRub already worked like a search engine – you gave it a URL, and it gave you a list of backlinks ranked by importance. “We realized that we had a querying tool,” Page recalls. “It gave you a good overall ranking of pages and ordering of follow-up pages.”
Page and Brin noticed that BackRub’s results were superior to those from existing search engines like AltaVista and Excite, which often returned irrelevant listings. “They were looking only at text and not considering this other signal,” Page recalls. That signal is now better known as PageRank. To test whether it worked well in a search application, Brin and Page hacked together a BackRub search tool. It searched only the words in page titles and applied PageRank to sort the results by relevance, but its results were so far superior to the usual search engines – which ranked mostly on keywords – that Page and Brin knew they were onto something big.
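As a rough sketch of what such a tool might have looked like – the titles, URLs, and scores below are invented, not recovered from BackRub – matching query words against page titles and ordering the hits by a precomputed PageRank takes only a few lines of Python:

```python
# Hypothetical title-only search: keyword matching picks the candidates,
# and a link-derived score (PageRank) decides the order they appear in.

page_titles = {
    "ibm.com": "IBM business computing",
    "partner.com": "IBM business partner solutions",
    "teen_blog": "my new thinkpad and other business",
}

# Assume scores were computed as in the sketch above (values invented).
pagerank_scores = {"ibm.com": 0.49, "partner.com": 0.46, "teen_blog": 0.05}

def title_search(query):
    """Return pages whose titles contain every query word, best-ranked first."""
    words = query.lower().split()
    hits = [url for url, title in page_titles.items()
            if all(word in title.lower() for word in words)]
    return sorted(hits, key=lambda url: pagerank_scores[url], reverse=True)

print(title_search("business"))  # ['ibm.com', 'partner.com', 'teen_blog']
```

The division of labor is the insight the text describes: the keywords decide which pages qualify, while the link signal decides which of them deserve to be seen first.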
Not only was the engine good, but Page and Brin realized it would scale as the Web scaled. Because PageRank worked by analyzing links, the bigger the Web, the better the engine. That fact inspired the founders to name their new engine Google, after googol, the term for the numeral 1 followed by 100 zeroes. They released the first version of Google on the Stanford Web site in August 1996 – one year after they met.
Among a small set of Stanford insiders, Google was a hit. Energized, Brin and Page began improving the service, adding full-text search and more and more pages to the index. They quickly discovered that search engines require an extraordinary amount of computing resources. They didn’t have the money to buy new computers, so they begged and borrowed Google into existence – a hard drive from the network lab, an idle CPU from the computer science loading docks. Using Page’s dorm room as a machine lab, they fashioned a computational Frankenstein from spare parts, then jacked the whole thing into Stanford’s broadband campus network. After filling Page’s room with equipment, they converted Brin’s dorm room into an office and programming center.
The project grew into something of a legend within the computer science department and campus network administration offices. At one point, the BackRub crawler consumed nearly half of Stanford’s entire network bandwidth, an extraordinary fact considering that Stanford was one of the best-networked institutions on the planet. And in the fall of 1996 the project would regularly bring down Stanford’s Internet connection.
“We’re lucky there were a lot of forward-looking people at Stanford,” Page recalls. “They didn’t hassle us too much about the resources we were using.”
A Company Emerges

As Brin and Page continued experimenting, BackRub and its Google implementation were generating buzz, both on the Stanford campus and within the cloistered world of academic Web research.
One person who had heard of Page and Brin’s work was Cornell professor Jon Kleinberg, then researching bibliometrics and search technologies at IBM’s Almaden Research Center in San Jose. Kleinberg’s hubs-and-authorities approach to ranking the Web is perhaps the second-most-famous approach to search after PageRank. In the summer of 1997, Kleinberg visited Page at Stanford to compare notes. Kleinberg had completed an early draft of his seminal paper, “Authoritative Sources,” and Page showed him an early working version of Google. Kleinberg encouraged Page to publish an academic paper on PageRank.
Page told Kleinberg that he was wary of publishing. The reason? “He was concerned that someone might steal his ideas, and with PageRank, Page felt like he had the secret formula,” Kleinberg told me. (Page and Brin eventually did publish.)
On the other hand, Page and Brin weren’t sure they wanted to go through the travails of starting and running a company. During Page’s first year at Stanford, his father died, and friends recall that Page viewed finishing his PhD as something of a tribute to him. Given his own academic upbringing, Brin, too, was reluctant to leave the program.
Brin remembers speaking with his adviser, who told him, “Look, if this Google thing pans out, then great. If not, you can return to graduate school and finish your thesis.” He chuckles, then adds: “I said, ‘Yeah, OK, why not? I’ll just give it a try.’”
Steve’s Principles
1. Value people. Their choices. Their issues. Their problems.
People who feel respected and valued always go the extra mile to outperform even the highest client and management expectations.
2. Hire A players. A players bring in A+ players, B players bring in Cs, C players bring in Ds, and so on.
Soon your A players have left the company, and you’re left surrounded by what Guy Kawasaki calls ‘industry rejects’: mediocre people who can’t secure a job anywhere else and who stick with you under the pretext of loyalty to the company. Steve Jobs used to say, ‘One bozo gets another bozo.’ Soon you’re surrounded by bozos. To prevent this from happening, always hire A players – whatever the cost.
3. Invest in People. Identify the A players in existing groups, promote them, give them faith and trust, and hand over responsibility to them.
If they are A players, they will NOT fail.
Hire for attitude, then train for skills.
4. Allow people to Experiment. And fail. If required.
The more people experiment and bring an entrepreneurial attitude to their work, the more satisfied they will be with that work, the projects they work on, and the people they work with – and, most importantly, the more positive an outlook they will have on their work lives.
5. Allow people to Grow.
Some of the Indian companies I worked for in the past had amazing people – who soon left. They were extremely well paid, but they left because they didn’t see any overarching aim, goal, or vision for the company.
Good people thrive on the best ideas.
6. Last, always let the BEST IDEAS win.
When once asked about this, Steve Jobs said, “You have to let the BEST IDEAS win.” Some ideas will be good, some fucking amazingly good, some dopey, and some fucking horrible. Let people be honest, let them debate, and always let the best ideas win. I worked for some companies in the past where questioning someone in a meeting was considered derogatory; management would often say, ‘Hey man, you’re putting people down! They feel rejected – sad!’ And I was like – dude, what the fuck? If the person felt put down, maybe he needed to read up before the presentation, or develop some depth of understanding before the debate. If it’s a presentation for the sake of a presentation, then you’re thriving in a company full of bozos.
Travis Kalanick
It has been nothing short of a bumpy ride for Travis Kalanick, but we still can’t get enough of him. Kalanick’s controversies began long before Uber started ticking off cab companies across the globe. His first venture, Scour.com, morphed from a web search engine into a successful peer-to-peer file exchange – that is, until 2000, when more than 30 media companies sued Scour for $250 billion for copyright infringement.
No stranger to opposition, Kalanick is not afraid to rub people the wrong way in pursuit of his vision. Uber faced a battle with regulators back in 2010, when it was still called UberCab, and things are not very different now, with recent women’s safety issues and a ban against the company in Germany.
How does the brash, young entrepreneur keep his conviction in the face of criticism and resistance? Check out the five quotes below from Kalanick himself.
1. On holding your ground:
“Stand by your principles and be comfortable with confrontation. So few people are, so when the people with the red tape come, it becomes a negotiation.”–January 25, 2013, Wall Street Journal
2. On dominating the market:
“What we maybe should’ve realized sooner was that we are running a political campaign and the candidate is Uber. And this political race is happening in every major city in the world. And because this isn’t about a democracy, this is about a product, you can’t win 51 to 49. You have to win 98 to 2.”–November 4, 2014, Vanity Fair
3. On pioneering your business model before someone else does:
“If there is to be a low-cost Uber, Uber will be the low-cost Uber.”–January 25, 2013, Wall Street Journal
4. On taking charge of disruption and innovation:
“It’s not Pinterest where people are putting up pins. You’re changing the way cities work, and that’s fundamentally a third rail.”–May 28, 2014, Code Conference
5. On not letting go of his successful company:
“You’re asking somebody who has a wife and is really happily married, ‘So, what’s your next wife going to be like?’ And I’m like, ‘What?’”–November 4, 2014, Vanity Fair
It is not the strongest of the species that survives, nor the most intelligent, but the one most responsive to change.