This could be the year that Docker hits the big time

At Docker's DockerCon conference in San Francisco on June 22.

Docker is hot. That much was clear from the buzz — and the news — around this year’s DockerCon. The evidence suggests that the billion-dollar company responsible for accelerating the “container revolution” is at a tipping point.

550 people attended DockerCon in 2014. This year, there were at least 2,100. Docker chief executive Ben Golub shared a few more stats during his keynote to show how much the ecosystem has grown since the last event:

  • The number of open-source contributors to Docker went from 460 to 1,300, a 183 percent surge.
  • The number of Docker projects on GitHub went from 6,500 to 40,000, up 515 percent.
  • And perhaps most impressively, the number of Docker-related job listings went from 2,500 to 43,000, an increase of more than 1,600 percent.

VentureBeat’s Jordan Novet has been covering Docker longer and better than almost anyone else. His coverage of DockerCon 2015 is a great starting point for learning what’s going on with this technology.

Here’s the CliffsNotes version, based on Jordan’s coverage and my conversations with various people this week: Linux containers are a technology for running multiple applications in isolated instances on a single physical server. Docker made containers far easier to implement, and in the process turned them into a tool used by many developers. With containers, you can build an application in one place (say, on your laptop) and then easily move it to a server in your test environment, and from there to another server (say, in production, or in a public cloud).
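
To make that concrete, here’s a minimal sketch of the build-and-run workflow using the Docker SDK for Python; the image name, port numbers, and the assumption that a Dockerfile sits in the current directory are my own placeholders, not anything Docker prescribes.

    # Minimal build-and-run sketch with the Docker SDK for Python (docker-py).
    # "myapp" and the port mapping are placeholders.
    import docker

    client = docker.from_env()  # connects to the local Docker daemon

    # Build an image from the Dockerfile in the current directory
    image, build_logs = client.images.build(path=".", tag="myapp:latest")

    # Run it as an isolated container, mapping container port 8000 to host port 8080
    container = client.containers.run("myapp:latest", detach=True,
                                      ports={"8000/tcp": 8080})
    print(container.short_id, container.status)

The same image, pushed to a registry, can then be pulled and run unchanged on a test or production server; that portability is the point.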

Container code is open-source, lightweight, and easy to use — but it’s also missing some basic features that more complicated virtualization technologies have. For example, containers don’t have built-in networking capabilities, so they can’t easily communicate with one another. (Or, they didn’t — until this week, when Docker added native software-defined networking features to enable that.)
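
As a rough illustration of that container-to-container networking, here’s a hedged sketch using the same Python SDK to create a user-defined network and attach two containers to it so they can reach each other by name. The network and container names are invented, and a single-host bridge network stands in here for the new multi-host networking Docker announced.

    # Sketch only; names are made up, and a bridge network stands in for
    # Docker's new multi-host, software-defined networking.
    import docker

    client = docker.from_env()
    client.networks.create("app-net", driver="bridge")

    # Containers on the same user-defined network can resolve each other by name
    client.containers.run("redis:latest", name="db", network="app-net", detach=True)
    client.containers.run("myapp:latest", name="web", network="app-net", detach=True)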

As developers have embraced Docker, they’ve forced companies to reckon with these missing features. And those needs are giving rise to a small but ambitious crop of startups interested in helping enterprises plug the holes.

  • WeaveWorks, for instance, offers software-defined networking and network-management tools to connect Docker containers to one another. It has raised $5 million to date.
  • Portworx provides software-defined storage for Docker containers, so they can easily connect to data sources in the cloud or in companies’ own data centers. And it plans to expand beyond storage to other spheres as well. Portworx raised $8.5 million, it announced this week.
  • Rancher Labs provides a lightweight Linux operating system designed for containers, along with management tools. It raised $10 million, it announced earlier this month.

Rancher “makes it simple to run Docker containers in production,” its tagline proclaims, and that’s a clue to where these startups are all aiming. In fact, that was the theme of DockerCon too.

Docker’s popularity is now putting pressure on IT departments to ensure that containers are well-supported in production environments.

Companies that supply enterprise technology to IT departments are feeling the pressure, too, and are beginning to respond. For example, Microsoft has aggressively embraced Docker, announcing container support for Windows Server last October; Microsoft’s Mark Russinovich gave a demonstration at DockerCon showing how he could deploy code simultaneously to a Linux container and to a Windows container. And VMware showed off its own Project Bonneville, which lets people run Docker containers inside VMware virtual machines. Since, as VMware pointed out, a container is sort of like a virtual machine, that’s not as bizarre as it might sound.

In short, this has all the hallmarks of a classic, bottom-up enterprise technology transformation. Compare it to the shift from mainframes and minicomputers to PCs, which was driven by employees bringing tools they needed into the office, and which ultimately forced IT departments to build client-server networks around the new tools. Or compare it to the shift from client-server to Internet architectures, where developers, using what they’d learned building websites and Web applications, gradually forced their IT departments to see the wisdom of basing almost everything on HTTP and TCP/IP. Or compare it to the (still ongoing) shift to the cloud, where developers and business managers, impatient with waiting for their tech teams to implement something critical, just put down a credit card and begin using a software-as-a-service (SaaS) application instead — eventually forcing enterprise IT to accommodate and embrace the cloud, too.

Will Docker succeed in shifting the architecture of enterprise IT? It’s too soon to tell. Dozens of things could go wrong for Docker in the next couple of years. But it’s certainly on the right track, and 2015 may in retrospect look like the moment when a new type of IT infrastructure really started to take off.


Originally published on VentureBeat » Dylan Tweney: http://ift.tt/1SOHs08


Here comes the industrial Internet — and enormous amounts of data

Data lakes will look nothing like this.

The “Internet of Things” is a trendy term that probably makes you think about connected toasters and smart refrigerators. But for GE, it also includes jet engines and power generators.

You might not think of these giant artifacts of industrial culture as part of the Internet of Things, but they will be — and their impact is going to be far bigger than adding some superfluous level of convenience to your dishwasher.

Ibrahim Gokcen, a technology executive at GE, told me recently how GE jet engines include sensors to monitor temperature, pressure, fuel consumption, and more. The sensors themselves aren’t new, but the quantity of data they’re generating is.

Ten years ago, each sensor would generate about 30KB of data per flight, sampling conditions at takeoff, when the plane reached cruising altitude, and again at landing.

Today, the sensors embedded in a GE jet engine sample conditions continuously and generate 500GB of data — per engine — for each flight.

Airlines are interested in that data, of course, since it can help identify problems before they become critical. Whenever a jet engine is in the shop (“off wing” in industry parlance), it’s not helping the airline earn money, and if you’re paying billions of dollars for your jet engines, you need them to be on airplane wings, earning money.

So airlines save all the data, from each engine, for each flight. But that’s where the problems start.

Multiply 500GB by each airplane’s number of engines (two to four, in most modern passenger jets) and by the number of daily flights, and you’ve got a staggering amount of data being generated just by one component of an airline’s infrastructure.
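
A quick back-of-the-envelope calculation shows the scale; the fleet size and flights-per-day figures below are illustrative assumptions, not numbers from GE or any airline.

    # Rough arithmetic only; fleet size and flights per day are assumptions.
    gb_per_engine_per_flight = 500
    engines_per_plane = 2
    flights_per_plane_per_day = 5
    fleet_size = 600

    daily_gb = (gb_per_engine_per_flight * engines_per_plane *
                flights_per_plane_per_day * fleet_size)
    print(daily_gb / 1e6, "petabytes per day, from engines alone")  # 3.0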

For now, most industrial sensor data is simply dumped into a database, just like any other data generated by a corporation. There’s no guarantee that the database used for one component of a business (jet engines) will be compatible with the database used to track data from another component (airplane maintenance records, or human resources records). In fact, it’s a pretty safe bet that any large industrial corporation is going to have dozens if not hundreds of data repositories, or one giant “data lake,” an apt term in that it conjures up an image of a huge, stagnant pool of fetid data, just sitting there doing nothing.

And yet, using that data can confer significant advantages. If you can eke another few percentage points of efficiency out of a power-generating gas turbine, that translates directly into reduced costs and increased margin. Keeping jet engines in service longer is a win, even if it’s only a slight improvement. When you’re a billion-dollar business, every percent of efficiency improvement or cost reduction is worth $10 million, so it’s worth diving deep into the data lakes to pull out the gems.

This is where it gets tricky, because the number of sensors and the amount of data is about to get ridiculous.

Cisco estimated a few years ago that 8.7 billion “things” — including computers — were connected to the Internet. Looking just at devices (not computers or phones), Gartner estimated there would be 4.9 billion connected IoT “things” by 2015. But that number is set to explode: GE estimates there will be 50 billion devices attached to the Internet by 2020.

These devices will mostly be invisible. They won’t inspire page after page of glowing commentary from Internet pundits in high-profile tech news sites. But they will make an enormous difference to the way industry works in the modern world.

If companies are going to make use of all this data they’re generating, they’ll need to start pulling it together. That’s where big-data analysis tools like Apache Spark, data-management and storage tools like Hadoop, and search engines like Maana and Elasticsearch come in.
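
To give a flavor of what “pulling it together” can look like, here’s a hypothetical sketch using Apache Spark’s Python API; the storage path, column names, and temperature threshold are all invented for illustration.

    # Hypothetical PySpark sketch; the path, columns, and threshold are invented.
    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("engine-sensors").getOrCreate()

    readings = spark.read.parquet("hdfs:///sensors/engine_readings/")
    hot_engines = (readings
                   .groupBy("engine_id")
                   .agg(F.max("exhaust_temp_c").alias("peak_temp"))
                   .filter(F.col("peak_temp") > 900))
    hot_engines.show()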

GE, for its part, has its own platform for managing data from the industrial Internet, called GE Predix, which currently generates about $1 billion in revenue. Gokcen said he expects that business to grow to $4 billion to $5 billion over the next few years.

GE is hedging its bets with investments in Maana, Predixion, Ayasdi, and others; it also owns 10 percent of big-data company Pivotal.

In a way, that makes an industrial manufacturing giant like GE into a software company. It’s still making turbines, but the software to analyze all that data is an increasingly important part of the sale, so GE has to make the software, too, and invest in tools that can help it and its customers get the analyses done that they need.

It’s a transition many industrial companies will have to make as they learn how to wrestle with terabytes and petabytes of sensor data.


Originally published on VentureBeat » Dylan Tweney: http://ift.tt/1GkA049


TWiT TV on Apple, Twitter, Reddit, and more

TWiT episode 514

I was on This Week in Tech — also known as TWiT — this past weekend, along with Harry McCracken, Patrick Norton, and Merlin Mann, plus host Leo Laporte.

It’s a two-hour live show, but the time went by fast: Leo keeps things moving along and everybody had something interesting to say about the week’s biggest tech news stories.

Merlin and I joined via Skype, and they put our heads on big monitors on either side of the in-studio guests, so we were sort of virtually at the table. It felt a little strange at first but it worked.

Plus, I got to talk about that time Neil deGrasse Tyson yelled at me.

You can watch the show, get related story links, and download the audio-only podcast version on the TWiT website.


Ex-Cisco CTO Padmasree Warrior uses haiku and painting to find balance in work, life

Former Cisco CTO Padmasree Warrior (right) speaks with Bloomberg's Stephanie Mehta.

SAN FRANCISCO — Former Cisco chief technology officer Padmasree Warrior is a painter and a haiku poet, as well as one of the leading female executives in the technology world.

For her, art and technology are interlinked. Creating art — and she emphasized that she is an amateur artist — involves some of the same challenges of communication and innovation as creating new tech products, she said.

She also spoke about how art helped her take a weekly “digital detox” and get some much-needed perspective on her job, at a time when she was working seven days a week without a break. In that way, making art is like exercise or other recreational pursuits: You might not feel like you have time for it, but it actually helps.

“If you do take that time off, and you come back, you’re more effective,” Warrior said.

“Whatever allows you to not think about work — art, running, unstructured time — makes you a better person,” she said.

Warrior spoke onstage at the Bloomberg Tech conference today, in a talk titled “The Art and Science of Code.”

Warrior joined Cisco in 2008. In recent years, she’s been working on acquisitions, and she singled out Cisco’s purchases of Meraki and Sourcefire as particular highlights.

“The time is right in the industry to do something different,” she said. “For myself, I want to go do something that’s very different.”

Some of her haiku, which she often posts to her Twitter account:

Where thoughts come to rest
Every spot a memory
Blissful with flaws, Home

Rumble of ocean
Footprints gone, not forgotten
Same sand, a new path

Warrior took a seat on Box’s board of directors in March. She stepped down as Cisco’s CTO on June 2, but remains a strategic advisor to the company.

Originally published on VentureBeat » Dylan Tweney: http://ift.tt/1JVEsff


How Harvard Business Review and VentureBeat achieve growth online

I spoke with Josh Macht (Group Publisher, Harvard Business Review) on an episode of HubSpot’s podcast The Growth Show about how our publications have been able to survive — and thrive — in the digital age.

It’s an interesting 30-minute conversation about how digital publications can grow, attract the right audiences, and monetize. Some of the things we discuss include:

  • How Harvard Business Review has transformed into a digital media brand attracting top-ranking executives from around the globe
  • The importance of creating content that is valuable enough to attract subscribers
  • Why today’s media brands need to diversify their revenue streams and not rely on a single business model
  • Thoughts on the changing landscape of the tech media world, including GigaOm, Recode/Vox Media, and AOL/Verizon
  • What the next few years for Harvard Business Review and VentureBeat will look like, including how they optimize for mobile

Thanks to HubSpot for inviting me onto this podcast.


No, Neil deGrasse Tyson didn’t say Apple’s App Store is a ‘watershed moment in civilization’

Astrophysicist Neil deGrasse Tyson, in a video shown by Apple on June 8, 2015.

Astrophysicist and Cosmos host Neil deGrasse Tyson made a brief appearance in an Apple event earlier this week, where he seemed to say that Apple’s App Store was one of humanity’s crowning achievements.

In a recorded video about the App Store — which recently marked its 100 billionth app download, according to Apple — Tyson says something that made my ears prick up.

Right after Apple executive Phil Schiller talks about how huge the App Store has become in the seven years since its launch, Tyson’s voice comes on.

“Apps plus handheld devices — I think that’s a watershed moment in civilization,” Tyson says in the video. “I put it up there with the invention of the microscope and the telescope. Here we live in a time where the most powerful tools ever imagined to investigate and probe our world are in the hand of essentially everyone.”

(The App Store video, called “The incredible impact of developers,” is on this page about Apple’s June 8 event. In the main event video it starts at about minute 78.)

So I contacted Tyson to clarify that statement. And I’m glad I did, because despite Apple’s framing, Tyson was not actually raving about the App Store — or even about apps in general.

“What I said was: Apps alone are pretty useless,” Tyson told me. “The value of apps derives primarily from the ingenious hardware that has been attached to them.”

“Without the hardware,” he continued, “apps would be little more than video games.”

Tyson went on to say that the lines Apple quoted came from a 45-minute interview, during most of which he focused on the amazing achievements of smartphone hardware makers.

And then he laid into me for misconstruing what he actually said.

“I never mentioned Apple, or the App Store,” Tyson told me. Yes, fair point: He didn’t. I’d wager that many who watched the keynote came away with that impression, though, which suggests Apple wasn’t 100 percent straightforward in the way it was presenting his quote.

And when I asked him about a story from a year ago, in which he seemed to slam app developers, he laid into me again.

At the time, he said, “Society has bigger problems than what can be solved with your next app, in transportation, and energy and health.” He didn’t mean that apps are bad, or that they didn’t solve problems, or that app makers were cave-dwellers.

What he meant, Tyson told me, is that apps are dependent on more fundamental innovations in hardware and infrastructure. It is those innovations, combined with the apps that use them, that will help solve fundamental problems — problems that, if we don’t solve them, will have us headed back to the caves of our ancestors.

“People praise an app because it tells you where you are on Earth.” But, he said, “It can only do that because the military put up a system of satellites” — the GPS on which your phone depends to determine its location.

“It’s all about the hardware,” Tyson said. “That hardware is not just, ‘I got my latest smartphone.’ That hardware has stuff in it, and accesses stuff, that enables me to probe and investigate my environment.

“That’s what scientists do.”

But are phones really helping make people into scientists, I wondered? He gave me a long and forceful answer, pointing to a bunch of examples from popular culture showing that science is more significant than ever before. The most popular show on television, he said, is The Big Bang Theory — showing “scientists and geeky people doing geeky things.” Fox aired his show, Cosmos, a 13-part documentary, in primetime. (Yes, it has an app.) On Twitter, 3.7 million people follow him. When he commented about physics gaffes in the movie Gravity, it became a subject for the Today show, evening news shows, and late-night talk shows. CSI, he explained, shows people solving crimes using “real science” — chemistry, geology, physics, biology — not just “Sherlock Holmesian” deductions.

“Yes, I think the public is learning how to think like a scientist. You don’t have to be a scientist, just learn what it means to take a measurement. That’s an important thing for an informed democracy,” Tyson said.

“Anything that gets people paying attention to their environment and measuring things connects you to nature as never before.”

“I’m praising the hardware people,” he said. “The software guys will be there forever … but at the end of the day they’ll just be writing video games without the extraordinary miniaturized hardware that’s contained in our handheld devices.”

And not just processors: The hardware includes sensors, like cameras, microphones, accelerometers, and more.

During our conversation, he gave me a wishlist for future sensors that smartphone makers could incorporate, which would in turn enable even more apps. An ultraviolet light sensor, for instance, could enable an app that would tell you what strength of sunscreen you needed to put on. A molecular analyzer could be used to make a breathalyzer app, a smoke detector app, or a pollution-sensing app.

Because the smartphone crams a huge amount of processing and sensing power into such a small package, and because it’s connected to the Internet and to GPS, it’s incredibly powerful, and enables a host of powerful apps.

Tyson may not be an App Store fan. But he is an unabashed fan of handheld devices, and he is not backing off his claim that they are as significant to civilization as the microscope and telescope.

“I think the smartphone is the most amazing thing there ever was,” Tyson said. “It’s the most amazing object ever created.”

Is he an Apple fan? Actually, yes: He’s been using Apple products since 1985, he said. But he was careful to point out that, although Apple paid him for the interview, the company is not paying him to pitch its products.

“I don’t mind if people use what I say, and if it happens to suit their needs, fine,” he told me.

But he was very clear that he wasn’t endorsing Apple or its App Store, and he repeated his enthusiasm for hardware makers above software makers.

“There are no developers without the hardware,” he told me. “Somebody needs to say that.”

And maybe, he thought, this clarification “will get people to think about the future of building things with hardware with no less vigor than they think about making software.”

——-

Originally published on VentureBeat: http://ift.tt/1S92WEx


It’s time for new adventures

It's been a wild ride, and it's not over yet.

Four years ago, Matt Marshall and Alicia Saribalis hired me to lead VentureBeat’s news team, which at that point comprised about five people. One of them immediately left. (Thanks for the vote of confidence, Anthony.) Most of us sat in a small office in San Francisco, arrayed around a collection of mismatched second-hand desks. There was one salesperson and one tech person. I think the entire staff was under twelve. We reached about 1.5 million people a month.

Fast-forward to today, and we have a newsroom with 16 staffers, spread across San Francisco, New York, Toronto, London, and Toulouse, France, among other locations. VentureBeat’s sales, tech, and events teams have all grown, and the company numbers almost 50. We have matching Ikea desks like any other startup. We reach 6.5 million people monthly. The company produces an excellent set of events every year, and we’re building a data-driven research business that holds huge potential.

At the core of it, though, is the news site. I’m proud of everything the news team has done here in the past four years, and I’m proud of the talent we’ve been able to attract during that time.

As we pause and ready ourselves for our next big growth spurt, the time seemed right for me to step back and take stock. One of the things I’ve enjoyed most about being here at VB is the opportunity I have to talk with investors and startups, learn about their technologies, talk with their founders, and then tell their stories. Yet I’ve had frustratingly little time to do that in recent months.

So Matt and I easily came to agreement this week about a new role for me at VB.

Starting this week, I’ll step down from the editor-in-chief role and become VentureBeat’s editor at large. I’ll continue writing my column, producing our podcast, and contributing to our news coverage and news analysis as needed. And I will advise VB as it moves forward into its next chapter.

In the meantime, Harrison Weber, who has been doing a kick-ass job as our news editor for the past year, will act as our executive editor. Jennifer Tsao will continue keeping the trains running on time as our managing editor for VB, while Jason Wilson will continue overseeing GamesBeat.

In short, little will change. Our dedication to principled journalism and our passion for technology continue. And I’ll still be here, telling those tech stories and looking for the next big thing.

——————

Originally published on VentureBeat: http://ift.tt/1IDCPUB


With great data comes great responsibility

A datacenter server room.

Marketers are now able to personalize messages for their customers (and potential customers) with greater precision than ever before.

Powerful new data analytics systems, marketing automation platforms, and a host of other tools are part of this revolution in targeting. But the availability of huge datasets is also key. Companies are learning how to combine formerly disparate datasets, and that gives them the ability to perform some truly amazing, and sometimes creepy, acts of individual targeting.

I learned some of what’s possible this week at VentureBeat’s first-ever GrowthBeat Summit in Boston. It was a small, invitation-only gathering of chief marketing officers and other senior marketing executives, along with a few vendors of marketing technologies, at the comfy Langham Hotel in the financial district. It’s clear that CMOs everywhere are struggling to understand and capitalize on the wealth of data and the excessive number of tools at their disposal, and at this event, they opened up to each other about their challenges, opportunities, concerns, and difficulties.

One thing that became clear to me: Marketing technologists need to start working very closely with security professionals, right away. Marketers clearly understand the opportunities that data-driven targeting afford them, but I’m not sure the industry fully grasps how much awesome responsibility it is taking on.

For example: One company, Viant (a sponsor of the event), operates a video ad network and holds an enormous database that it uses to help its clients identify promising customers. One reason that database is so big is that Viant owns MySpace. While MySpace is a fraction of its former size (about 40 million active users, making it tiny compared to Facebook), it was once large. Huge, in fact: Over time, more than one billion people registered for MySpace accounts. And while those people might not have logged into their accounts in years, the accounts still contain valuable data: name, gender, date of birth, email address, and perhaps other details, like what bands they liked in 2007 or where they lived. And, critically, everyone who signed up for MySpace checked a box consenting to let the company use this data to personalize advertising messages.

Now Viant can work with a client company to cross-index a database that the company already owns (such as a customer list) with Viant’s billion-person database, adding details that the company didn’t already have. It can use credit reporting agencies such as Experian to add even more detail.
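
Conceptually, that cross-indexing is just a database join on a shared key, such as an email address (in practice usually hashed). Here is a toy sketch in pandas; every column name and value is invented for illustration and has nothing to do with Viant’s or Experian’s actual schemas.

    # Toy illustration of enriching a customer list by joining on email.
    # All data and column names are made up for the example.
    import pandas as pd

    retailer_customers = pd.DataFrame({
        "email": ["a@example.com", "b@example.com"],
        "last_purchase": ["2015-05-01", "2015-04-12"],
    })
    enrichment_db = pd.DataFrame({
        "email": ["a@example.com", "b@example.com"],
        "birth_year": [1979, 1988],
        "household_income": [85000, 62000],
    })

    enriched = retailer_customers.merge(enrichment_db, on="email", how="left")
    print(enriched)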

The result, Viant’s chief revenue officer Jeff Collins told me, is that Viant can help a retailer target people based on such fine-grained attributes as their distance from the retailer’s physical stores, their income, the number of children they have, and more.

Another company executive I spoke with, Lee Odess of Brivo Systems, added another dimension. Brivo makes physical access control systems, the software behind door card readers and similar devices that lock or unlock things once you have authenticated with them. Brivo has realized that, in addition to unlocking things, its software also contains data about where people are in the world, and that data might be useful to its customers. So, for instance, a property management company that operates a skyscraper might offer its tenants the ability to check visitors in and out of reception as soon as they enter the skyscraper lobby, assuming those visitors have a known smartphone. Or it might adjust the environment of the skyscraper lobby based on who is passing through it — perhaps adjusting what appears on video screens when employees of a certain company pass through.

David Cooperstein, who runs a consulting company called Figurr and who formerly led Forrester Research’s CMO practice, described how a retailer can even use a high degree of personalization in its physical mailers. If it sends out 500,000 copies of a mailer to customers, there might be 450,000 variants of that paper, each one customized based on what the retailer knows about its customers already.

These people and others talked about the need to not cross the line into “creepy” when crafting personalized, targeted marketing messages. As for me, though, I’m not too worried about the creepiness of the marketing. At most it’s an annoyance. What’s really scary is what might happen if that data were used by someone who wasn’t just annoying, but who actually had real power to mess with you: the government, for instance. Or hackers.

We know, thanks to the documents that Edward Snowden leaked to the press, that the government is very interested in this kind of data. Just this week the Senate passed the USA Freedom Act, which dialed back the feds’ ability to collect and store the metadata about phone calls people make. Instead, the phone companies hang onto that metadata, which they will turn over to the government if a special court approves.

Phone metadata is just the tip of the iceberg, though. Data companies like Viant, Brivo, Experian, and many more will have data about who your friends are, how big your family is, what your salary is, where you live, and which secure doors you’ve been into and out of. Thanks to retail beacons, they may know where you shop, how long you spend in each aisle, and what stores you pass on the way to work. They’ll know a lot about where other specific people were at the same time you were. If marketers can put this data together, the FBI and the NSA certainly can, too.

And so too can hackers. Coincidentally, on the same day that we were having these discussions at GrowthBeat, I received a letter from my insurance company, Anthem Blue Cross, explaining that hackers had broken into its IT systems during December and January, siphoning off information that may have included my name, birthdate, Social Security number, employment status (including salary), home address, and more. I don’t know why Anthem waited four months to tell me about a hack that may have affected 80 million people and which it learned about in February, but there it is. My data’s out there, and there’s no telling what the hackers may do with it.

In short: The same databases and technologies that make personalization and targeting so effective and powerful in the hands of marketers are also tempting targets for hackers and for surveillance by governments of all kinds.

Marketers are sitting, like Smaug, on piles of treasure. It’s virtually certain that people will be coming to try to take that treasure and make it their own. So marketers delving into the world of marketing tech need to embrace good security practices and form strong relationships with security professionals in their companies. And they need to do it now.

And our country needs to take a harder look at how data like this can be used, stored, transferred, and subpoenaed. With great data comes great responsibility — we need to use it, and protect it, well.


Originally published on VentureBeat: http://ift.tt/1Gdm3cE
