What Television Will Look Like in 2025, According to Netflix | Business | WIRED

In the future, Netflix will know exactly what you want to watch, even before you do. You won’t have to spend all that time browsing through endless lists of shows on your television.

That’s according to Neil Hunt, Netflix’s chief product officer. It’s just one of many predictions for the future of TV that the forward-thinking executive laid out on stage today at New York City’s Internet Week conference, and no one would be surprised if all that came to fruition. If there’s one company that knows about changing the way we watch TV shows and movies, it’s Netflix. From its humble origins as a DVD-by-mail outfit back in 1997 to its current status as a video streaming powerhouse and original content creator, Netflix has already overturned the status quo more than once.

As a slew of other tech companies, from Amazon to Yahoo, compete with Netflix to move television online–and traditional broadcasters fight to protect their old business models–Hunt has a clear vision for how the war for our attention will play out by the year 2025. Here are a few of his predictions:

You’ll Have 48 Million TV Channels

People have traditionally discovered new shows by tuning into the channels that were most aligned with their interests. Love news? Then CNN might be the channel for you. If it’s children’s programming you want, Nickelodeon has you covered. And yet, none of these channels can serve 100 percent of their customers what they want to watch 100 percent of the time.

According to Hunt, this will change with internet TV. He said Netflix is now working to perfect its personalization technology to the point where users will no longer have to choose what they want to watch from a grid of shows and movies. Instead, the recommendation engine will be so finely tuned that it will show users “one or two suggestions that perfectly fit what they want to watch now.”

“I think this vision is possible,” Hunt said. “We’ve come a long way towards it, and we have a ways to go still.” He said Netflix is now devoting as much time and energy to building out that personalization technology as the company put into building the infrastructure for delivering that content in the first place.
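Netflix has not published the details of that engine, but the basic idea Hunt describes (score the titles a viewer hasn't seen by how closely other viewers' tastes match theirs, then surface only the top one or two) can be sketched in a few lines of Python. Everything below is an illustrative assumption: the titles, the ratings, and the simple user-based cosine-similarity approach merely stand in for whatever Netflix actually runs.

```python
# Toy sketch of "one or two suggestions" via user-based similarity.
# Not Netflix's algorithm; an illustration of the general idea only.
from math import sqrt

def cosine(a, b):
    """Cosine similarity between two users over the titles both have rated."""
    shared = set(a) & set(b)
    if not shared:
        return 0.0
    dot = sum(a[t] * b[t] for t in shared)
    na = sqrt(sum(v * v for v in a.values()))
    nb = sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb)

def top_picks(target, others, n=2):
    """Score unseen titles by similarity-weighted ratings; return the best n."""
    scores = {}
    for other in others:
        sim = cosine(target, other)
        for title, rating in other.items():
            if title not in target:
                scores[title] = scores.get(title, 0.0) + sim * rating
    return sorted(scores, key=scores.get, reverse=True)[:n]

ratings_me = {"House of Cards": 5, "The Square": 4}
ratings_others = [
    {"House of Cards": 5, "Orange Is the New Black": 4, "The Square": 3},
    {"The Square": 5, "Mitt": 4},
]
print(top_picks(ratings_me, ratings_others))
# → ['Orange Is the New Black', 'Mitt']
```

A real system blends viewing history, time of day, and countless other signals, but even this toy version collapses a catalog into the short, personalized list Hunt envisions, instead of a grid to browse.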

Creative Freedom Will Come to Hollywood

Hunt knows what you’re thinking: Most of Netflix’s extensive library consists of shows and movies you’d never want to watch. “Some would call it junk,” Hunt admitted on stage.

But he doesn’t see it that way. “There are no bad shows,” he said. “But there are many shows with small, but devoted audiences.” And as Netflix’s personalization engine becomes smarter and smarter, he said, it will become easier for those small audiences to discover new content they might not otherwise have found. That will give people like filmmakers and actors more creative freedom, he explained, because they’ll finally have a distribution channel that will tolerate small, highly individuated audiences.

“Internet TV can afford to carry those small shows,” Hunt said, adding that this approach has already enabled shows and films to thrive on Netflix that might not otherwise have worked on traditional television. The Square, a documentary about the Arab Spring uprisings in Tahrir Square, is one such example. “It’s been so successful on Netflix, and might not have found a home on linear TV.”

Internet TV will also free filmmakers from traditional television formats, Hunt said, in which they get one half-hour or hour-long slot per week with which to hook a viewer. On the internet, a television episode “can be as long or as short as you want, and it doesn’t have to tease you into the next episode because you can binge right into the next episode.” Eventually, you won’t even recognize TV shows as TV shows. “The stories we watch today are not your parents’ TV,” Hunt said, “and the stories your kids watch in 2025 will blow your mind away.”

The Commercial Will Finally Die

Netflix has already proven that it’s possible to build a big business in television without advertisers. Subscription fees, it turns out, do the trick. That means that the proliferation of internet TV may be the final nail in the traditional commercial’s coffin. That would change the entire economics of the advertising industry over the next decade, Hunt explained. “The ad-free model seems to be very popular with consumers,” he said. “We have to imagine that the Geicos and the Wendys and the Chevys will have to find a different place to advertise their wares in 2025.”

But there’s another possibility. According to Hunt, the same technology that delivers personalized content to viewers could also help internet TV service providers select more targeted ads to show their users. “Maybe you only see that Chevy ad if you’re ready to buy the car today,” Hunt said. That means viewers would see fewer ads, and advertisers would get to reach a more relevant audience.

Live Sports Will Arrive on Netflix (Maybe)

OK, so this might not have been one of Hunt’s predictions, per se, but when asked by an audience member whether we might see live television and sports broadcasting on the site in the future, Hunt said we should “stay tuned.”

He noted that bringing live sports to Netflix would change the economics of the company. “Sports franchises end up being able to sell to the highest bidder,” he explained. “It’s not an area where Netflix has the total advantage over everyone else.” But he also said that as Netflix continues to grow its footprint, it’s always a possibility. Such a development would be an even deeper blow to the traditional television industry, as live sports are one big reason people opt not to cut the cord.

Everyone Will Have a Smart TV

In 2014, Hunt said, about 100 million internet-connected TVs will be sold–or about one for every three homes with broadband internet. And by the year 2025, he told his audience, everyone will own a smart TV.

That means the race is on to become the smart TV manufacturer of record, and the competitive landscape is broad. This is a battle that will include cable companies, internet TV providers like Netflix, and tech giants like Google and Apple, as well as television manufacturers like Sony and Samsung. That should usher in a golden age of innovation in the smart TV space, as powerful competitors vie for your attention–and make each other better in the process.


What HBO Can Teach Colleges About 'Trigger Warnings' - Atlantic Mobile

Collegians all over the country are calling for “trigger warnings,” or “explicit alerts that the material they are about to read or see in a classroom might upset them,” the N.Y. Times reports. The wisest activists favor narrowly drawn alerts intended to spare veterans and sexual assault victims from post-traumatic stress. Others want students warned about any content that might stoke anxiety or trauma. Critics of the “trigger warning” movement include academics who worry that requiring alerts in the classroom would chill speech and erode academic freedom. Others argue that the alerts are condescending, showy, or useless.

Strange as it may seem, reflecting on The Sopranos can help us here. The HBO series was as graphically violent as you’d expect of a mob drama: arms and legs are broken to extort protection money; gamblers who can’t cover debts are brutally pummeled; a couple seasons in, I’d seen aggravated assaults, extreme domestic abuse, and more murderous gunshots to heads, chests, and guts than I can recall. Hence my surprise that Season 3, Episode 4 was preceded by a warning I’d never seen. HBO uses standard Pay TV Content Descriptors. I’d been tipped off countless times about “adult content” and “graphic violence.” What I hadn’t known till just prior to that episode is that there’s a special designation for rape.

After watching the episode, that brief warning seemed like a good idea. It isn’t that the character’s rape, awful as it was, is significantly “worse” than other traumas perpetrated in the series–in another episode, for example, a stripper is beaten, killed, and dumped in a ditch on a whim, a scene covered by generic “graphic violence.” But every viewer of The Sopranos knew people would be beaten and killed. Rape, a distinct trauma, is absent from the show aside from one episode, and virtually every viewer was unprepared for the unexpected way it arose. I suspect the preemptive descriptor helped some number of viewers to avoid the scene, or more likely, to brace for it so as to be better prepared to watch.*

Notice that the general concept of “trigger warnings” is not, in fact, unique to feminist blogs or campus activists, even if they’ve cornered that particular buzz phrase. Mainstream, mass-entertainment networks find value in preemptive viewer alerts, even at the cost of tipping everyone off to a future plot development.

Weighing costs and benefits, that Sopranos episode strikes me as deft deployment of a “trigger warning.” But the series illustrates the limits of such alerts too.

The descriptor “Adult Content” may be useful in pay TV as a whole (or may not be), but it had almost no value within the universe of people who watched or were passingly familiar with The Sopranos. Some episodes are more disturbingly violent than others–and individual sensitivities inevitably vary in a mass audience–yet the vague “adult content” label is appended to literally every episode. “Trigger warnings,” by whatever name, are diminished when applied to extreme content that a typical person expects, or when used so ubiquitously that we reflexively ignore the meaningless tip because it isn’t specific enough to be useful.

Unfortunately, college activists aren’t just agitating for warnings to precede unusually graphic content that a reasonable person probably wouldn’t have anticipated.

They’re going much further:

Should students about to read “The Great Gatsby” be forewarned about “a variety of scenes that reference gory, abusive and misogynistic violence,” as one Rutgers student proposed? Would any book that addresses racism — like “The Adventures of Huckleberry Finn” or “Things Fall Apart” — have to be preceded by a note of caution? Do sexual images from Greek mythology need to come with a viewer-beware label?

The N.Y. Times story goes on to quote a draft guide on “trigger warnings” from Oberlin College:

Triggers are not only relevant to sexual misconduct, but also to anything that might cause trauma. Be aware of racism, classism, sexism, heterosexism, cissexism, ableism, and other issues of privilege and oppression. Realize that all forms of violence are traumatic, and that your students have lives before and outside your classroom, experiences you may not expect or understand.

The standard critique here is that a “trigger warning” policy written like that would impinge on academic freedom, chill speech, and infantilize 18, 19, 20, and 21-year-olds. I agree. But most confounding is the notion of students pushing to be warned about classroom material more tame than much of what they encounter in daily life. The Oberlin language is broad enough to cover a huge chunk of network TV shows, hip hop albums, standup comics and Hollywood films. If everything the Oberlin community considers privileged or oppressive were labeled with a “trigger warning,” the warnings would need to be taped all over campus.

Kevin Drum expresses puzzlement. “What I don’t get is what anyone thinks the point of this is,” he writes. “You’re never going to have trigger warnings in ordinary life, right? So even if universities started adopting broad trigger policies, it would accomplish nothing except to semi-protect sensitive students for a few more years of their lives, instead of teaching them how to deal with upsetting material.”

Here’s my theory.

These college activists actually accept the widely held notion that whether a warning is necessary should depend partly on the material one expects to encounter in a given setting. It’s just that they’ve been trained by a subset of professors, administrators, and classmates to believe that the classroom is or ought to be a “safe space;” that inside it, no one should feel upset, anxious or uncomfortable.

If I’m correct, a clarification is emphatically needed, not on the syllabi of individual classes, but during the registration period at the beginning of each term.

"The world is rife with racism, classism, sexism, heterosexism, cissexism, ableism, and other issues of privilege and oppression," the Oberlin course catalog might say. "Students taking courses in the humanities and social sciences should expect to grapple regularly with those phenomena and other fraught, uncomfortable subjects besides, in both course materials and classroom discussions with people who don’t share their values, judgments, or assumptions."

That this doesn’t go without saying is an indictment of leading universities. As a UC Santa Barbara professor put it: “Any student can request some sort of individual accommodation, but… the presumption… that students should not be forced to deal with something that makes them uncomfortable is absurd or even dangerous.” How to study slavery, or the Rwandan genocide, or the Communist purges, or the Holocaust, or the Crucifixion, or the prose of Toni Morrison or James Joyce, or the speeches of MLK, or the debate that surrounds abortion, or psychological experiments about the human willingness to take orders, without risking trauma?

Surely college students should know what’s coming when they set out to plumb human civilization. A huge part of it is a horror show. To spare us upset would require morphine.

Perhaps narrow policies to help sexual assault victims or combat veterans could be useful (I have not yet seen hard evidence demonstrating as much, but anecdotes aren’t nothing). And there are, of course, rare instances when professors should tip students off to specific, unusually extreme content that no one would’ve expected given the context. But I suspect that even in those unusual cases, where the concept behind “trigger warnings” is likely useful, invoking the phrase itself is much less so, because it has become jargon. To eschew the phrase, to be specific, is to force clarity of thought and convey useful meaning.

The alternative, the future before us if the most sweeping plans for “trigger warnings” become reality, is a kind of arms race, where different groups of students demand that their highly particular, politicized sensitivities are as deserving of a trigger warning as any other. Everyone from anarchists to college Republicans will join in. Kids will feel trauma when their trauma isn’t recognized as trauma. “Trigger warnings” will be as common and useless as “adult content” warnings on HBO.

Everyone will be worse off.

*On “trigger warnings,” sentiments like the following seem to be common: “I don’t often skip a post because of a warning, but it gives me a moment to steady myself before reading. For me, it’s being exposed to a trigger unexpectedly that causes anxiety. If I know it’s coming, I’m usually okay. As for this blog, I come here expecting to read about rape and other triggering topics, so i’m already prepared.” (That is from the generally interesting comments thread of an old Amanda Hess post.)


HBO Teaches the Streaming Wannabes How to Make Big Money on Original Shows - Businessweek

Despite a flurry of original, lavishly produced TV programs from Hulu, Netflix, and Amazon, HBO is quietly becoming the next HBO.

True Detective, the eight-episode cop thriller that ran from Jan. 12 to March 9, won almost 12 million viewers per episode, a feat never before accomplished by an HBO series in its first season. The fourth season of Game of Thrones drew some 17 million fans of swords and nudity, more than any HBO series since the final season of The Sopranos in 2007. “HBO’s current lineup includes four of its top five highest-rated shows ever,” Time Warner’s chief executive, Jeffrey Bewkes, crowed on a conference call this morning.

All told, HBO revenue increased 9 percent in the recent quarter, to $1.3 billion, according to an earnings report this morning from Time Warner. What’s more, HBO managed a 36 percent operating margin, which is no small accomplishment in a hit-driven business that requires such expensive employees as Matthew McConaughey, Woody Harrelson, and Julia Louis-Dreyfus.

How impressive is that margin? Consider that Warner Bros., Time Warner’s movie department, booked an operating margin of 12 percent last quarter and a profit of $369 million—a full 20 percent less than HBO’s. And the movie studio’s bottom line benefited from the huge success of The Lego Movie. Other Time Warner entertainment properties didn’t do much better.
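As a rough check on that comparison, using the article’s rounded figures: a 36 percent operating margin on about $1.3 billion of revenue implies roughly $468 million in operating profit for HBO, which puts Warner Bros.’ $369 million about 20 percent (more precisely, around 21 percent) behind. A quick back-of-the-envelope sketch:

```python
# Back-of-the-envelope check of the margins the article cites (figures in millions).
hbo_revenue = 1300                      # ~$1.3 billion for the quarter
hbo_margin = 0.36                       # 36 percent operating margin
hbo_profit = hbo_revenue * hbo_margin   # implied operating profit, ≈ $468 million

wb_profit = 369                         # Warner Bros. operating profit
gap = 1 - wb_profit / hbo_profit        # ≈ 0.21, i.e. roughly "20 percent less"

print(round(hbo_profit), round(gap * 100))
```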

It’s not that HBO’s hits have smaller budgets than you might think. After all, the network spent $8 million on one Game of Thrones episode. It’s just that its flops are relatively rare—though not unheard of. Girls generates plenty of cultural conversation, but only 670,000 viewers tuned in for its Season 3 finale in late March, down from the 1.1 million who watched the first episode.

But even those smaller audience numbers are insulated a bit by the subscription model. HBO executives don’t have to worry about each installment striking a chord with advertisers, and seasons are kept short enough that the network isn’t compelled to pull the plug midseason on a subpar offering.

And the growing field of streaming services is still clamoring for HBO’s leftovers. So-called content revenue from old shows increased 13 percent in the recent quarter, a $24 million improvement. Time Warner believes that the more people who watch HBO shows on places other than the cable-TV channel, the more likely they are to become subscribers—a philosophy that has led to a hands-off approach to those borrowing HBO Go passwords from friends and family. “We firmly believe that if you have great content, giving consumers control over where, when, and on what platform they watch it will drive increased consumption and value,” Bewkes said.

Just last week, Amazon struck a licensing deal to offer its Prime members a catalogue of old HBO hits, including the miniseries Band of Brothers and the crime drama The Wire. Time Warner has long since stopped paying for those episodes, but the shows keep pouring money into the coffers. Play the model out a bit, and we should see the same thing from True Detective, Veep, and a certain show with dragons and knights. If only Time Warner could pull off the same trick with old copies of Time magazine.


This Google Motherboard Means Trouble for Intel | Enterprise | WIRED

The Intel chip factory in Chandler, Arizona was christened by President Barack Obama.

In 2012, while it was under construction, the President made a pit stop at the plant, known as Fab 42, painting it as a symbol of American optimism. “The factory that’s being built behind me is an example of an America that is within our reach–an America that attracts the next generation of good manufacturing jobs,” Obama said.

It was a noble vision, but for Intel, things didn’t exactly work out as planned. The chip giant eventually mothballed the $5 billion factory, and the construction site is now a symbol of a different kind. Fab 42 represents an Intel in transition, a company that’s struggling to evolve with a changing world.

It’s not just that people are buying iPads and Android phones built with low-power ARM processors instead of PCs and phones and tablets powered by Intel chips–the main reason the Chandler plant was put on hold. It’s that the big online companies, including Google and Facebook and Amazon, are now looking to run their operations on computer servers that use chips made by someone other than Intel. And the first trend may ultimately feed the second.

The latest blow to Intel’s future arrived on Monday in the form of a red server motherboard touted by Gordon MacKean, the man responsible for building the hundreds of thousands of servers that power Google’s online empire. In a Google+ post, MacKean said he was “excited” to show off the red motherboard, which was built using not an Intel chip, but IBM’s Power8 processor.

To the outsider, the motherboard may not look like much, but the fact that Google has taken the time and effort to port its software to IBM’s architecture and even design a motherboard based on an IBM processor is a big deal. Since its beginning, back in 1998, Google has used servers equipped with Intel processors, and today the company is one of the world’s largest buyers of Intel server chips. The search giant doesn’t make servers for anyone but itself, but it’s likely the fifth-largest Intel server chip customer on Earth.

Why is Google tinkering with a brand new microprocessor? “We’re really driven by an aggressive demand. The growth at Google has been very significant,” MacKean says. In other words, Google keeps growing, and so the massive collection of servers that runs Google must keep growing too. Yes, the company can keep expanding its operation using Intel chips. But it behooves Google to use other chip suppliers. That’s a way to cut costs, but it’s also a way to ensure that the chips it uses just keep getting better. Companies like Google don’t want to rely solely on Intel. They want competition in the market. They want to play one chip maker off another.

The Two Intels

For more than a decade, Intel’s chip operation has been a beautiful thing: two parallel lines of business, delivering both staggering volume and high margins. There’s the desktop business, and the server business. But as Intel’s client business struggles, it could affect the server side of the company. As Christos Kozyrakis, a computer science professor at Stanford University, points out, Intel will typically build a desktop chip and then remake it for the server world. “They take exactly the same core with different caches, different memory controllers,” he says, “and they put it on server.”

This lets Intel spread a single chip’s development costs over several parts of the company. And that’s important. Designing a new processor core is a major undertaking, one that can take hundreds of engineers several years to complete. But it’s unclear whether this arrangement will work as well in the future. “Up until now, it seemed to be the case that whatever was good for one segment of the market was good for another,” says Kozyrakis. “Now the question becomes: has this changed?”

Intel says that desktop shipments are rebounding of late, and that server improvements are being cranked out like never before. Indeed, the company has a massive advantage in the server business. Google, Facebook, Microsoft, Amazon — all of the web giants overwhelmingly use Intel-based x86 servers. But as Intel struggles with the desktop market, it’s facing increased competition on the server side. In addition to Google exploring IBM’s Power chips, Facebook has long made noises about using ARM and other low-power chips in its servers. And now it seems Amazon is looking at the same thing.

The difference with ARM and Power is that any outside manufacturers can license the designs and modify them as need be. That’s not the case with Intel’s x86 architecture. The onus is on Intel to innovate. ARM has always licensed out its architecture, and now IBM has formed a group called OpenPower, where memory makers, graphics chip companies, and other component vendors can come together and help build the kind of systems that the Googles of the world are already clamoring for. “If you look at x86, x86 is not creating this open ecosystem environment to let everybody come in and innovate on their platform,” says Brad McCready, an IBM Fellow.

Christopher Lameter, an R&D team lead with JumpTrading, a Chicago-based high-frequency trading firm, says he hopes that the OpenPower effort will lead to new types of chip design that will be useful to customers like JumpTrading. He worries that the desktop slowdown will ultimately hurt new Intel developments on the server side, some four to six years down the line. And at the same time, he’s excited by some of the new things that have been developed in the mobile phone world. “On the kernel level, it seems that ARM/Android [is] driving innovation,” he says.

But the truth is that today, nobody is certain where the next great server breakthrough will come from. For years, chipmakers got huge performance gains by shrinking the size of their chip components. But today’s chip components are becoming so tiny that they can’t be shrunk for much longer. And the best idea right now seems to be building chips that are custom designed to be really fast at ferrying and processing data for web applications. IBM dreams that OpenPower will do that.

What’s more, if innovation is happening with ARM and Power, there’s pressure on Intel to follow suit.

The Spur of Competition

Intel’s client-side slowdown is real. A decade ago, the company’s client business was growing by 11 percent per year. In 2013, it shrank by 4 percent, according to data compiled by Mercury Research, a microchip analyst firm. But Mercury’s principal analyst Dean McCarron doesn’t think that the desktop slowdown is having any effect on server innovation.

But he agrees that there’s one thing that can boost innovation in the server space: competition. Two years ago, Intel didn’t have much of that. But with OpenPower and ARM pushing into the game, everything is changing. “When there’s a lot of competition, there’s a lot more product innovation,” McCarron says.

That said, it will take a lot more than a red motherboard to displace the king of the hill. Says McCarron: “Intel has every incentive to retain this market because of how lucrative it is.”


Watch "What Is Aereo and Why Is It So Controversial?"


The Future of Media Will Be Streamed - Atlantic Mobile

Today, fewer people are walking through those doors. Music stores are shuttering, devastated by the digitization of music, which sent physical albums into the dustbin of history. Meanwhile, cinema has been eclipsed by TV as the juggernaut of video entertainment. The number of annual movie tickets bought per person has declined from 4.8 to 4 in the last decade. This is what economists call “structural decline,” what couch potatoes have called “waiting until it comes out on DVD,” and what Hollywood executives call “a reason to pray rather fervently that people will pay more for the 3D version of the fourth sequel of the second reboot of our comic franchise.”

It’s ironic, then, that just as the Internet and TV have conspired to devastate the old business models of music and movies, they’ve also come together to create new business models to save them. A new report from Moffett Nathanson estimates that since 2010, streaming (e.g., Netflix and Hulu) has gone from zilch to a $3 billion business, while DVDs, which many hoped would rescue the movie industry, have declined sharply in the same period.

(via Instapaper)




By Whitson Gordon

We’ve often sung the praises of having your own home server for quiet backup, torrenting, and other services. But, if you don’t want to build a new computer, DIYer Darknezz shows us how he put together a home server out of nothing but a broken laptop’s motherboard.

Source: lifehacker.com


Amazon Fire TV vs. Apple TV vs. Roku 3 vs. Chromecast

The newly announced Amazon Fire TV wants to take over your living room with its tiny, flat media box, but with lots of other existing devices on the market, you’ll want to consider all streaming options first before making a purchase.

Fire TV — a set-top box, gaming console hybrid — will directly compete against streaming devices like Apple TV, Roku and Google Chromecast, but it will also take on gaming consoles such as Xbox One, PlayStation 4 and even the Android-powered Ouya.

For the sake of comparing Fire TV to existing set-top boxes — and leaving the gaming consoles out for now — there’s a lot to be said about its bells and whistles. First, it uses voice search when you speak into the remote and boasts a stronger quad-core processor than the Roku 3 (dual-core), Apple TV (single-core) and Chromecast (single-core).

Source: marketinginfographics