Uncategorized

shared content

Retailers and banks can also reduce risk by moving away from cards that use magnetic strips, which are easily faked. Many countries in Europe, Asia and elsewhere have already replaced magnetic strips with chips, which are harder to duplicate. Chip-based cards also require customers to enter a secure code before they can be used. That’s partly why the United States accounts for nearly half of all global credit card fraud, even though it generates only about a quarter of all credit card spending. American retailers, including Target, have resisted (foolishly, as it turns out) the introduction of chip-based cards because they would have to invest in new equipment to handle them. (Target now says it supports chip-based cards.)

No security measure will ever rid the economy of theft and fraud completely. But there is evidence that companies could do a lot more to protect data.

http://ift.tt/1n1t4Ta


“So, for many art forms, it is indeed true that ‘anyone could do that’, in the sense that anyone has the technology or technique to hand to execute the idea. It has become possible for more and more people, often untrained, to express their creative imagination as doing so has become less and less dependent on technical expertise. However, not everyone can have the ideas, the eye or the ear to come up with something worth making real. That core of invention remains elusive, beyond most of us most of the time. The best answer to the moan ‘I could have done that’ remains ‘but you didn’t’. No one else came up with the geometric lines and block colours of Mondrian before he did, not because they lacked the skill, but because they lacked the vision. Technology and trends in art have not, therefore, made really good art more democratic, they have simply widened the membership of the elite.”

http://ift.tt/1iqOuG6


Doctors do “Google” their patients. In fact, the vast majority of physicians I know have done so. To my generation, using a search engine like Google comes as naturally as sharing pictures of our children or a recent vacation on a social networking site like Facebook. But it surprises me that more physicians don’t pause and think about what it means for the patient-doctor relationship.

What if one finds something that is not warm and fuzzy? I recently read about a case in which a 26-year-old woman went to a surgeon wanting to have a prophylactic double mastectomy, citing an extensive history of cancer in her family. However, she was not willing to undergo any work-up, and her medical team noted several inconsistencies in her story. When they searched online, it turned out she had set up multiple Facebook accounts soliciting donations for malignancies she never had. One page showed her with her head shaved, as if she had already undergone chemotherapy. The surgeons immediately decided to halt her care.

http://well.blogs.nytimes.com/2014/01/06/when-doctors-google-their-patients-2/?ref=todayspaper


Which leads nicely to Lanier’s final big point: that the value of these new companies comes from us. “Instagram isn’t worth a billion dollars just because those 13 employees are extraordinary,” he writes. “Instead, its value comes from the millions of users who contribute to the network without being paid for it.” He adds, “Networks need a great number of people to participate in them to generate significant value. But when they have them, only a small number of people get paid. This has the net effect of centralizing wealth and limiting overall economic growth.” Thus, in Lanier’s view, is income inequality also partly a consequence of the digital economy.

It is Lanier’s radical idea that people should get paid whenever their information is used. He envisions a different kind of digital economy, in which creators of content — whether a blog post or a Facebook photograph — would receive micropayments whenever that content was used. A digital economy that appears to give things away for free — in return for being able to invade the privacy of its customers for commercial gain — isn’t free at all, he argues.

http://www.nytimes.com/2014/01/07/opinion/nocera-will-digital-networks-ruin-us.html?ref=todayspaper


Rising expectations aren’t a sign of immature “entitlement.” They’re a sign of progress — and the wellspring of future advances. The same ridiculous discontent that says Starbucks ought to offer vegan pumpkin lattes created Starbucks in the first place. Two centuries of refusing to be satisfied produced the long series of innovations that turned hunger from a near-universal human condition into a “third world problem.”

Complaining about small annoyances can be demoralizing and obnoxious, but demanding complacency is worse. The trick is to simultaneously remember how much life has improved while acknowledging how it could be better. In the new year, then, may all your worries be first world problems.

http://www.bloomberg.com/news/2014-01-02/two-cheers-for-first-world-problems-.html


This online ad customization technique is known as behavioral targeting, but Pandora adds a music layer. Pandora has collected song preference and other details about more than 200 million registered users, and those people have expressed their song likes and dislikes by pressing the site’s thumbs-up and thumbs-down buttons more than 35 billion times. Because Pandora needs to understand the type of device a listener is using in order to deliver songs in a playable format, its system also knows whether people are tuning in from their cars, from iPhones or Android phones or from desktops.

So it seems only logical for the company to start seeking correlations between users’ listening habits and the kinds of ads they might be most receptive to.

http://www.nytimes.com/2014/01/05/technology/pandora-mines-users-data-to-better-target-ads.html?ref=todayspaper


Musicmetric listed 20 artists whose work had been illegally downloaded 64.5 million times in 2013. About 70 percent of the downloads were albums; 30 percent were individual tracks.

Mr. Mars’s music accounted for 5,783,556 of those downloads, followed closely by Rihanna (5,414,166), Daft Punk (4,212,361) and Justin Timberlake (3,930,185). Other artists on the list include Flo Rida, Kanye West, Eminem, Jay Z, Maroon 5, Adele and Katy Perry (who holds the No. 20 position, with 2,318,740 downloads).

http://artsbeat.blogs.nytimes.com/2014/01/02/bruno-mars-tops-illegal-download-chart/?ref=todayspaper


On Friday, at 12 A.M., Beyoncé staged the death of several paradigms by releasing her album “Beyoncé” on iTunes. It has fourteen songs, with a full-blown music video—not a Vine or a MacBook confessional—for each one, plus a few extra videos. The bundle costs $15.99 and many, many people with computers bought it. Billboard now reports that “Beyoncé” is the “fastest-selling album ever in the iTunes store,” with almost nine hundred thousand copies sold since Friday. So, in secret, Beyoncé planned and executed an entire album, and somehow nobody leaked the news or the files. Artists have been practicing the sudden release for several years—Radiohead’s “In Rainbows” is often credited as the first significant example—but there’s never been an out-of-the-blue release of this scope and significance. In her sole statement, Beyoncé said she wanted to recapture the “immersive” experience of everyone hearing an album all at once. She got her wish.

So what died on Friday? Nothing: this drop was a demonstration, kind of like the Trinity test. Yes, social media promoted the release for free, meaning that marketing budgets could potentially shrink for incredibly famous people on major labels, like Beyoncé. Not everyone is building up to an instant profit on release day, and maybe no one who hasn’t first been pumped into the mainstream by the majors can expect such a response. But “Beyoncé” proved that we could be spared viral campaigns and fake leaks and Pepsi ads. It’s not surprising that “Beyoncé” is excellent (the pros often work better faster); what is exciting is watching the minor rearrangement within the Knowles-Carter universe, and then seeing the rippling effects throughout the critical community. It is now painfully clear that, just as there is no one way to release an album, there is no single critical response anymore. Years of message boards and blogs and tweeting set up a crossfire that is more interesting and robust than any single review ever will be. The only consensus is that Beyoncé matters—the rest is a firefight in your pocket.

http://www.newyorker.com/online/blogs/sashafrerejones/2013/12/beyonce-new-album-review.html


Let me tell you a little story about innovation and creativity. Years ago, I worked on a wiki-based project to find the first instance of ideas/techniques in video games (like the first game to use cameras as weapons, or the first game to have stealth as a play element). It excited me to dig in and give credit to those who laid the foundations of ideas that we now take for granted. I couldn’t wait to show the world how creative and innovative these unknown game designers/developers were.

I went into it with much passion and excitement, but unexpectedly, it turned out that there were almost no “firsts”. Every time someone put up a game that was the first to do/contain something, someone else put up an earlier game with a SLIGHTLY less sophisticated, or SLIGHTLY different, version of the same thing. The gradient was so smooth and constant that eventually, the element we were focusing on lost meaning. It became an unremarkable point to address at all. We ended up constantly overwriting people’s work with smaller, less passionate articles, containing a bunch of crappy games that were only technically the first to do something in the crudest manner. Sometimes only aesthetically.

After a lot of time sunk into this project, I came to the conclusion that I was mistaken about innovation/creativity. It would have been a better project to track the path of ideas/techniques than to try to find the first instance of an idea/technique. I had held innovation so highly for years before that, but after this project, I saw just how small it was. How it was but a tiny extension of the thoughts of millions before it. A tiny mutation of a microscopic speck that lay on top of a mountain. It was a valuable experience that helped me very much creatively.

http://simondlr.com/post/49353423006/on-creativity-and-innovation
