Over the last year or so in the US, a lot of the plastic credit cards we carry around every day have been replaced by new ones with chips embedded in them. The chips are supposed to make your credit and debit cards more secure—a good thing!—but there’s one little secret no one wants to admit:
The US’s transition to chip cards has been an utter disaster. They’re confusing to use, painfully slow, less secure than the alternatives, and aren’t even the best solution for consumers.
If you’ve shopped in a store and used a credit card, you’ve noticed the change. Retailers have likely asked you to insert the chip end of the card into the reader instead of swiping. But reading the chip seems to take much longer than just swiping. And on top of that, even though many retailers now have chip-reading machines, some of them ask us to do just the opposite—they say not to insert the card, and just swipe. It seems like there’s no rhyme or reason to the whole thing.
When you try to figure out who’s to blame, you end up with a weirdly tangled web of misaligned incentives. Almost everyone involved—banks, credit card companies, retailers and merchants, payment processors, terminal manufacturers—has been focused on their own bottom line, rather than the impact their decisions would have on customers. And that’s created a maelstrom of incompetence.
All of this started when the US decided to move to the chip standard—known in the industry as EMV. The US process was different from that of other countries, where governments instituted a mandate to upgrade everything by a certain date. Instead, the US implemented something called a “liability shift”—essentially, if a retailer didn’t support chip card payments by buying a new, expensive machine, it would be held accountable for any fraud that occurred in its store. …