Now this is creepy. We've wondered before about security on the Internet of Things, but now it's getting truly disturbing — it turns out that many of the webcams we have aimed at our babies are accessible by just about anyone. According to J.M. Porup of Ars Technica in the U.K., Shodan, a search engine for the Internet of Things, has indexed feeds from all kinds of insecure webcams:

The feed includes images of marijuana plantations, back rooms of banks, children, kitchens, living rooms, garages, front gardens, back gardens, ski slopes, swimming pools, colleges and schools, laboratories, and cash register cameras in retail stores.

The problem, according to security researcher Dan Tentler, is that as IoT hardware gets cheaper, security is one of the things that gets left out.

Tentler told Ars that webcam manufacturers are in a race to the bottom. Consumers do not perceive value in security and privacy and, as a rule, have not shown a willingness to pay for them. As a result, webcam manufacturers slash costs to maximize profit, often on narrow margins. Many webcams now sell for as little as £15 or $20.

The problem goes well beyond baby monitors, extending into significant infrastructure: connected cars and medical equipment. Much of this is pretty insecure, and as another expert noted, “the consequences of failure are higher than something as shocking as a Shodan webcam peering into the baby’s crib.”

Many in the industry have been worried about this issue for a while; the U.S. Federal Trade Commission has issued official guidance on the subject, urging manufacturers to “bake in security at the design phase rather than bolting it on as an afterthought.”

There's even a new initiative from the White House to create a sort of Underwriters Laboratories for cybersecurity. Headed by former Google security expert Peiter Zatko, it will be a nonprofit that helps people choose the best, most secure equipment. Zatko told the Council on Foreign Relations website about the venture, Cyber-ITL:

In the computer security realm, we have been trying for decades to get the general public to care about security. Now they do care, but they have no way of differentiating good security products from bad ones. In fact, some of the most insecure software on the market can be the very security software that is supposed to protect you.

Zatko sees many benefits in being a private nonprofit that receives government support:

A project like this needs significant transparency to ensure trust of the public. It needs to be non-partisan and with commercial money out of the picture to ensure there are not perverse incentive structures that would work against the goal of publicly disseminating impartial information about commercial (and open source) products.

This is perhaps the most troubling aspect of the Internet of Things: the worry about who is watching and listening. And the worry isn't just about scammers and voyeurs; many people don’t trust the government either. Just look at the current attempts to put back doors into the encryption on cellphones. Do we want back doors into our baby monitors and Nest cams?

It’s one of the main reasons that Apple’s HomeKit connected products are rolling out so slowly; the company is insisting that products be redesigned “to incorporate the mandatory HomeKit chips and firmware, and pass Apple’s strict checklist of requirements.”

That makes a lot of sense. You can’t expect everyone who buys a baby monitor to figure out passwords and security settings on their own. Security should be built right in from the start.

Lloyd Alter ( @lloydalter ) writes about smart (and dumb) tech with a side of design and a dash of boomer angst.