The idea of "ubiquitous computing" most people dream about doesn't usually include the trouble of patching all those devices every week. It doesn't mention that new bugs would be found daily, or that most fixes would arrive weeks, if not months, after the bugs were discovered.
Windows XP has been in the news recently because Microsoft has finally pulled support for the aging OS. Roughly 30% of all active desktops are still on XP, and we now know of a new security bug which will never be fixed for those users.
XP may eventually become the epitome of unpatched, buggy software because of the visibility this issue got, but I feel this is just the tip of the iceberg. For every XP machine out there, I bet there is at least one unpatched networking device just waiting for someone to exploit it, and that number is growing fast. Some of these bugs are just that... bugs, but I suspect most of them stem from less than reputable code and design quality. It's a Wild West out there, and this has to stop.
The other problem with ubiquitous computing is that the number of devices per household is growing rapidly, and manually updating every single one is becoming close to impossible. We need to get to a place where users don't have to worry about updating their devices at all. The industry as a whole needs to do a better job of promoting automated updates and the testing behind them, which requires significantly higher investment by manufacturers. Apple with its iOS update infrastructure and Google with its Chrome updates have shown that it's possible to do this at scale.
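At its core, the device-side half of an automatic update is simple: periodically compare the installed version against a manifest the manufacturer publishes, and fetch anything newer. Here is a minimal sketch in Python; the manifest shape and field names are hypothetical, and a real implementation would also verify a cryptographic signature on the downloaded build before applying it:

```python
# Sketch of a device-side update check. The manifest format below is a
# hypothetical example, not any vendor's real protocol.

def parse_version(v):
    """Turn a dotted version string like '1.4.2' into a tuple (1, 4, 2)
    so versions compare numerically instead of lexically."""
    return tuple(int(part) for part in v.split("."))

def update_available(installed, manifest):
    """Return the newer version string, or None if already up to date."""
    latest = manifest["latest_version"]
    if parse_version(latest) > parse_version(installed):
        return latest
    return None

# Example manifest as a manufacturer's server might publish it.
manifest = {
    "latest_version": "2.1.0",
    "url": "https://vendor.example/firmware/2.1.0.bin",  # placeholder URL
}

print(update_available("2.0.9", manifest))  # a newer build exists
print(update_available("2.1.0", manifest))  # device is current
```

The numeric tuple comparison matters: comparing raw strings would wrongly rank "9.9" above "10.2". The hard part that this sketch omits, and that Apple and Google have invested heavily in, is everything around this check: staged rollouts, signed packages, and safe rollback when an update fails.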
So what can we as users do? For a start, we can make a habit of asking about auto-updates whenever we buy new devices. For connected devices at least, shipping updates shouldn't be "optional". Vote for the right manufacturers with your wallet.