Technology industry leaders Atmel Corporation, Broadcom Corporation, Dell, Intel Corporation, Samsung Electronics Co., Ltd., and Wind River are joining forces to establish a new industry consortium focused on improving interoperability and defining the connectivity requirements for the billions of devices that will make up the Internet of Things (IoT).

With the introduction of the new Photos app and iCloud Photo Library, enabling you to safely store all of your photos in iCloud and access them from anywhere, there will be no new development of Aperture. When Photos for OS X ships next year, users will be able to migrate their existing Aperture libraries to Photos for OS X.

Apple is working to open up camera controls in iOS 8, giving photographers granular control over settings such as ISO and shutter speed, reports AnandTech. While only a few manual controls will make it into the stock Camera app, nearly full manual control will be made available to third-party app developers in iOS 8.
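
For a rough sense of what that third-party control could look like, here is a minimal Swift sketch, assuming the AVFoundation custom-exposure API (AVCaptureDevice.setExposureModeCustom plus the activeFormat ISO and exposure-duration limits); the function name and the specific values are illustrative, not taken from the report.

```swift
import AVFoundation

// Sketch: apply a manual shutter speed and ISO to the default video device.
// Requested values are clamped to what the device's active format supports.
func applyManualExposure(shutterSeconds: Double, iso: Float) {
    guard let device = AVCaptureDevice.default(for: .video),
          device.isExposureModeSupported(.custom) else { return }
    do {
        try device.lockForConfiguration()

        // Clamp the shutter speed to the supported exposure-duration range.
        let minSec = CMTimeGetSeconds(device.activeFormat.minExposureDuration)
        let maxSec = CMTimeGetSeconds(device.activeFormat.maxExposureDuration)
        let seconds = min(max(shutterSeconds, minSec), maxSec)
        let duration = CMTime(seconds: seconds, preferredTimescale: 1_000_000_000)

        // Clamp ISO to the sensor's supported range.
        let clampedISO = min(max(iso, device.activeFormat.minISO), device.activeFormat.maxISO)

        // Switch to custom exposure with the requested settings.
        device.setExposureModeCustom(duration: duration, iso: clampedISO, completionHandler: nil)

        device.unlockForConfiguration()
    } catch {
        print("Could not lock the camera for configuration: \(error)")
    }
}

// Example: roughly a 1/60 s shutter at ISO 200.
applyManualExposure(shutterSeconds: 1.0 / 60.0, iso: 200)
```

In a real camera app, those two parameters would presumably be wired to on-screen sliders rather than hard-coded.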

Slingshot does let you send a shot to just one person, but the app doesn’t really encourage it. Slingshot’s killer feature is its Select All button, something Snapchat diehards have begged for: it lets you send a photo to all of your friends at once. Snapchat has been adamant about leaving the much-requested feature out, and for good reason. If you give users the ability to select all, they’ll do it all the time, which potentially lessens the importance of every push notification you receive. In my experience testing Slingshot this past weekend, I’ve received a ton of notifications. Facebook tells me that in its testing of the app with a much larger group, testers might receive dozens of Slingshot notifications per hour.

Last fall, Snapchat added perhaps its biggest new feature to date: “My Story,” an interface that lets users share multiple snaps with all of their friends in a feed that doesn’t disappear immediately. Since launch, Stories have focused on what a single user shares, but Snapchat is expanding the feature a bit today with a new, collaborative “Our Story” option.

That, at least, is one of the hopes expressed by Steve Mollenkopf, the CEO of the wireless chipmaker Qualcomm. In an interview with Kara Swisher at the Code Conference today, Mollenkopf said the company’s dominance of the market for chips that go into smartphones and tablets can lead to applications for driverless cars. One of them, he said, is computer vision. “Everyone wants to emulate what you can see in your eye,” he said. “The driverless car requires computer vision, and where that’s being developed more than anywhere is on the phone…. On the car, having computer vision enables things like parking and safety features.”

Low-cost medical and security cameras could be possible in the future thanks to a new multispectral light sensor developed by University of Surrey researchers. The sensor can detect light across a broad spectrum, from ultraviolet (UV) through visible to near-infrared.

Rob Shields has been wearing a phone around his neck since 2012 in order to take one photo per minute. This persistent lifelogging has come with some technological and social hurdles. At the 2013 Quantified Self Global Conference, Rob explained some of the issues he has run into as he nears 300,000 photos. He also talked about the interesting data he has been able to gather through this practice, such as understanding whom he meets and how he spends his time. Watch his talk below to learn more about his practice and the other interesting insights lifelogging has provided him.