Startups at the speed of light: Lidar CEOs put their industry in perspective

As autonomous cars and robots loom over the landscapes of cities and jobs alike, the technologies that empower them are forming sub-industries of their own. One of those is lidar, which has become an indispensable tool for autonomy, spawning dozens of companies and attracting hundreds of millions in venture funding.

But like any industry built on top of fast-moving technology, the lidar and sensing business rests, almost by definition, on a foundation of shifting sands. New research advancing the state of the art appears weekly, and new partnerships are minted just as often, as car manufacturers like Audi and BMW scramble to keep ahead of their peers in the emerging autonomy economy.

To compete in the lidar industry means not just to create and follow through on difficult research and engineering, but to be prepared to react with agility as the market shifts in response to trends, regulations, and disasters.

I talked with several CEOs and investors in the lidar space to find out how the industry is changing, how they plan to compete, and what the next few years have in store.

Their opinions and predictions sometimes synced up and at other times diverged completely. For some, the future lies manifestly in partnerships they have already established and hope to nurture; others feel it’s too early for automakers to commit, and that those automakers are stringing startups along one non-exclusive contract at a time.

All agreed that the technology itself is obviously important, but not so important that investors will wait forever for engineers to get it out of the lab.

And while some felt a sensor company has no business building a full-stack autonomy solution, others suggested that’s the only way to attract customers navigating a strange new market.

It’s a flourishing market but one, they all agreed, that will experience a major consolidation in the next year. In short, it’s a wild west of ideas, plentiful money, and a bright future — for some.

The evolution of lidar

I’ve previously written an introduction to lidar, but in short, lidar units project lasers out into the world and measure how they are reflected, producing a 3D picture of the environment around them.
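To make the principle concrete, here’s a minimal sketch (my own illustration, not any particular vendor’s pipeline) of how a single time-of-flight return becomes a 3D point: the round-trip time of the laser pulse gives the range, and the known direction the laser was pointing places that range in space. The function and numbers here are hypothetical.

```python
import math

C = 299_792_458.0  # speed of light, m/s

def return_to_point(round_trip_s, azimuth_rad, elevation_rad):
    """Convert one lidar return into an (x, y, z) point relative to the sensor."""
    distance = C * round_trip_s / 2.0  # the pulse travels out and back
    x = distance * math.cos(elevation_rad) * math.cos(azimuth_rad)
    y = distance * math.cos(elevation_rad) * math.sin(azimuth_rad)
    z = distance * math.sin(elevation_rad)
    return (x, y, z)

# A pulse that returns after ~133 nanoseconds hit something roughly 20 meters away.
print(return_to_point(133e-9, math.radians(45), math.radians(2)))
```

A real unit repeats this measurement millions of times per second across a sweep of angles, and the accumulated points form the 3D picture.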

Police body-cam maker Axon says no to facial recognition, for now

Facial recognition is a controversial enough topic without bringing in everyday policing and the body cameras many (but not enough) officers wear these days. But Axon, which makes many of those cameras, solicited advice on the topic from an independent research board, and in accordance with its findings has opted not to use facial recognition for the time being.


Apple’s Sidecar just really *gets* me, you know?

With the rollout of Apple’s public beta software previews of macOS and the new iPadOS, I’ve finally been able to experience first-hand Sidecar, the feature that lets you use an iPad as an external display for your Mac. This is something I’ve been looking to make work since the day the iPad was released, and it’s finally here – and it’s just about everything you could ask for.

These are beta software products, and I’ve definitely encountered a few bugs, including my main Mac display blanking out and requiring a restart (that’s totally fine – betas by definition aren’t fully baked). But Sidecar is already a game-changer, and one that I will probably have a hard time living without in the future – especially on the road.

Falling nicely into the ‘it just works’ Apple ethos, setting up Sidecar is incredibly simple. As long as your Mac is running macOS 10.15 Catalina and your iPad is nearby, with Bluetooth and Wi-Fi enabled and the iPadOS 13 beta installed, you just click the AirPlay icon in your Mac’s menu bar and the iPad should show up as a display option.

Once you select your iPad, Sidecar quickly displays an extended desktop from your Mac on the iOS device. It’s treated as a true external display in macOS System Preferences, so you can arrange it with other displays, mirror your Mac and more. The one thing you can’t do that you can with traditional displays is change the resolution – Apple keeps things at a default of 1366 x 1024, but that maps directly onto your iPad’s native resolution (2732 x 2048, via Retina pixel doubling, on the first-generation 12.9-inch iPad Pro I’m using for testing), which means there’s nothing weird going on with pixelated graphics or funky text.

Apple also turns on, by default, both a virtual Touch Bar and a new feature called ‘Sidebar’ (yes, it’s a Sidebar for your Sidecar) that provides a number of useful commands, including the ability to call up the Dock, summon a virtual keyboard and quickly access the Command key. This is particularly useful if you’re using the iPad on its own, away from the attached Mac; when you’re deep in a drawing application and just looking to do quick things like undo, Apple has a dedicated button in Sidebar for that, too.

The Touch Bar is identical to the hardware Touch Bar Apple has included on MacBook Pros since its introduction in 2016. The Touch Bar has always been kind of a ‘meh’ feature, and some critics vocally prefer the entry-level 13-inch MacBook Pro model that does away with it altogether in favor of an actual hardware Escape key. And on the iPad using Sidecar, you also don’t get what might be its best feature: Touch ID. But if you’re using Sidecar specifically for photo or video editing, it’s amazing to be able to have it called up and sitting there ready to go as an app-specific, dedicated quick-action toolbar.

Best of all, Apple made it possible to easily turn off both these features, and to do so quickly right from your Mac’s menu bar. That way, you get the full benefit of your big beautiful iPad display. Sidecar will remember this preference too for next time you connect.

Also new to macOS Catalina is a hover-over menu for the default window controls (those three ‘stoplight’ circular buttons that appear at the top left of any Mac app). Apple now provides options to go fullscreen, tile your app left or right to take up 50% of your display or, if you’re using Sidecar, quickly move the app to the Sidecar display and back.

This quick shuffle action works great, and it also respects your existing window settings: you can move an app window that you’ve manually resized to take up a quarter of your Mac’s display, and when you send it back from the Sidecar iPad, it’ll return to its original size and position. It’s definitely a nice step up in terms of native support for managing windows across multiple displays.

I’ve been using Sidecar wirelessly, though it also works wired, and Apple has said there shouldn’t really be any performance disparity regardless of which way you go. So far, the wireless mode has exceeded all my expectations, and outperformed any third-party competitor, in terms of reliability and quality. It also works with the iPad Pro keyboard case, which makes for a fantastic input alternative if you happen to be closer to that than to the keyboard you’re using with your Mac.

Sidecar also really shines for digital artists, because it supports input via Apple Pencil immediately in apps that have already built in support for stylus input on Macs, including Adobe Photoshop and Affinity Photo. I’ve previously used a Wacom Cintiq 13HD with my Mac for this kind of thing, and I found Apple’s Sidecar to be an amazing alternative, not least because it’s wireless and even the 12.9-inch iPad Pro is so much more portable than the Wacom device. Input has very little response lag (it’s not even really perceptible), there’s no calibration required to make sure the Pencil lines up with the cursor on the screen, and, as I mentioned above, combined with the Sidebar and its dedicated ‘Undo’ button, it’s an artistic productivity machine.

The Pencil is the only means of touch input available with Sidecar, and that’s potentially going to be weird for users of other third-party display extender apps, most of which support full touch input for the extended Mac display they provide. Apple has intentionally left out finger-based touch input because macOS just wasn’t designed for it, and in use that actually tracks with what my brain expects, so it probably won’t be too disorienting for most users.

When Apple introduced the 5K iMac, it left out one thing that had long been a mainstay of that all-in-one desktop – Target Display Mode. It was a sad day for people who like to maximize the life of their older devices. But Apple has more than made up for it with the introduction of Sidecar, which genuinely doubles the utility of any modern iPad, provided you’re someone for whom additional screen real estate, with or without pressure-sensitive pen input, is valuable. As someone who often works on the road and out of the office, Sidecar feels like something I personally designed in the room with Apple’s engineering team.

Apple just released the first iOS and iPadOS 13 beta to everyone

This is your opportunity to get a glimpse of the future of iOS — and iPadOS. Apple just released the first public beta of iOS 13 and iPadOS 13, the next major versions of the operating systems for the iPhone and iPad. Unlike developer betas, anyone can download these betas without a $99 developer account. But don’t forget: it’s a beta.

The company still plans to release the final version of iOS and iPadOS 13.0 this fall (usually September). But Apple is going to release betas every few weeks over the summer. It’s a good way to fix as many bugs as possible and gather data from a large group of users.

As always, Apple’s public betas closely follow the release cycle of developer betas. And Apple released the second developer beta of iOS and iPadOS 13 just last week. So it sounds like the first public beta is more or less the same build as the second developer build.

But remember, you shouldn’t install an iOS beta on your primary iPhone or iPad. The issue is not just bugs — some apps and features won’t work at all. In some rare cases, beta software can also brick your device and make it unusable. Proceed with extreme caution.

I’ve been using the developer beta of iOS and it’s still quite buggy. Some websites don’t work, some apps are broken.

But if you have an iPad or iPhone you don’t need, here’s how to download it. Head over to Apple’s beta website and download the configuration profile. It’s a tiny file that tells your iPhone or iPad to update to public betas like it’s a normal software update.

You can either download the configuration profile from Safari on your iOS device directly, or transfer it to your device using AirDrop, for instance. Install the profile from the Settings app, reboot your device, then head to Software Update in Settings to download the beta. In September, your device should automatically update to the final version of iOS and iPadOS 13 and you’ll be able to delete the configuration profile.

Here’s a quick rundown of what’s new in iOS 13. This year, in addition to dark mode, it feels like every single app has been improved with some quality-of-life updates. The Photos app features a brand new gallery view with autoplaying live photos and videos, smart curation and a more immersive design.

This version has a big emphasis on privacy as well, thanks to a new signup option called “Sign in with Apple” and a bunch of new privacy popups covering Bluetooth and Wi-Fi consent and background location tracking. Apple Maps now features an impressive Google Street View-like feature called Look Around. It’s only available in a handful of cities, but I recommend… looking around, as everything is in 3D.

Many apps have been updated, such as Reminders with a brand new version, Messages with the ability to set a profile picture shared with your contacts, Mail with better text formatting options, Health with menstrual cycle tracking, Files with desktop-like features, Safari with a new website settings menu, etc. Read more on iOS 13 in my separate preview.

On the iPad front, for the first time Apple is giving iOS for the iPad its own name — iPadOS. Multitasking has been improved, the Apple Pencil should feel snappier, Safari is now as powerful as Safari on macOS and more.


This robot crawls along wind turbine blades looking for invisible flaws

Wind turbines are a great source of clean power, but their apparent simplicity — just a big thing that spins — belies complex systems that wear down like any other and can fail with disastrous consequences. Sandia National Labs researchers have created a robot that can inspect the enormous blades of turbines autonomously, helping keep our green power infrastructure in good shape.

The enormous towers that collect energy from wind currents are often only in our view for a few minutes as we drive past. But they must stand for years through inclement weather, temperature extremes, and naturally — being the tallest things around — lightning strikes. Combine that with normal wear and tear and it’s clear these things need to be inspected regularly.

But such inspections can be both difficult and superficial. The blades themselves are among the largest single objects manufactured on the planet, and they’re often installed in distant or inaccessible areas, like the many you see offshore.

“A blade is subject to lightning, hail, rain, humidity and other forces while running through a billion load cycles during its lifetime, but you can’t just land it in a hangar for maintenance,” explained Sandia’s Joshua Paquette in a news release. In other words, not only do crews have to go to the turbines to inspect them, but they often have to do those inspections in place — on structures hundreds of feet tall and potentially in dangerous locations.

Using a crane is one option, but the blade can also be oriented downwards so that an inspector can rappel along its length. Even then, the inspection may be no more than eyeballing the surface.

“In these visual inspections, you only see surface damage. Often though, by the time you can see a crack on the outside of a blade, the damage is already quite severe,” said Paquette.

Obviously better and deeper inspections are needed, and that’s what the team decided to work on, with partners International Climbing Machines and Dophitech. The result is this crawling robot, which can move along a blade slowly but surely, documenting it both visually and using ultrasonic imaging.

A visual inspection will see cracks or scuffs on the surface, but the ultrasonics penetrate deep into the blades, making them capable of detecting damage to interior layers well before it’s visible outside. And it can do it largely autonomously, moving a bit like a lawnmower: side to side, bottom to top.
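As a rough illustration of that lawnmower-style coverage (my own sketch, not Sandia’s control software), you can think of the crawler sweeping across the blade, stepping up by one sensor swath, and sweeping back, so the ultrasonic head eventually passes over every patch of the surface. The function and numbers below are hypothetical.

```python
def lawnmower_path(blade_width_m, section_length_m, swath_m):
    """Yield (across, along) waypoints for a side-to-side, bottom-to-top sweep."""
    along = 0.0
    left_to_right = True
    while along <= section_length_m:
        start, end = (0.0, blade_width_m) if left_to_right else (blade_width_m, 0.0)
        yield (start, along)   # the sweep begins at one edge...
        yield (end, along)     # ...and ends at the other
        along += swath_m       # step up by one ultrasonic swath
        left_to_right = not left_to_right

# Waypoints for a 4-meter-wide stretch of blade, covered in 0.5-meter passes.
for waypoint in lawnmower_path(4.0, 2.0, 0.5):
    print(waypoint)
```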

Of course at this point it does it quite slowly and requires human oversight, but that’s because it’s fresh out of the lab. In the near future teams could carry around a few of these things, attach one to each blade, and come back a few hours or days later to find problem areas marked for closer inspection or scanning. Perhaps a crawler robot could even live onboard the turbine and scurry out to check each blade on a regular basis.

Another approach the researchers took was drones — a natural enough solution, since the versatile fliers have been pressed into service for inspection of many other structures that are dangerous for humans to get around: bridges, monuments, and so on.

These drones would be equipped with high-resolution cameras and infrared sensors that detect the heat signatures in the blade. The idea is that as warmth from sunlight diffuses through the material of the blade, it will do so irregularly in spots where damage below the surface has changed its thermal properties.
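The analysis itself can be sketched simply (again, my own illustration rather than Sandia’s actual method): compare each spot in an infrared frame against its immediate neighborhood and flag spots that run noticeably hotter or colder than their surroundings, since subsurface damage changes how heat moves through the laminate. The function and threshold below are hypothetical.

```python
import numpy as np

def thermal_anomalies(ir_frame, window=5, threshold_c=1.5):
    """Return a mask of pixels that deviate from their local neighborhood temperature."""
    pad = window // 2
    padded = np.pad(ir_frame, pad, mode="edge")
    local_mean = np.zeros_like(ir_frame, dtype=float)
    rows, cols = ir_frame.shape
    for i in range(rows):
        for j in range(cols):
            local_mean[i, j] = padded[i:i + window, j:j + window].mean()
    return np.abs(ir_frame - local_mean) > threshold_c

# Toy frame: a uniform 20 C blade surface with one warm spot over hidden damage.
frame = np.full((8, 8), 20.0)
frame[3, 4] = 24.0
print(thermal_anomalies(frame).astype(int))
```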

As automation of these systems improves, the opportunities open up: a quick pass by a drone could let crews know whether any particular tower needs closer inspection, then trigger the live-aboard crawler to take a closer look. Meanwhile, the humans are on their way, arriving with a better picture of what needs to be done and no need to risk life and limb just to take a look.


Crowdfunded spacecraft LightSail 2 prepares to go sailing on sunlight

Among the many spacecraft and satellites ascending to space on Monday’s Falcon Heavy launch, the Planetary Society’s LightSail 2 may be the most interesting. If all goes well, a week from launch it will be moving through space — slowly, but surely — on nothing more than the force exerted on it by sunlight.

LightSail 2 doesn’t have solar-powered engines, or use solar energy or heat for some secondary purpose; it will literally be propelled by the physical force of photons hitting its immense shiny sail. Not solar wind, mind you — that’s a different thing altogether.

It’s an idea, Planetary Society CEO and acknowledged Science Guy Bill Nye explained in a press call ahead of the launch, that goes back centuries.

“It really goes back to the 1600s,” he said; Kepler deduced that a force from the sun must cause comet tails and other effects, and “he speculated that brave people would one day sail the void.”

And so they might: more recent astronomers and engineers have since pondered the possibility in earnest.

“I was introduced to this in the 1970s, in the disco era. I was in Carl Sagan’s astronomy class… wow, 42 years ago, and he talked about solar sailing,” Nye recalled. “I joined the Planetary Society when it was formed in 1980, and we’ve been talking about solar sails around here ever since then. It’s really a romantic notion that has tremendous practical applications; there are just a few missions that solar sails are absolutely ideal for.”

Those would primarily be long-term, medium-orbit missions where a craft needs to stay in an Earth-like orbit, but still get a little distance away from the home planet — or, in the future, long-distance missions where slow and steady acceleration from the sun or a laser would be more practical than another propulsion method.

Mission profile

The eagle-eyed among you may have spotted the “2” in the name of the mission. LightSail 2 is indeed the second of its type; the first launched in 2015, but was not planned to be anything more than a test deployment that would burn up after a week or so.

That mission had some hiccups, with the sail not deploying to its full extent and a computer glitch compromising communications with the craft. It was not meant to fly via solar sailing, and did not.

“We sent the CubeSat up, we checked out the radio, the communications, the overall electronics, and we deployed the sail and we got a picture of that deployed sail in space,” said COO Jennifer Vaughn. “That was purely a deployment test; no solar sailing took place.”

The spacecraft itself, minus the sail, of course.

But it paved the way for its successor, which will attempt this fantastical form of transportation. Other craft have done so, most notably JAXA’s IKAROS mission to Venus, which was quite a bit larger — though as LightSail 2’s creators pointed out, not nearly as efficient as their craft — and had a very different mission.

The brand new spacecraft, loaded into a 3U CubeSat enclosure — that’s about the size of a loaf of bread — is piggybacking on an Air Force payload going up to an altitude of about 720 kilometers. There it will detach and float freely for a week to get away from the rest of the payloads being released.

Once it’s safely on its own, it will fire out from its carrier craft and begin to unfurl the sail. From that loaf-sized package will emerge an expanse of reflective Mylar with an area of 32 square meters — about the size of a boxing ring.

Inside the spacecraft’s body is also what’s called a reaction wheel, which can be spun up or slowed down in order to impart the opposite force on the craft, causing it to change its attitude in space. By this method LightSail 2 will continually orient itself so that the photons striking it propel it in the desired direction, nudging it into the desired orbit.
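The arithmetic behind that trick is just conservation of angular momentum: whatever spin the motor puts into the wheel comes out of the spacecraft body in the opposite direction. Here’s a minimal sketch with made-up moments of inertia (not LightSail 2’s actual values):

```python
def body_rate_change(wheel_inertia, wheel_speed_delta, body_inertia):
    """Angular momentum added to the wheel is subtracted from the spacecraft body."""
    return -(wheel_inertia * wheel_speed_delta) / body_inertia

# Hypothetical numbers: spinning a small wheel up by 100 rad/s slowly rotates the body the other way.
print(body_rate_change(wheel_inertia=1e-4, wheel_speed_delta=100.0, body_inertia=0.05))
# -0.2 rad/s
```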

1 HP (housefly power) engine

The thrust produced, the team explained, is very small — as you might expect. Photons have no mass, but they do (somehow) have momentum. Not a lot, to be sure, but it’s greater than zero, and that’s what counts.

“In terms of the amount of force that solar pressure is going to exert on us, it’s on the micronewton level,” said LightSail project manager Dave Spencer. “It’s very tiny compared to chemical propulsion, very small even compared to electric propulsion. But the key for solar sailing is that it’s always there.”

“I have many numbers that I love,” cut in Nye, and detailed one of them: “It’s nine micronewtons per square meter. So if you have 32 square meters you get about a hundred micronewtons. It doesn’t sound like much, but as Dave points out, it’s continuous. Once a rocket engine stops, when it runs out of fuel, it’s done. But a solar sail gets a continuous push day and night. Wait…” (He then argued with himself about whether it would experience night — it will, as you see in the image below.)

Bruce Betts, chief scientist for LightSail, chimed in as well, to make the numbers a bit more relatable: “The total force on the sail is approximately equal to the weight of a house fly on your hand on Earth.”

Yet apply that housefly’s weight continuously, hour after hour, and pretty soon you’ve built up a very real change in velocity. This mission is meant to find out whether we can capture that force.
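To put rough numbers on the analogy (a back-of-the-envelope of my own, assuming a CubeSat mass of around 5 kilograms): roughly 100 micronewtons of thrust produces a minuscule acceleration, but left to accumulate over a day of sailing it becomes a measurable change in velocity. In practice the usable figure is lower, since the sail has to tack relative to the sun and spends part of each orbit in Earth’s shadow.

```python
force_n = 100e-6         # ~100 micronewtons of photon pressure, per the quote above
mass_kg = 5.0            # approximate CubeSat mass (assumption, not an official figure)

acceleration = force_n / mass_kg          # ~2e-5 m/s^2
delta_v_per_day = acceleration * 86_400   # seconds in a day

print(f"{acceleration:.1e} m/s^2, about {delta_v_per_day:.2f} m/s per day")
# 2.0e-05 m/s^2, about 1.73 m/s per day
```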

“We’re very excited about this launch,” said Nye, “because we’re going to get to a high enough altitude to get away from the atmosphere, far enough that we’re really gonna be able to build orbital energy and take some, I hope, inspiring pictures.”

Second craft, same (mostly) as the last

The LightSail going up this week has some improvements over the last one, though overall it’s largely the same — and a relatively simple, inexpensive craft at that, the team noted. Crowdfunding and donations over the last decade have provided quite a bit of cash to pursue this project, but it’s still only a small fraction of what NASA might have spent on a similar mission, Spencer pointed out.

“This mission is going to be much more robust than the previous LightSail 1, but as we said previously, it’s done by a small team,” he said. “We’ve had a very small budget relative to our NASA counterparts, probably 1/20th of the budget that a similar NASA mission would have. It’s a low-cost spacecraft.”

Annotated image of LightSail 2, courtesy of Planetary Society.

But the improvements are specifically meant to address the main problems encountered by LightSail 2’s predecessor.

Firstly, the computer inside has been upgraded to be more robust (though not radiation-hardened) and given the ability to sense faults and reboot if necessary — they won’t have to wait, as they did for LightSail 1, for a random cosmic ray to strike the computer and cause a “natural reboot.” (Yes, really.)

The deployment of the sail itself has also been improved. The previous one only extended to about 90% of its full width and couldn’t be adjusted after the fact. Since then, tests have been done, Betts told me, to determine exactly how many revolutions the motor must make to extend the sail to 100%. Not only that, but the team has put markings on the extending booms, or rods, that will help double-check how deployment has gone.

“We also have the capability on orbit, if it looks like it’s not fully extended, we can extend it a little bit more,” he said.

Once it’s all out there, it’s uncharted territory. No one has attempted this kind of mission before; even IKAROS had a totally different flight profile. The team is hoping their sensors and software are up to the task, and it should be clear whether that’s the case within a few hours of unfurling the sail.

It’s still mainly an experiment, of course, and what the team learns from it will go into any future LightSail mission they attempt, but it will also be shared with the spaceflight community and others attempting to sail on sunlight.

“We all know each other and we all share information,” said Nye. “And it really is — I’ve said it as much as I can — it’s really exciting to be flying this thing at last. It’s almost 2020 and we’ve been talking about it for, well, for 40 years. It’s very, very cool.”

LightSail 2 will launch aboard a SpaceX Falcon Heavy no sooner than June 24th. Keep an eye on the site for the latest news and a link to the live stream when it’s almost time for takeoff.