Gates-backed Lumotive upends lidar conventions using metamaterials

Pretty much every self-driving car on the road, not to mention many a robot and drone, uses lidar to sense its surroundings. But useful as lidar is, it also involves physical compromises that limit its capabilities. Lumotive is a new company with funding from Bill Gates and Intellectual Ventures that uses metamaterials to exceed those limits, perhaps setting a new standard for the industry.

The company is just now coming out of stealth, but it’s been in the works for a long time. I actually met with them back in 2017, when the project was very hush-hush and operating under a different name at IV’s startup incubator. If the terms “metamaterials” and “Intellectual Ventures” tickle something in your brain, it’s because IV has spawned several startups that use intellectual property developed there, building on the work of materials scientist David Smith.

Metamaterials are specially engineered surfaces with microscopic structures — in this case, tunable antennas — embedded in them, all working together as a single device.

Echodyne is another company that used metamaterials to great effect, shrinking radar arrays to pocket size by engineering a radar transceiver that’s essentially 2D and can have its beam steered electronically rather than mechanically.

The principle works for pretty much any wavelength of electromagnetic radiation — you could use X-rays instead of radio waves, for instance — but until now no one had made it work at optical wavelengths. That’s Lumotive’s advance, and the reason it works so well.

Flash, 2D, and 1D lidar

Lidar basically works by bouncing light off the environment and measuring how and when it returns. This can be accomplished in several ways.

Flash lidar sends out a pulse that illuminates the whole scene with near-infrared light (905 nanometers, most likely) all at once. This provides a quick measurement of the whole scene, but limited range, since the light’s power is limited and has to cover the entire scene at once.

2D or raster-scan lidar takes a NIR laser and plays it over the scene incredibly quickly, left to right, down a bit, then again, and again, and again… scores or hundreds of times. Focusing the power into a beam gives these systems excellent range, but much like a CRT TV tracing out an image with its electron beam, it takes rather a long time to complete the whole scene. Turnaround time is naturally of major importance in driving situations.

1D or line scan lidar strikes a balance between the two, using a vertical line of laser light that only has to go from one side to the other to complete the scene. This sacrifices some range and resolution but significantly improves responsiveness.
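
Whichever scanning strategy is used, each individual measurement comes down to the same time-of-flight arithmetic. A minimal sketch (our illustration, not Lumotive’s actual signal processing):

```python
# Minimal sketch of lidar time-of-flight ranging (illustrative only;
# not Lumotive's actual signal processing).
C = 299_792_458.0  # speed of light, m/s

def distance_from_round_trip(delta_t_seconds: float) -> float:
    """Return target distance in meters given a pulse's round-trip time."""
    # The pulse travels out and back, so the one-way distance is half.
    return C * delta_t_seconds / 2.0

# A return arriving 667 nanoseconds after emission implies a target ~100 m out.
print(distance_from_round_trip(667e-9))  # ~99.98
```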

Lumotive offered a diagram comparing the three systems, though obviously “suitability” and “too short” and “too slow” are somewhat subjective.

The main problem with the latter two is that they rely on a mechanical platform to actually move the laser emitter or mirror from place to place. It works fine for the most part, but there are inherent limitations. For instance, it’s difficult to stop, slow, or reverse a beam that’s being moved by a high-speed mechanism. If your 2D lidar system sweeps over something that could be worth further inspection, it has to go through the rest of its motions before coming back to it… over and over.

This is the primary advantage offered by a metamaterial system over existing ones: electronic beam steering. In Echodyne’s case the radar could quickly sweep over its whole range like normal, and upon detecting an object could immediately switch over and focus 90 percent of its cycles tracking it in higher spatial and temporal resolution. The same thing is now possible with lidar.

Imagine a deer jumping out around a blind curve. Every millisecond counts, because the earlier a self-driving system knows the situation, the more options it has to accommodate it. All other things being equal, an electronically steered lidar system would detect the deer at the same time as the mechanically steered ones, or perhaps a bit sooner. Upon noticing this movement, it could not just make more time for evaluating it on the next “pass,” but a microsecond later back up the beam and specifically target just the deer with the majority of its resolution.

Just for illustration. The beam isn’t some big red thing that comes out.

Targeted illumination would also improve the estimation of direction and speed, further improving the driving system’s knowledge and options — meanwhile the beam can still dedicate a portion of its cycles to watching the road, requiring no complicated mechanical hijinks to do so. The system also has an enormous aperture, allowing high sensitivity.

In terms of specs, it depends on many things, but if the beam is just sweeping normally across its 120×25 degree field of view, the standard unit will have about a 20Hz frame rate, with a 1000×256 resolution. That’s comparable to competitors, but keep in mind that the advantage is in the ability to change that field of view and frame rate on the fly. In the example of the deer, it may maintain a 20Hz refresh for the scene at large but concentrate more beam time on a 5×5 degree area, giving it a much faster rate.
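
Some quick math on those numbers shows why steerable beam time matters. This is our back-of-the-envelope arithmetic based only on the figures above, not a breakdown Lumotive has published:

```python
# Back-of-the-envelope point-budget math using the figures quoted above.
# (Our arithmetic; Lumotive hasn't published this breakdown.)
FOV_H_DEG, FOV_V_DEG = 120.0, 25.0   # full field of view, degrees
RES_H, RES_V = 1000, 256             # points per full frame
FRAME_RATE_HZ = 20.0                 # uniform sweep

points_per_second = RES_H * RES_V * FRAME_RATE_HZ  # ~5.12M points/s budget

# A 5x5 degree region of interest is under 1% of the field of view...
roi_fraction = (5.0 * 5.0) / (FOV_H_DEG * FOV_V_DEG)
roi_points_per_frame = RES_H * RES_V * roi_fraction  # ~2,133 points at uniform density

# ...so diverting even a tenth of the beam's time to it multiplies its
# refresh rate by an order of magnitude over the 20Hz baseline.
roi_refresh_hz = 0.10 * points_per_second / roi_points_per_frame
print(f"ROI refresh at 10% of budget: {roi_refresh_hz:.0f} Hz")  # ~240 Hz
```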

Meta doesn’t mean mega-expensive

Naturally one would assume that such a system would be considerably more expensive than existing ones. Pricing is still a ways out — Lumotive just wanted to show that its tech exists for now — but this is far from exotic tech.

CG render of a lidar metamaterial chip.

The team told me in an interview that their engineering process was tricky specifically because they designed it for fabrication using existing methods. It’s silicon-based, meaning it can use cheap and ubiquitous 905nm lasers rather than the rarer 1550nm, and its fabrication isn’t much more complex than making an ordinary display panel.

CTO and co-founder Gleb Akselrod explained: “Essentially it’s a reflective semiconductor chip, and on the surface we fabricate these tiny antennas to manipulate the light. It’s made using a standard semiconductor process, then we add liquid crystal, then the coating. It’s a lot like an LCD.”

An additional bonus of the metamaterial basis is that it works the same regardless of the size or shape of the chip. While an inch-wide rectangular chip is best for automotive purposes, Akselrod said, they could just as easily make one a quarter the size for robots that don’t need the wider field of view, or a larger or custom-shaped one for a specialty vehicle or aircraft.

The details, as I said, are still being worked out. Lumotive has been working on this for years and decided it was time to just get the basic information out there. “We spend an inordinate amount of time explaining the technology to investors,” noted CEO and co-founder Bill Colleran. He, it should be noted, is a veteran innovator in this field, having headed Impinj most recently and worked at Broadcom before that, though he is perhaps best known for being CEO of Innovent when it created the first CMOS Bluetooth chip.

Right now the company is seeking investment after running on a 2017 seed round funded by Bill Gates and IV, which (as with other metamaterial-based startups it has spun out) is granting Lumotive an exclusive license to the tech. There are partnerships and other things in the offing but the company wasn’t ready to talk about them; the product is currently in prototype but very showable form for the inevitable meetings with automotive and tech firms.


The West accepts its drought-ridden future, slashes water use

Exceptionally low water levels in Lake Mead reveal a white “bathtub ring.”
Image: Shutterstock / OSDG

Out West, the future is dry.

Amid an unprecedented 19-year drought in the expansive Colorado River Basin — which supplies water to 40 million Americans — seven Western states have acknowledged that the 21st century will only grow drier as temperatures continue to rise. And that means less water in the 1,450-mile Colorado River. On Tuesday, water managers from states including California, Utah, and New Mexico announced a drought plan (formally called a Drought Contingency Plan), which cuts their water use for the next seven years — until an even more austere plan must be adopted.

Already, the drought has left water levels at Lake Mead — the nearly 250-square-mile reservoir that’s held back by the formidable Hoover Dam — at their lowest levels in half a century. The water shortage has left telltale, white mineral “bathtub rings” around the basin, well over 100 feet high. 

“This is a long anticipated step that clearly needed to happen,” Brad Udall, a senior water and climate research scientist at Colorado State University who had no role in the plan, said in an interview. 

“The bad news is we still have a lot of work left,” added Udall. “As the climate continues to change and warm in the Southwest, all science shows that the river is expected to decline in the future.”

“We all recognize we’re looking at a drier future,” Tom Buschatzke, Director of the Arizona Department of Water Resources, said in a call with reporters on Tuesday. 

Lake Powell, another massive, low Colorado River reservoir.

Image: Shutterstock / GagliardiPhotography

The Colorado River and its reservoirs — though certainly not yet low enough to imperil millions of Westerners — are gradually evaporating while the desert land grows ever drier. The West is still expected to see yearly fluctuations in extreme precipitation, though the region can’t escape the consequences of a steadily rising thermostat. “We’ll continue to see odd and unusual climate extremes, both wet and dry,” said Udall. “But these warmer temperatures just add an environmental load onto the system in very harmful ways.”

Just how much are the West’s rising temperatures and associated heat waves — so hot that they have grounded commercial jetliners in Arizona — drying out the winding Colorado River basin? A study coauthored by Udall last year found that climate change was responsible for half of the Colorado River’s flow declines over the last century (with other factors, like less rain, accounting for the rest). Though, a 2017 study found this number to be a bit lower, at around one-third. 

Either way, the climate effect is substantial and only expected to grow more potent: Over the last 40 years, Earth has experienced an accelerated warming trend and civilization’s heat-trapping carbon emissions probably won’t even peak for at least 10 more years. Already, the planet’s carbon dioxide levels are likely the highest they’ve been in 15 million years.  

“It’s projected to continue to get warmer,” said Ursula Rick, Managing Director of the University of Colorado Boulder’s Western Water Assessment, in reference to the Colorado River Basin. 

Accepting the reality of hotter climes, the latest drought plans will manage the total supply of water so that the two greatest reservoirs, Mead and Powell, don’t drop to levels that would trigger automatic water restrictions and a takeover by the federal government. Lake Mead currently sits at around 1,089 feet. If it ever fell to 1,075 feet, water rationing would go into effect. 
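
In code form, the trigger the plan is designed to avoid is simple. A sketch using only the elevations cited here (the actual agreement defines multiple shortage tiers):

```python
# Sketch of the Lake Mead shortage trigger, using only the elevations cited
# above; the actual agreement defines multiple shortage tiers.
SHORTAGE_TRIGGER_FT = 1_075.0  # elevation at which rationing kicks in
CURRENT_LEVEL_FT = 1_089.0     # roughly today's level

def rationing_triggered(elevation_ft: float) -> bool:
    """Return True if mandatory water restrictions would go into effect."""
    return elevation_ft <= SHORTAGE_TRIGGER_FT

print(rationing_triggered(CURRENT_LEVEL_FT))  # False, for now
print(f"Buffer above the trigger: {CURRENT_LEVEL_FT - SHORTAGE_TRIGGER_FT:.0f} feet")  # just 14 feet
```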

“They’re [drought plans] meant to avoid a crisis on the river,” Bureau of Reclamation Commissioner Brenda Burman said in a Tuesday call with reporters.

One important part of the plan allows upper basin states like Colorado to keep more water in Lake Powell (which generates great amounts of electricity and lowers their utility bills), rather than being required to send this water to the depleted Lake Mead — but only as a reward if upper basin states slash their water use. Meanwhile, lower basin states like Arizona and California are expected to cut their water use, too. 

The Colorado River Basin.

Image: USGS

Today’s water woes are a significant departure from the 1980s and 1990s, fruitful times when Lake Mead even reached its storage capacity. The lake’s levels were nearly 140 feet higher back then. 

“We’ve never seen 19 years like this. It’s unprecedented,” said Udall. “The old ways of managing water in the West aren’t working and won’t work in the 21st century.”

Although this latest drought contingency plan — which must now get approved by Congress — averts a near-term crisis, there’s still a gaping hole in the scheme. California’s thirsty agricultural Imperial Valley — which claims rights to a whopping 70 percent of all the Colorado River water the Golden State is afforded each year — did not agree to the plan. First, the Imperial Irrigation District wants $200 million in taxpayer dollars to fix the nearby environmental catastrophe that is the vanishing Salton Sea. (It’s California’s largest — and often stinkiest — lake, created by accident over a century ago, and is a long, winding water fiasco of its own.) 

But in the future, with water only growing more limited, the powerful water district will almost certainly have to slash its ample share of water consumption. “They’re going to have to contribute,” said Udall. 

Western state temperatures compared to the historic average between 2000 and 2015.

Image: EPA

The fate of the West over the coming decades, however, won’t only be determined by water cuts and humanity’s efforts to stymie climate change. It’s dependent upon how many people choose to settle in the seemingly idyllic, sun-blanketed West. “The growth of the region is a big unknown,” noted the Western Water Assessment’s Rick. 

What’s more, the ability of water to sustain the region is dependent on how people will want to live, said Rick. “Will they want yards? Will they want their food to be locally grown?” she asked. 

In a warmer world, the state and federal government can intervene like they’re doing now to avert water crises and maintain societal desires. Though, at some point, water demand may be too great for a burgeoning, thirsty West. “They can reduce demand — but only to a certain point,” said Rick. 

“The old ways of managing water in the West aren’t working and won’t work in the 21st century.”

Now, states are watching the drought intently. It’s evident that water managers see a water-limited future, stoked by climate change. Udall noted that back in 2003, when he gave talks on future climate impacts to the water management community, he was sometimes given “dirty looks.” 

But 10 years later, that changed. By 2013, the unprecedented drought caused Colorado River reservoirs to plummet. A truly unsettling drought had set in. And it hasn’t gone away.

“Somewhere around 2013 I think the light went on,” said Udall, describing how many water managers began to accept the scientific realities of long-term drought and climate change. “The light went on that climate change is here, and now we gotta prepare for it.”


Enormous, weird fish washes up on an Australian beach. So, what is it?

This is certainly one very fishy encounter.

Two fishers stumbled across quite the surprise when they found a sunfish which had washed onto the beach at Coorong National Park in South Australia.

The photos, taken by Linette Grzelak, were posted on Facebook by National Parks South Australia on Tuesday, and boy, it’s a weird-looking fish.

Grzelak told CNN they thought the fish was a piece of driftwood when they drove past it.

The strange-looking sea creature has since been identified by the South Australian Museum’s ichthyology manager Ralph Foster as an ocean sunfish (Mola mola), due to markings on its tail and the shape of its head.

Ocean sunfish are known for their large size and odd, flattened body shape and fins; in this case, Foster estimates the fish to be 1.8 metres (70 inches) long, which is about average for the species. 

Sunfish get their name because they enjoy basking in the sun on the ocean’s surface. A related species, the hoodwinker sunfish, was only discovered and named in 2017.

“Researchers have been putting satellite tags and data loggers on these fish and found they will come to the surface and lay on their side on the surface, hence the name the sunfish,” Foster explained to the news outlet.

“Once they are warm enough they dive down several hundreds of metres and feed on jellyfish and stay down there for lengthy periods of time.”

Foster said very little was known about sunfish; it’s only in the last few years that researchers have begun to learn more, with the help of technology.

“Because it had evaded recognition and was misidentified for so long it was named the ‘Hoodwinker Sunfish’ by its discoverer,” he added.

“It was thought to be a purely southern hemisphere species but just a couple of weeks ago one made the news when it turned up on a Californian beach, highlighting how little we know about sunfish in general.”

By the way, their size and tendency to sunbake mean that boats can hit them — and the much bigger specimens can actually sink yachts.


NASA photos capture immense flooding of a vital U.S. Air Force base

Flooding in Nebraska as seen from space on March 16, 2019.
Image: NASA Landsat

In 1948, Air Force Secretary Stuart Symington stationed the United States’ long-range nuclear bombers at Offutt Air Force Base in eastern Nebraska, a location safe in the middle of the nation and well-insulated from the coast.

But 70 years later, the base — now home to U.S. Strategic Command, which deters “catastrophic actions from adversaries and poses an immediate threat to any actor who questions U.S. resolve by demonstrating our capabilities” — isn’t safe from historic and record-setting floods.

Intense rains on top of the rapid melting of ample snow have inundated large swathes of Nebraska and a full one-third of Offutt Air Force Base, including the headquarters building.

NASA’s Landsat 8 satellite captured before and after images of the flooding — which the European Union Earth Observation Programme called “biblical.” The overloaded river burgeoned in size, creeping into Offutt, neighborhoods, and farmlands. 

Satellite image from March 20, 2018, a year prior to the flooding.

Image: NASA

Flooded Nebraska on March 16, 2019.

Image: NASA

A number of potent factors mixed to create what Offutt Air Force Base Commander Mike Manion has labeled a “1,000 year flood” — meaning there’s only a one in 1,000 chance of such an extreme event happening in any given year. 
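
That label is about annual odds, not a once-a-millennium schedule, and over a long exposure window those odds add up. A quick sketch of the standard recurrence-interval math:

```python
# A "1,000 year flood" means a 1-in-1,000 chance in any single year, not one
# flood per millennium. Over many years, the odds of at least one add up.
def chance_of_at_least_one(annual_prob: float, years: int) -> float:
    return 1.0 - (1.0 - annual_prob) ** years

print(f"{chance_of_at_least_one(1 / 1000, 30):.1%}")   # ~3.0% over 30 years
print(f"{chance_of_at_least_one(1 / 1000, 100):.1%}")  # ~9.5% over a century
```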

NASA noted that exceptionally cold Arctic blasts (from a wobbly polar vortex) preserved bounties of snow that soon rapidly melted when “unusually warm” March air produced massive amounts of runoff. Exacerbating matters, the winter’s freeze made the ground less absorbent when extreme downpours then slammed the region. 

If that wasn’t enough, big rains in 2018 had already “loaded the dice even more,” meteorologist Bryce Anderson noted on Twitter: a thawed ground, already saturated with water, wouldn’t have been able to soak up much more anyway, he said.

On top of this confluence of extreme weather events, Earth’s atmosphere is considerably different than it was a century ago. The climate has warmed by 1 degree Celsius (1.8 degrees Fahrenheit), and due to simple physics, warmer air is able to hold more water vapor: about seven percent more for every 1 degree Celsius of warming.
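
That seven-percent-per-degree figure (roughly the Clausius–Clapeyron relation) compounds as warming continues. A quick sketch of the moisture-capacity term alone:

```python
# Rough Clausius-Clapeyron scaling: ~7% more water-vapor capacity per degree
# Celsius of warming. (Actual precipitation changes also depend on circulation
# and local factors; this is just the moisture-capacity term.)
def capacity_increase(warming_c: float, rate_per_degree: float = 0.07) -> float:
    """Fractional increase in the air's water-vapor capacity."""
    return (1.0 + rate_per_degree) ** warming_c - 1.0

print(f"{capacity_increase(1.0):.0%}")  # ~7%, roughly the warming seen so far
print(f"{capacity_increase(3.0):.0%}")  # ~23% under 3 degrees of warming
```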

That means more intense downpours. Between 1958 and 2012, the amount of rain falling in the heaviest rainfall events in the Midwest shot up by 37 percent, according to U.S. government scientists.

Forthcoming research will reveal the role climate change played during these floods, though atmospheric scientists expect this same climate lever to bring more intense precipitation blasts to other parts of the nation in the near future, notably California. 

A flooded runway at Offutt Air Force Base

Image: U.S. Air Force photo by TSgt. Rachelle Blake

On March 17, the National Weather Service (NWS) expected the Missouri River just south of Offutt Air Force Base to break its record level by a whopping four feet, noted CBS meteorologist Eric Fisher. The forecast turned out to be almost spot-on.

“That’s unreal for a river with some big floods in the past,” Fisher wrote. 


Intel and Cray are building a $500 million ‘exascale’ supercomputer for Argonne National Lab

In a way, I have the equivalent of a supercomputer in my pocket. But in another, more important way, that pocket computer is a joke compared with real supercomputers — and Intel and Cray are putting together one of the biggest ever with a half-billion-dollar contract from the Department of Energy. It’s going to do exaflops!

The “Aurora” program aims to put together an “exascale” computing system for Argonne National Laboratory by 2021. The “exa” is a prefix indicating bigness — in this case, 1 quintillion floating point operations per second, or FLOPs. They’re kind of the horsepower rating of supercomputers.

For comparison, your average modern CPU does maybe a hundred or more gigaflops. A thousand gigaflops makes a teraflop, a thousand teraflops makes a petaflop, and a thousand petaflops makes an exaflop. So despite major advances in computing efficiency going into making super powerful smartphones and desktops, we’re talking several orders of magnitude difference. (Let’s not get into GPUs, it’s complicated.)

And even when compared with the biggest supercomputers and clusters out there, you’re still looking at a max of 200 petaflops (that would be IBM’s Summit, over at Oak Ridge National Lab) or thereabouts.
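
To make the prefix ladder concrete, here is the arithmetic using the ballpark figures above:

```python
# The prefix ladder made concrete, using the ballpark figures above.
GIGA, PETA, EXA = 1e9, 1e15, 1e18

cpu_flops = 100 * GIGA     # "a hundred or more gigaflops" for a desktop CPU
summit_flops = 200 * PETA  # IBM's Summit at Oak Ridge National Lab
aurora_flops = 1 * EXA     # Aurora's exascale target

print(f"Aurora vs. one desktop CPU: {aurora_flops / cpu_flops:,.0f}x")   # 10,000,000x
print(f"Aurora vs. Summit:          {aurora_flops / summit_flops:.0f}x") # 5x
```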

Just what do you need that kind of computing power for? Petaflops wouldn’t do it? Well, no, actually. One very recent example of computing limitations in real-world research was a study of how climate change could affect cloud formation in certain regions, reinforcing the trend and leading to a vicious cycle.

This kind of thing could only be estimated with much coarser models before; computing resources were too tight to allow for the extremely large number of variables involved. Imagine simulating a ball bouncing on the ground — easy — now imagine simulating every molecule in that ball, their relationships to each other, gravity, air pressure, other forces — hard. Now imagine simulating two stars colliding.
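
To see why the molecule-level version is “hard,” note that naive pairwise physics scales quadratically with the number of particles. This is a generic scaling argument, not the cloud study’s actual method:

```python
# Why fine-grained simulation gets expensive: with naive pairwise physics,
# interactions grow quadratically with particle count. (A generic scaling
# argument, not the cloud study's actual method.)
def pairwise_interactions(n: int) -> int:
    return n * (n - 1) // 2

for n in (10, 1_000, 1_000_000):
    print(f"{n:>9,} particles -> {pairwise_interactions(n):,} interactions per step")
# 10 -> 45; 1,000 -> 499,500; 1,000,000 -> ~500 billion
```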

The more computing resources we have, the more can be dedicated to, as the Intel press release offers as examples, “developing extreme-scale cosmological simulations, discovering new approaches for drug response prediction and discovering materials for the creation of more efficient organic solar cells.”

Intel says that Aurora will be the first exaflop system in the U.S. — an important caveat, since China is aiming to accomplish the task a year earlier. There’s no reason to think they won’t achieve it, either, since Chinese supercomputers have reliably been among the fastest in the world.

If you’re curious what ANL may be putting its soon-to-be-built computers to work for, feel free to browse its research index. The short answer is “just about everything.”
