How to add different types of markers to a map in OpenLayers 3

I am looking for an example of how to add markers of different types (different images) using OpenLayers 3.

This is my code:

iconStyle = new ol.style.Style({
    image: new ol.style.Icon({
        anchor: [0.5, 46],
        anchorXUnits: 'fraction',
        anchorYUnits: 'pixels',
        opacity: 0.75,
        src: '/icon.png'
    })
});

You need to create a separate icon style for each marker type you want to display. It helps to keep them all in a reference cache so you don't create a separate style for each object (see the styles object in the code below).

Then, when adding a feature to a layer you need to reference the style from the cache which matches the feature type you're adding.

Please check the example below and try to integrate it into your code:

var styles = {};
var features = [];

/*
 * data is a JSON object from a web API that looks like this:
 * var data = [{
 *     markerClass: "blue",
 *     code: "water",
 *     lat: 0, // not really, but you get the idea
 *     lng: 0
 * }, { etc… }]
 */
$.each(data, function (i, item) {
    // Check the style cache for an already created style
    if (!styles[item.markerClass]) {
        // In your case you will want to use image: new ol.style.Icon({
        // but this is the example that I have on hand…
        styles[item.markerClass] = new ol.style.Style({
            image: new ol.style.Circle({
                radius: 5,
                stroke: new ol.style.Stroke({ color: '#000' }),
                fill: new ol.style.Fill({
                    color: item.markerClass // attribute colour
                })
            }),
            text: new ol.style.Text({
                text: item.code, // attribute code
                fill: new ol.style.Fill({
                    color: '#000' // black text
                    // TODO: Unless circle is dark, then white…
                })
            })
        });
    }

    // Create the feature
    var marker = new ol.Feature({
        content: item,
        mapid: i,
        geometry: new ol.geom.Point(
            ol.proj.transform([item.lng, item.lat], proj_out, proj_in)
        )
    });

    // Set the style created earlier
    marker.setStyle(styles[item.markerClass]);
    features.push(marker);
});

// Assuming your layer / source is already map bound
var source = layer.getSource();
source.addFeatures(features);
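To come back to the original question about image markers: the same caching pattern works; only the style construction changes. A minimal sketch, where getOrCreate and the icon paths are my own names and assumptions, not OpenLayers API:

```javascript
// The caching pattern from the answer, factored out: build each style at
// most once per marker type. getOrCreate is a helper name I made up here.
function getOrCreate(cache, key, create) {
    if (!cache[key]) {
        cache[key] = create(key);
    }
    return cache[key];
}

// With OpenLayers 3 icons (in the browser) this would look like:
// var style = getOrCreate(styles, item.markerClass, function (type) {
//     return new ol.style.Style({
//         image: new ol.style.Icon({
//             anchor: [0.5, 46],
//             anchorXUnits: 'fraction',
//             anchorYUnits: 'pixels',
//             src: '/icons/' + type + '.png' // hypothetical path per type
//         })
//     });
// });
// marker.setStyle(style);
```

This way two features of the same marker class share one style object instead of each allocating its own.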

This version of MAPublisher is fully compatible with Adobe Illustrator 2020 so go ahead and upgrade! Are you excited? We are too, but mostly about the other new stuff that is also included in this version of MAPublisher.

Want to add mileage markers or mark intervals along roads, trails or other paths? Do it automatically using this new feature! Options for interval markers are found in the Path Utilities tool.

  • Define the distance of the interval and the units
  • Select and style the shape of the interval marker
  • Choose the font type, size, and spacing within the marker shape
  • Choose where to start and how to increment the markers

The Water Cycle

The U.S. Geological Survey (USGS) offers you the most comprehensive information about the natural water cycle anywhere. We've developed Web pages aimed both at younger students and kids, and at adults and advanced students, so choose your path below and begin learning.

What is the water cycle?

Let me introduce myself. I am Drippy, your host at the U.S. Geological Survey's Water Science School. And, what is the water cycle? I can easily answer that—it is "me" all over! The water cycle describes the existence and movement of water on, in, and above the Earth.

Earth's water is always in movement and is always changing states, from liquid to vapor to ice and back again. The water cycle has been working for billions of years, and all life on Earth depends on it continuing to work; the Earth would be a pretty stale place to live without it.


Study Area

The WVC Reporter was developed and tested in Utah (219,807 km²), which is located in the southwestern United States. The Utah landscape is topographically diverse with elevations ranging from 663 to 4,413 m [26]. The climate for much of the state is considered semi-arid (127–381 mm precipitation annually), but high elevation areas can receive considerably more precipitation (>1,473 mm) [27]. Three major ecoregions comprise the majority of the state: the Colorado Plateau, the Wasatch and Uinta Mountains, and the Central Basin and Range [28]. As a result, Utah is ecologically diverse and inhabited by a wide variety of plants and animals that are adapted to an array of habitats from salt desert shrub lands to alpine tundra [29].

Utah is largely a rural state with 75% of the land area being federally or state owned [26]. There are, however, several urban areas along the western front of the Wasatch Mountains in central Utah, where the majority of the state’s 2.8 million residents live [30]. According to the latest census estimate, Utah was the 3rd fastest-growing state [31] in the United States. Consequently, the state is rapidly becoming urbanized [32]. The growing human population has increased demand for transportation, and traffic volumes have doubled in the past 30 years (1980–2010) [33]. In 2010, it was estimated that 42.8 billion km were driven on the state’s 73,413 km of roads [33], [34].

Wildlife-vehicle collisions commonly occur in Utah and are a considerable public safety concern [35]. Most reported wildlife vehicle collisions in Utah involve mule deer (Odocoileus hemionus) [35], which is the state’s most abundant wild large mammal [36]. Vehicle collisions with mule deer in Utah result in an average of $7.5 million in damages each year [37]. Consequently, mitigation measures such as wildlife crossings and exclusionary fencing have been used to address the problem [38].

WVC Data Collection

Surveys for wildlife carcasses using automobiles have been conducted systematically in Utah since at least 1998 [39]. Automobile surveys were done by Utah Department of Transportation (UDOT) contractors. During the study, UDOT contractors were contractually obligated to drive ∼2,800 km of roads twice a week (Monday and Thursday) throughout the year. UDOT contractor routes were selected because they had high numbers of WVCs. During surveys, UDOT contractors were required to remove all animal carcasses that were detected on the road surface, the median, and the road shoulder. They also were required to keep detailed records of the species removed and their locations. Removal ensured that carcasses were not double-counted in future surveys, because removed carcasses were transported away from roads by survey crews and deposited at local landfills. The Utah Division of Wildlife Resources (UDWR) employees also reported and removed animal carcasses that occurred on roads other than those covered by UDOT contractors (A. Aoude, UDWR, Pers. Comm.). UDWR employees did not conduct systematic surveys, but reported carcasses opportunistically. Prior to implementation of the WVC Reporter system, both agencies recorded animal carcass data using the pen/paper method.

WVC Reporter System

The WVC Reporter system consists of three integrated components: 1) a mobile web application, 2) a database, and 3) a desktop web application (Figure 1). The mobile web application was designed for in-field data collection. It allows the user to report information on wildlife carcasses using a smartphone. When reporting a wildlife carcass, the user simply clicks on the mobile web application bookmark and a report form opens. The report form contains a dropdown menu of wildlife species that are commonly encountered. If the species being reported is not available in the menu, it can be entered manually. The user also enters the sex (male, female, or unknown) and age class (adult, juvenile, or unknown) of the animal. However, it is important to note that accurately identifying the species, sex, and age class of animal remains depends on a variety of factors that include observer experience, animal species, and the physical condition of the carcass. Optional information that can be reported with the application includes a carcass fat measurement (an indicator of health in ungulates) and an ID number if the animal was involved in a research study and marked.

Using the WVC Reporter system, data are collected in the field using smartphones and a mobile web application. Collected data are then transferred via mobile broadband internet to a centralized database that is dynamically linked to a desktop web application where WVC locations can be viewed.

For each reported carcass, the mobile application generates a number of pieces of information automatically. For example, the mobile web application accesses the smartphone GPS and acquires coordinates (latitude/longitude) for the location. Coordinates are then used to determine the nearest highway and marker automatically. This eliminates all data entry errors associated with location information. The mobile web application also reports the user, time, and date. When the user is finished entering information in the report form, the send button transfers data via a mobile internet connection to the WVC Reporter database. If mobile internet service is unavailable, the information is stored in the phone cache until the next report is submitted.

The mobile web application is compatible with most iPhone and Android smartphones. Specific device requirements include iOS Safari 3.2+, Android Browser 2.1+, or Google Chrome 10.0+. The programming code for the mobile web application was written in HTML5, CSS, and JavaScript. The HTML5 geolocation Application Program Interface (API) was used to enable location data collection, and the application cache allows the mobile web application to be used even when there is no internet connection available. Programming for all components of the WVC Reporter was done by the Utah Automated Geographic Reference Center (AGRC). The programming code for the system is provided in Appendix S1.
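As a rough illustration of how a report might be assembled from the geolocation API, consider the sketch below; the function and field names here are assumptions for illustration, not the actual WVC Reporter code.

```javascript
// Hypothetical sketch: build a carcass report from form values plus a
// position object shaped like the one the HTML5 geolocation API returns.
function buildCarcassReport(form, position, now) {
    return {
        species: form.species,
        sex: form.sex,           // 'male' | 'female' | 'unknown'
        ageClass: form.ageClass, // 'adult' | 'juvenile' | 'unknown'
        lat: position.coords.latitude,
        lng: position.coords.longitude,
        reportedAt: now.toISOString()
    };
}

// In the browser, the position would come from the geolocation API:
// navigator.geolocation.getCurrentPosition(function (position) {
//     var report = buildCarcassReport(formValues, position, new Date());
//     // POST the report, queueing it locally if there is no connection…
// });
```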

The WVC Reporter database serves as the central repository for all reports that are submitted using the mobile web application. The database is dynamic and updated when reports are submitted through an ESRI ArcGIS Server Feature Service. The database is an ESRI ArcSDE Geodatabase, and it is housed in a Structured Query Language (SQL) Server at the AGRC in Salt Lake City, Utah.

The desktop web application was designed to make it easier for planners, maintenance crews, and wildlife managers to use WVC data. To accomplish this, the web application serves as: 1) a map to view carcass locations at user-defined scales, 2) a place to download current WVC data, 3) a way to enter carcass data manually, and 4) a link to the mobile web application. To map carcass locations, the desktop web application uses ESRI’s ArcGIS Server and ArcGIS API for JavaScript. The web application is dynamically linked to the WVC Reporter database, so mapped carcass locations represent the most current data available. Rather than display all carcass locations on the map regardless of the spatial extent, the map viewer shows clusters of carcass locations as circles, where the size of the circle represents the number of carcasses in the area (Figure 2). As one zooms in on specific locations within the state, the circles become progressively smaller and eventually disappear at smaller scale extents, showing only the actual carcass locations. This provides an efficient means to see where WVC hotspots occur regardless of the scale extent at which the map is viewed. Carcass locations also can be overlaid on one of seven different base maps. The high-resolution aerial imagery base map provides an excellent backdrop for analyzing WVC patterns, because landscape features such as vegetation, rivers, human developments, agricultural fields, and roads are clearly visible at smaller scale extents. Additionally, the terrain base map shades relief, making topography appear three-dimensional, which is helpful for viewing carcass locations with respect to major topographic features such as drainages. To add additional context not available in the base maps, we included GIS layers for wildlife crossing locations, exclusionary fencing, marker locations, and management regions (UDOT and UDWR) that can be toggled on and off by the user.
The map viewer also includes data filters (date, species, and management region) allowing the user to modify data to suit their specific needs. For fine-scale WVC analysis, users can also enter a highway number (e.g., US 6) and section (e.g., markers 210–213), and the map viewer will zoom to that location and summarize WVC data for that area (Figure 2). Finally, the map viewer allows displayed data to be exported as a PDF, which provides the user with a way to share data or create figures for reports.
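The scale-dependent clustering described above can be sketched generically; this is just the sizing idea, not the ArcGIS API the system actually uses.

```javascript
// Circle radius grows with the square root of the carcass count, so circle
// AREA is roughly proportional to the number of carcasses in the cluster.
function clusterRadius(count, minRadius, scale) {
    if (count <= 1) {
        return minRadius; // single carcasses draw as plain point markers
    }
    return minRadius + scale * Math.sqrt(count);
}
```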

Spatial patterns in wildlife-vehicle collisions can be efficiently analyzed at both broad (left image) and fine (right image) scale extents using the WVC Reporter map viewer.

While the map viewer provides an efficient means to visualize WVC patterns, in some situations it may be desirable to perform more sophisticated spatial analyses (e.g., spatial clustering or autocorrelation indices). To facilitate this, the desktop web application allows the user to download the WVC Reporter database as either an ESRI shapefile or a dbf file. The shapefile is a common GIS format that allows carcass locations to be easily imported into GIS software where spatial analyses can be performed. The download function also respects the data filters in the desktop web application.

When designing the desktop web application, we realized not all agency personnel reporting WVC data would have access to smartphones, and consequently some information would still be collected on paper forms. To address this situation, the desktop web application has a report form for manually entering carcass locations. It functions essentially the same as the mobile web application report form, with the exception that the user has to define the carcass location manually by entering either GPS coordinates (latitude/longitude or UTM), the highway/marker, or the street address. Once the location information is entered, the user is able to verify that it is correct by viewing the location on a built-in map viewer.

The final function of the desktop web application is to serve as a link to the mobile web application. Before field technicians can use the mobile application on their individual smartphones, they must first access the web application, click on the mobile app link, and then bookmark the location on their smartphone. The desktop web application was programmed using the same languages as the mobile application, and it works with nearly all commonly used web browsers (Internet Explorer 7+, Chrome, Firefox, and Safari).

Location Error

We tested the WVC Reporter application using a Motorola Droid X smartphone (Model 10083V2-B-K1, Verizon, New York, New York, USA) and an Apple iPhone 4 (Model A1349, Apple, Inc., Cupertino, California, USA). To estimate the horizontal error for locations collected with these phones, we tested them at random locations on highways throughout the state of Utah. At each random location, we recorded location coordinates using a mapping-grade Archer Differential Global Positioning System (DGPS) receiver (Model XF101, Juniper Systems, Logan, Utah, USA) that was capable of sub-meter accuracy. We used locations collected with the DGPS receiver to represent the “true” location. Additionally, at each random point, we recorded location coordinates using the smartphones and a recreation-grade Garmin GPS receiver (Model eTrex Legend H, Garmin International, Inc., Olathe, Kansas, USA). We included the recreation-grade GPS in testing to determine how the smartphones compared to a standalone GPS receiver. All location data were imported into ArcGIS 10.1 (ESRI, Redlands, California, USA) for analysis. Location error was estimated as the Euclidean distance between the true location and the points collected by the test units. Because the location errors were not normally distributed, we reported the medians and median absolute deviations (MADs) instead of means and standard deviations. We also used the nonparametric Kruskal–Wallis test to test for differences in location errors between units. All statistical tests for this study were performed using R 2.14.1 (R Development Core Team, Vienna, Austria). To estimate how much spatial accuracy improved by using smartphones and the WVC Reporter application, we compared location errors associated with that technique to those empirically measured by Gunson et al. [10] for reporting highway/marker locations. We used this information to estimate the percent decrease in location error associated with using smartphones and the WVC Reporter application.
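The error summary described above (Euclidean distance to the DGPS fix, then medians and MADs) can be sketched as follows; the coordinates are made-up projected values in metres, not measured data.

```javascript
// Location error = straight-line distance between a test unit's fix and
// the DGPS "true" location, in projected (metre) coordinates.
function euclidean(a, b) {
    var dx = a.x - b.x;
    var dy = a.y - b.y;
    return Math.sqrt(dx * dx + dy * dy);
}

function median(values) {
    var v = values.slice().sort(function (a, b) { return a - b; });
    var mid = Math.floor(v.length / 2);
    return v.length % 2 ? v[mid] : (v[mid - 1] + v[mid]) / 2;
}

// Median absolute deviation: the median of |x - median(x)|, a robust
// spread measure suited to the non-normal error distributions reported.
function mad(values) {
    var m = median(values);
    return median(values.map(function (x) { return Math.abs(x - m); }));
}

// Example with invented fixes:
var trueLoc = { x: 0, y: 0 };
var fixes = [{ x: 3, y: 4 }, { x: 0, y: 7 }, { x: 5, y: 12 }];
var errors = fixes.map(function (p) { return euclidean(p, trueLoc); });
// errors = [5, 7, 13] m, so the median error is 7 m and the MAD is 2 m
```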

Data Entry and Transcription Times

We estimated the amount of time required to report carcasses using the WVC Reporter application and the pen/paper method under field conditions. Data entry times can vary based on an individual’s natural ability and experience level. To reduce this bias, all data entry times were collected by the principal investigator, who was experienced entering data using both the pen/paper method and the WVC Reporter application. Data entry and transcription times were recorded in seconds (s) using a stopwatch. For the WVC Reporter, data entry times represented the time from when the application was opened on the smartphone until all data were entered and the submit report button was pressed. Data values entered included species, sex, and age class. For the pen/paper method, data entry and transcription times represented the time between entry of the first and last data values. Values entered included date, highway/marker, species, sex, age class, and GPS coordinates in UTMs. Data entry times were also non-normal, so we reported medians and MADs. We tested for differences in data entry times between methods using the Kruskal–Wallis test.

To determine how much time could be saved annually, we compared the annual data entry time for the WVC Reporter and the pen/paper method. We estimated annual data entry time for the WVC Reporter by multiplying the median data entry time for each smartphone by the number of carcasses reported during the first year (n = 6,822). Similarly, we calculated annual data entry time for the pen/paper method by multiplying its median data entry time by the same number of carcasses (n = 6,822). We then subtracted the annual data entry time for the pen/paper method from the annual data entry time for the WVC Reporter for each phone to get the estimated range of hours saved by using the WVC Reporter. A range was reported because the two smartphones tested had slightly different data entry times.
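The annual comparison is simple arithmetic; in this sketch the per-record median times (120 s and 45 s) are placeholders, not the paper's measured values.

```javascript
// Annual data entry time = median seconds per record × records per year.
function annualHours(medianSeconds, nCarcasses) {
    return (medianSeconds * nCarcasses) / 3600;
}

var nCarcasses = 6822;                              // first-year reports (from the text)
var penPaperHours = annualHours(120, nCarcasses);   // assumed 120 s per record
var smartphoneHours = annualHours(45, nCarcasses);  // assumed 45 s per record
var hoursSaved = penPaperHours - smartphoneHours;   // (120 - 45) s × 6,822 ÷ 3600
```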

Data Entry Errors

We estimated reporting errors for the previous system of paper forms and transcription. Data used to estimate entry errors were collected and transcribed by UDOT contractors prior to the implementation of the WVC Reporter system. Due to the nature of the dataset, reporting errors could only be verified for location data. Errors undoubtedly occurred due to misidentification of species, sex, and age information for carcasses, but we did not evaluate these errors because it would have required a separate field study that would have exceeded the financial resources available for this project. Location data collected included highway/marker and GPS coordinates in UTMs. To identify location errors in carcass records, we imported carcass locations into ArcGIS 10.1 and overlaid them on highway/marker locations to verify that the reported GPS coordinates matched the reported highway/marker locations. If GPS coordinates and highway/marker information matched, we assumed that both had been recorded correctly. When GPS coordinates were associated with a highway, but the reported highway/marker did not match that location, we assumed that the highway/marker was reported incorrectly. When GPS coordinates did not coincide with a highway, we assumed that the coordinates were reported incorrectly.
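The decision rule used to classify location errors can be sketched as below; the nearRoad predicate stands in for the GIS overlay step and is an assumption, as are the field names.

```javascript
// Classify a carcass record using the logic described above: coordinates
// off any highway imply bad coordinates; coordinates on a highway whose
// marker disagrees with the reported one imply a bad highway/marker.
function classifyRecord(record, nearRoad) {
    if (!nearRoad(record.gps)) {
        return 'coordinates reported incorrectly';
    }
    if (record.reportedMarker !== record.markerAtGps) {
        return 'highway/marker reported incorrectly';
    }
    return 'recorded correctly';
}
```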

Cost Savings

To estimate the total cost savings from using the WVC Reporter, we used the data entry time saved for both in-field data collection and transcription and assumed the mean hourly wage for those reporting and transcribing data was $12/hr.
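Putting numbers to this: cost savings are just hours saved times the wage. The 142-hour figure below is a placeholder for illustration, not the paper's result.

```javascript
// Cost savings = (field entry + transcription hours saved) × hourly wage.
var hourlyWage = 12;   // $/hr, the mean wage assumed in the text
var hoursSaved = 142;  // placeholder total hours saved per year
var annualSavings = hoursSaved * hourlyWage; // = $1,704 under these assumptions
```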

Life probably exists beyond Earth. So how do we find it?

With next-generation telescopes, tiny space probes, and more, scientists aim to search for life beyond our solar system—and make contact.

In her office on the 17th floor of MIT’s Building 54, Sara Seager is about as close to space as you can get in Cambridge, Massachusetts. From her window, she can see across the Charles River to downtown Boston in one direction and past Fenway Park in the other. Inside, her view extends to the Milky Way and beyond.

Seager, 47, is an astrophysicist. Her specialty is exoplanets, namely all the planets in the universe except the ones you already know about revolving around our sun. On a blackboard, she has sketched an equation she thought up to estimate the chances of detecting life on such a planet. Beneath another blackboard filled with more equations is a clutter of memorabilia, including a vial containing some glossy black shards.

“It’s a rock that we melted.”

Seager speaks in brisk, uninflected phrases, and she has penetrating hazel eyes that hold on to whomever she is talking to. She explains that there are planets known as hot super-Earths whizzing about so close to their stars that a year lasts less than a day. “These planets are so hot, they probably have giant lava lakes,” she says. Hence, the melted rock.

“We wanted to test the brightness of lava.”

When Seager entered graduate school in the mid-1990s, we didn’t know about planets that circle their stars in hours or others that take almost a million years. We didn’t know about planets that revolve around two stars, or rogue planets that don’t orbit any star but just wander about in space. In fact, we didn’t know for sure that any planets at all existed beyond our solar system, and a lot of the assumptions we made about planet-ness have turned out to be wrong. The very first exoplanet found—51 Pegasi b, discovered in 1995—was itself a surprise: A giant planet crammed up against its star, winging around it in just four days.

“51 Peg should have let everyone know it was going to be a crazy ride,” Seager says. “That planet shouldn’t be there.”

Today we have confirmed about 4,000 exoplanets. The majority were discovered by the Kepler space telescope, launched in 2009. Kepler’s mission was to see how many planets it could find orbiting some 150,000 stars in one tiny patch of sky—about as much as you can cover with your hand with your arm outstretched. But its ultimate purpose was to resolve a much more freighted question: Are places where life might evolve common in the universe or vanishingly rare, leaving us effectively without hope of ever knowing whether another living world exists?

Kepler’s answer was unequivocal. There are more planets than there are stars, and at least a quarter are Earth-size planets in their star’s so-called habitable zone, where conditions are neither too hot nor too cold for life. With a minimum of 100 billion stars in the Milky Way, that means there are at least 25 billion places where life could conceivably take hold in our galaxy alone—and our galaxy is one among trillions.

It’s no wonder that Kepler, which ran out of fuel last October, is regarded almost with reverence by astronomers. (“Kepler was the greatest step forward in the Copernican revolution since Copernicus,” University of California, Berkeley astrophysicist Andrew Siemion told me.) It’s changed the way we approach one of the great mysteries of existence. The question is no longer, is there life beyond Earth? It’s a pretty sure bet there is. The question now is, how do we find it?

The revelation that the galaxy is teeming with planets has reenergized the search for life. A surge in private funding has created a much more nimble, risk-friendly research agenda. NASA too is intensifying its efforts in astrobiology. Most of the research is focused on finding signs of any sort of life on other worlds. But the prospect of new targets, new money, and ever increasing computational power has also galvanized the decades-long search for intelligent aliens.

To Seager, a MacArthur “genius award” winner, participating on the Kepler team was one more step toward a lifelong goal: to find an Earth-like planet orbiting a sunlike star. Her current focus is the Transiting Exoplanet Survey Satellite (TESS), an MIT-led NASA space telescope launched last year. Like Kepler, TESS looks for a slight dimming in the luminosity of a star when a planet passes—transits—in front of it. TESS is scanning nearly the whole sky, with the goal of identifying about 50 exoplanets with rocky surfaces like Earth’s that could be investigated by more powerful telescopes coming on line, beginning with the James Webb Space Telescope, which NASA hopes to launch in 2021.

On her “vision table,” which runs along one wall of her office, Seager has collected some objects that express “where I am now and where I’m going, so I can remind myself why I’m working so hard.” Among them are some polished stone orbs representing a red dwarf star and its covey of planets, and a model of ASTERIA, a low-cost planet-finding satellite she developed.

“I haven’t gotten around to putting this up,” Seager says, unrolling a poster that’s a fitting expression of where her career began. It’s a chart showing the spectral signatures of the elements, like colored bar codes. Every chemical compound absorbs a unique set of wavelengths of light. (We see leaves as green, for instance, because chlorophyll is a light-hungry molecule that absorbs red and blue, so the only light reflected is green.) While still in her 20s, Seager came up with the idea that compounds in a transiting planet’s upper atmosphere might leave their spectral fingerprints in starlight passing through. Theoretically, if there are gases in a planet’s atmosphere from living creatures, we could see the evidence in the light that reaches us.

“It’s going to be really hard,” she tells me. “Think of a rocky planet’s atmosphere as the skin of an onion, and the whole thing is in front of, like, an IMAX screen.”

There’s an outside chance a rocky planet orbits a star close enough for the Webb telescope to capture sufficient light to investigate it for signs of life. But most scientists, including Seager, think we’ll need to wait for the next generation of space telescopes. Covering most of the wall over her vision table is a panel of micro-thin black plastic shaped like the petal of a giant flower. It’s a reminder of where she’s going: a space mission, still in development, that she believes can lead her to another living Earth.

From an early age, Olivier Guyon has had a problem with sleep: namely, that it’s supposed to happen at night, when it’s so much better to be awake. Guyon grew up in France, in the countryside of Champagne. When he was 11, his parents bought him a small telescope, which he says they later regretted. He spent many nights peering into it, only to fall asleep the next day in class. When he outgrew that telescope, he built a bigger one. But while he could magnify his view of heavenly objects, Guyon could do nothing to enlarge the number of hours in the night. Something had to give, so one day when he was a teenager, he decided to do away with sleep almost entirely. At first he felt great, but after a week or so, he became seriously ill. Recalling it now, he still shudders.

At 43 years old, Guyon today has a very big telescope to work with. The Subaru observatory, along with 12 others, sits atop the summit of Mauna Kea, on Hawaii’s Big Island. The Subaru’s 8.2-meter (27 feet) reflector is among the largest single-piece mirrors in the world. (Operated by the National Astronomical Observatory of Japan, the telescope has no affiliation with the car company—Subaru is the Japanese name for the Pleiades star cluster.) At 13,796 feet above sea level, Mauna Kea affords one of the highest, clearest views of the universe, yet it’s only an hour and a half drive from Guyon’s home in Hilo. The proximity allows him to make frequent trips to test and improve the instrument he built and attached to the telescope, often working through the night. He carries around a thermos of espresso, and for a while he took to spiking it with shots of liquid caffeine, until a friend pointed out that his daily intake was more than half the lethal dose.

“We can spend a couple weeks up here, and we start to forget about life on Earth,” he tells me. “First you forget the day of the week. Then you start forgetting to call your family.”

Like Seager, Guyon is a MacArthur winner. His particular genius is in the mastery of light: how to massage and manipulate it to catch a glimpse of things that even the Subaru’s huge mirror would be blind to without Guyon’s legerdemain.

“The big question is whether there is biological activity up there,” he says, pointing at the sky. “If yes, what is it like? Are there continents? Oceans and clouds? All these questions can be answered, if you can extract the light of a planet from the light of its star.”

In other words, if you can see the planet. Trying to separate the light of a rocky, Earth-size planet from that of its star is like squinting hard enough to make out a fruit fly hovering inches in front of a floodlight. It doesn’t seem possible, and with today’s telescopes, it isn’t. But Guyon has his sights set on what the next generation of ground-based telescopes might be able to do, if they can be fashioned to squint very, very hard.

That is precisely what his instrument is designed to do. The apparatus is called—brace yourself—the Subaru Coronagraphic Extreme Adaptive Optics (SCExAO, pronounced “skex-a-o”). Guyon wanted me to see it in action, but a power outage had shut down the Subaru. Instead he offers to give me a tour of the 141-foot dome enclosing the telescope. There is 40 percent less oxygen here than at sea level. Visitors have the option of strapping on some bottled oxygen, but he decides that I don’t need any, and off we go.

“I was giving a tour the other day to some scientists, and all of a sudden, one of them fainted!” he says, with a mixture of surprise and regret. “I should have known she was not doing well. She had gotten very quiet.” I clutch the railings and make sure to keep asking questions.

Ground telescopes like the Subaru are much more powerful light-gatherers than space telescopes like the Hubble, chiefly because nobody has yet figured out how to squeeze a 27-foot mirror into a rocket and blast it into space. But ground telescopes have a serious drawback: They sit under miles of our atmosphere. Fluctuations in the air’s temperature cause light to bend erratically—think of a twinkling star, or the wavy air above an asphalt road in the summertime.

The first task of the SCExAO is to iron out those wrinkles. This is accomplished by directing the light from a star onto a shape-shifting mirror, smaller than a quarter, activated by 2,000 tiny motors. Using information from a camera, the motors deform the mirror 3,000 times a second to precisely counter the atmospheric aberrations, and voilà, a beam of starlight can be viewed that is as close as possible to what it was before our atmosphere messed it up. Next comes the squinting part. To Guyon, a star’s luminosity is “a boiling blob of light that we’re trying to get rid of.” His instrument includes an intricate system of apertures, mirrors, and masks called a coronagraph, which allows only the light reflected off the planet to slip through.

There’s a great deal more to the apparatus staring at a schematic of the device is enough to cause vertigo, even at sea level. But the eventual result, once the next-gen telescopes are built, will be a visible dot of light that is actually a rocky planet. Shunt this image to a spectrometer, a device that can parse light into its wavelengths, and you can start dusting it for those fingerprints of life, called biosignatures.

There’s one biosignature that Seager, Guyon, and just about everyone else agree would be as near a slam dunk for life as scientific caution allows. We already have a planet to prove it. On Earth, plants and certain bacteria produce oxygen as a by-product of photosynthesis. Oxygen is a flagrantly promiscuous molecule—it’ll react and bond with just about everything on a planet’s surface. So if we can find evidence of it accumulating in an atmosphere, it will raise some eyebrows. Even more telling would be a biosignature composed of oxygen and other compounds related to life on Earth. Most convincing of all would be to find oxygen along with methane, because those two gases from living organisms destroy each other. Finding them both would mean there must be constant replenishment.

It would be grossly geocentric, however, to limit the search for extraterrestrial life to oxygen and methane. Life could take forms other than photosynthesizing plants, and indeed even here on Earth, anaerobic life existed for billions of years before oxygen began to accumulate in the atmosphere. As long as some basic requirements are met—energy, nutrients, and a liquid medium—life could evolve in ways that would produce any number of different gases. The key is finding gases in excess of what should be there.

There are other sorts of biosignatures we can look for too. The chlorophyll in vegetation reflects near-infrared light—the so-called red edge, invisible to human eyes but easily observable with infrared telescopes. Find it in a planet’s biosignature, and you may well have found an extraterrestrial forest. But the vegetation on other planets might absorb different wavelengths of light—there could be planets with Black Forests that are truly black, or planets where roses are red, and so is everything else.

And why stick to plants? Lisa Kaltenegger, who directs the Carl Sagan Institute at Cornell University, and her colleagues have published the spectral characteristics of 137 microorganisms, including ones in extreme Earth environments that, on another planet, might be the norm. It’s no wonder the next generation of telescopes is so eagerly anticipated.

“For the first time, we’ll be able to collect enough light,” says Kaltenegger. “We’ll be able to figure things out.”

The first and most powerful of the next-gen ground telescopes, the European Southern Observatory’s aptly named Extremely Large Telescope (ELT) in the Atacama Desert of Chile, is scheduled to start operation in 2024. The light-gathering capacity of its 39-meter (128-foot) mirror will exceed that of all existing Subaru-size telescopes combined. Outfitted with a souped-up version of Guyon’s instrument, the ELT will be fully capable of imaging rocky planets in the habitable zone of red dwarf stars, the most common stars in the galaxy. They are smaller and dimmer than our sun, a yellow dwarf, so their habitable zones are closer to the star. The nearer a planet is to its star, the more light it reflects.
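The scale of that claim can be made concrete with a quick back-of-the-envelope calculation; assuming Subaru’s 8.2-meter primary mirror, collecting area scales with the square of the diameter:

```javascript
// Collecting area of a circular mirror scales with the square of its
// diameter. Assumed diameters: ELT primary 39 m, Subaru primary 8.2 m.
const area = (diameterM) => Math.PI * (diameterM / 2) ** 2;

const eltArea = area(39);     // ≈ 1,195 square meters
const subaruArea = area(8.2); // ≈ 53 square meters

// One ELT gathers roughly as much light as 22 Subaru-class telescopes.
const ratio = eltArea / subaruArea; // ≈ 22.6
```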

Alas, the habitable zone of a red dwarf star is not the coziest place in the galaxy. Red dwarfs are highly energetic, frequently hurling flares out into space as they progress through what Seager calls a period of “very long, bad, teenage behavior.” There might be ways an atmosphere could evolve that would protect nascent life from being fried by these solar tantrums. But planets around red dwarfs are also likely to be “tidally locked”—always presenting one side to the star, in the same way our moon shows only one face to the Earth. This would render half the planet too hot for life, the other half too cold. The midline, though, might be temperate enough for life.

As it happens, there’s a rocky planet, called Proxima Centauri b, orbiting in the habitable zone of Proxima Centauri, a red dwarf that’s the nearest star to our own, about 4.2 light-years, or 25 trillion miles, away. “It’s a terribly exciting target,” Guyon says. But he agrees with Seager that the best chance of finding life will be on an Earth-like planet orbiting a sunlike star. The ELT and its ilk will be fantastic at gathering light, but even those behemoth ground telescopes won’t be able to separate the light of a planet from that of a star 10 billion times brighter.

That’s going to take a little more time and even more exotic—one might even say dreamlike—technology. Remember that flower petal–shaped panel on Seager’s wall? It’s a piece of a space instrument called Starshade. Its design consists of 28 panels arranged around a center hub like a giant sunflower, more than 100 feet across. The petals are precisely shaped and rippled to deflect the light from a star, leaving a super-dark shadow trailing behind. If a telescope is positioned far back in that tunnel of darkness, it will be able to capture the glimmer from an Earth-like planet visible just beyond the Starshade’s edge.

Starshade’s earliest likely partner is called the Wide Field Infrared Survey Telescope (WFIRST), scheduled to be finished by the mid-2020s. The two spacecraft will work together in a sort of celestial pas de deux: Starshade will amble into position to block the light from a star so WFIRST can detect any planets around it and potentially sample their spectra for signs of life. Then, while WFIRST busies itself with other tasks, Starshade will fly off into position to block the light of the next star on its list of targets. Though the dancers will be tens of thousands of miles apart, they must be aligned to within a single meter for the choreography to work.

Starshade, under development at NASA’s Jet Propulsion Laboratory in Pasadena, California, is still a decade or so away, and indeed there’s no guarantee that it will be funded. Seager, who hopes to lead the project, is confident. One can only hope. There’s something uniquely uplifting about the prospect of a giant flower in space unfurling its petals to parry the light from a distant sun to see if its orbiting worlds are alive.

When Jon Richards answered an ad in 2008 on Craigslist for a software programmer, he couldn’t have imagined he would spend much of the next 10 years in a remote valley in Northern California, looking for aliens. The search for extraterrestrial intelligence, or SETI, refers to both a research endeavor and a nonprofit organization, the SETI Institute, which employs Richards to run the Allen Telescope Array (ATA), a 340-mile drive from the institute’s headquarters in Silicon Valley. The ATA is the only facility on the planet built expressly for detecting signals from alien civilizations. Funded largely by the late Microsoft co-founder Paul Allen, it was envisioned as an assembly of 350 radio telescopes, with dishes six meters (20 feet) in diameter. But owing to funding difficulties—a regrettable leitmotif in SETI history—only 42 have been built. At one time seven scientists helped run the ATA, but due to attrition, Richards is “the last man standing,” as he gamely puts it.

I’ve come to see Richards on a hot day in August, soon after a rash of wildfires in the area. Smoke veils the view of the surrounding mountains, and in the haze the dishes seem primordially still, like Easter Island statues, each one staring implacably at the same spot in a featureless sky. Richards takes me to one of the dishes, opening the bay doors beneath it to reveal its newly installed antenna feed: a crenellated taper of shiny copper housed in a thick glass cone. “Looks kinda like a death ray,” he says.

Richards’s job is to manage the hardware and software, including algorithms developed to sift through the several hundred thousand radio signals streaming into the telescopes every night, in search of a “signal of interest.” Radio frequencies have been the favored hunting ground of SETI since the search for alien transmissions began 60 years ago, largely because they travel most efficiently through space. SETI scientists have focused in particular on a quiet zone in the radio spectrum, free of background noise from the galaxy. It made sense to search in this relatively undisturbed range of frequencies, since that would be where sensible aliens would be most likely to transmit.

Richards tells me that the ATA is working through a target list of 20,000 red dwarfs. In the evening, he makes sure everything is working properly, and while he sleeps, the dishes point, the antennas rouse, photons scuttle through fiber optic cables, and the radio music of the cosmos streams to enormous processors. If a signal passes tests that suggest it stems from neither a natural source nor some quotidian terrestrial one—a satellite, a plane, somebody’s key fob—the computer kicks out an email alert. This being an email he wouldn’t want to miss, Richards has set up his cell service to forward the message to his phone. Conceivably, then, our first contact from an alien civilization could come as a text rattling Richards’s phone on his night table.

So far, however, all the signals of interest have been false alarms. Unlike other experiments, where progress can be made incrementally, SETI is binary: Either extraterrestrials make contact on your watch, or they don’t. Even if they’re out there, the chances that you’re looking in just the right place at just the right time and at just the right radio frequency are remote. Jill Tarter, the retired head of research at SETI, likens the search to dipping a cup in the ocean: The chance you’ll find a fish is exceedingly small, but that doesn’t mean the ocean isn’t full of fish. Unfortunately, Congress long ago lost interest in dipping the cup, abruptly terminating support in 1993.

The good news is that SETI the research endeavor, if not SETI the institute, has recently received a remarkable boost in funding, sending ripples of excitement through the field. In 2015 Yuri Milner, a Russian-born venture capitalist, established the Breakthrough Initiatives, committing at least $200 million to look for life in the universe, including $100 million specifically to search for alien civilizations. Milner was an early investor in Facebook, Twitter, and many other internet companies you wish you’d been an early investor in. Before that, he founded a highly successful internet company in Russia. His philanthropic vision might be summed up as, if we agree that finding evidence for alien intelligence is worth $100 million, why shouldn’t it be his $100 million? “If you look at it that way, it makes sense,” he says, when I meet him in a glitzy watering hole in Silicon Valley. “If it was a billion a year—we should talk.”

Milner is soft-spoken and unobtrusive; I hadn’t noticed him arrive until he was standing right next to my chair. He tells me about his background—a degree in physics, a lifelong passion for astronomy, and parents who named him after the cosmonaut Yuri Gagarin, who became the first human in outer space seven months before Milner was born. That was in 1961, which he points out is the same year SETI began. “Everything is interrelated,” he says.

Through one of his initiatives, Breakthrough Listen, he intends to spend $100 million over 10 years, most of it through the SETI Research Center at UC Berkeley. Another project, Breakthrough Watch, is underwriting new technology to search for biosignatures with the European Southern Observatory’s Very Large Telescope in Chile.

Most far out of all—in both senses—is Milner’s Breakthrough Starshot, which is investing $100 million to explore the feasibility of actually going to the nearest star system, Alpha Centauri, which includes the rocky planet Proxima b. Appreciating the magnitude of this challenge requires some perspective. The first Voyager spacecraft, launched in 1977, took 35 years to enter interstellar space. Traveling at that speed, Voyager would need some 75,000 years to reach Alpha Centauri. In the current vision for Starshot, a fleet of pebble-size spaceships hurtling through space at one-fifth the speed of light could reach Alpha Centauri in a mere 20 years. Working from a road map originally proposed by physicist Philip Lubin at UC Santa Barbara, these tiny Niñas, Pintas, and Santa Marías would be propelled by a ground-based laser array, more powerful than a million suns. It may not be possible. But that’s the advantage of private money: Unlike a government program, you’re allowed—expected—to take a big gamble.
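The travel times above can be sanity-checked with a rough calculation, assuming Proxima Centauri lies about 4.24 light-years away and Voyager 1 cruises at roughly 17 km/s:

```javascript
// Rough travel-time check. Assumptions: Proxima Centauri at ~4.24
// light-years, Voyager 1 cruising at ~17 km/s.
const LIGHT_YEAR_KM = 9.461e12;
const SECONDS_PER_YEAR = 3.156e7;

const distanceKm = 4.24 * LIGHT_YEAR_KM;
const voyagerYears = distanceKm / 17 / SECONDS_PER_YEAR; // ≈ 75,000 years

// A Starshot nanocraft at one-fifth the speed of light:
const starshotYears = 4.24 / 0.2; // ≈ 21 years
```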

“Let’s see in five or 10 years whether it will work,” Milner says, with a shrug. “I’m not an enthusiast in the sense I believe for sure any of this will happen. I’m an enthusiast because it makes sense now to try.”

The day after meeting with Milner, I went to the Berkeley campus to meet the beneficiaries of his Breakthrough Listen largesse. Andrew Siemion, the director of the Berkeley SETI Research Center, is ideally positioned to take the search for intelligent aliens to a new level. In addition to his Berkeley appointment, he has been named to head up SETI investigations at the SETI Institute itself, including operations at the ATA.

Siemion, 38, looks the part of a next-gen SETI master: he has a shaved head, a compact build, and a thin gold chain discreetly visible above the buttons of his fitted shirt. While careful to credit the decades of research by Tarter and her colleagues at the SETI Institute, he’s keen to distinguish where SETI is going from where it has been. The initial search was inspired by the possibility of a connection—reaching out in hope of finding someone reaching back. SETI 2.0 is trying to determine whether technological civilization is part of the cosmic landscape, like black holes, gravitational waves, or any other astronomical phenomenon.

“We’re not looking for a signal,” Siemion says. “We’re looking for a property of the universe.”

Breakthrough Listen is by no means abandoning the conventional search for radio transmissions, he tells me; on the contrary, it’s doubling down on it, dedicating to SETI roughly a quarter of the viewing time on two huge single-dish radio telescopes in West Virginia and Australia. Siemion is even more excited about a partnership with the new MeerKAT telescope in South Africa, an array of 64 radio dishes, each more than twice the size of the ATA’s. By piggybacking on observations conducted by other scientists, Breakthrough Listen will conduct a 24/7 stakeout of a million stars, dwarfing previous SETI radio searches. Powerful as it is, MeerKAT is just a precursor to radio astronomy’s dream machine: the Square Kilometre Array, which sometime in the next decade will link hundreds of dishes in South Africa with thousands of antennas in Australia, creating the collecting area of a single dish more than a square kilometer, or about 247 acres.

There are other SETI approaches Siemion tells me about—Breakthrough Listen partnerships with telescopes in China, Australia, and the Netherlands, and new technologies in development at Berkeley, the SETI Institute, and elsewhere to look for optical and infrared signals. The gist, echoed by other scientists I talk with, is that SETI is undergoing a transformation from cottage industry to global enterprise.

Most important, empowered and inspired by the accelerating rate of technological development in our own civilization, we are coming to see the target of the quest in a different light. For 60 years we’ve been waiting for ET to phone Earth. But the stark truth is that ET probably has no compelling reason to try to communicate with us, any more than we feel a heartfelt need to extend a greeting to a colony of ants. We may feel technologically mature compared with our past, but compared with what may be out there in the universe, we’re still in diapers. Any civilization that we would be able to detect will likely be millions, perhaps billions, of years ahead of us.

“We’re like trilobites, looking for more trilobites,” says Seth Shostak, a senior astronomer at the SETI Institute.

What we should be looking for is not a message from ET, but signs of ET just going about the business of being ET, alien and intelligent in ways that we may not yet comprehend but may still be able to perceive, by looking for evidence of technology—so-called technosignatures.

The most obvious technosignatures would be ones we’ve produced, or can imagine producing, ourselves. Avi Loeb of Harvard University, who chairs the Breakthrough Starshot advisory board, has noted that if another civilization were using similar laser propulsion to sail through space, its Starshot-like beacons would be visible to the edge of the universe. Loeb also has suggested looking for the spectral signatures of chlorofluorocarbons soiling the atmosphere of aliens who failed to live past the technological diaper stage.

“Based on our own behavior, there must be many civilizations that killed themselves by harnessing technologies that led to their own destruction,” he tells me when I visit him. “If we find them before we destroy our own planet, that would be very informative, something we could learn from.”

On a cheerier note, we could learn a great deal more from civilizations that have solved their energy problem. At a NASA conference on technosignatures (yes, after a quarter century, NASA too is getting back into the SETI game), there was talk about looking for the waste heat from megastructures that we have imagined creating in the future. A Dyson sphere—solar arrays surrounding a star and capturing all of its energy—around our own sun would generate enough power in a second to supply our current demand for a million years. Learning that other civilizations have already accomplished such feats might provide us some hope.
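The Dyson sphere figure holds up to an order-of-magnitude check, assuming the Sun’s luminosity of about 3.8 × 10^26 watts and a current global power demand of roughly 18 terawatts:

```javascript
// Order-of-magnitude check. Assumptions: solar luminosity ~3.8e26 W,
// current global power demand ~18 TW (1.8e13 W).
const SUN_WATTS = 3.8e26;
const DEMAND_WATTS = 1.8e13;
const SECONDS_PER_YEAR = 3.156e7;

// One second of the Sun's total output, expressed in years of demand:
const yearsOfDemand = SUN_WATTS / (DEMAND_WATTS * SECONDS_PER_YEAR);
// ≈ 670,000 years, matching the "million years" in the text to within
// a factor of two
```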

Still, space is vast, and so is time. Even with our ever more powerful computers and telescopes, SETI’s expanded agenda, and the gravity assist of a hundred Yuri Milners, we may never encounter an alien intelligence. On the other hand, the first intimation of life from a distant planet feels thrillingly close.

“You never know what’s going to happen,” Seager says. “But I know that something great is around those stars.”

JS Choropleth Map Customization

You already have a wonderful and fully functional JavaScript choropleth map. But what if you want to change some things or add some further functionality?

AnyChart is a very flexible JS charting library. So you can easily include modifications that are specific for your needs.

Right now I will show you how to implement the following changes:

  • Add a legend to the chart
  • Add bubbles for graphical representation of the number of deaths
  • Configure the tooltip

In the end, I will get the following picture:

Add a legend to the chart

As a general rule for data visualization, every time you utilize a color scale, you should include an explanation of what each color represents somewhere on the page.

Using AnyChart, you can create a legend just by adding chart.legend(true).

In this case, because the choropleth map has only one series, a legend listing that single series would not be very informative; instead, the legend should represent the categories of the color scale.

Here’s the code for adding a legend to the JS choropleth map created above:
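A minimal sketch of that step, assuming the map was created with anychart.map() and stored in a variable named chart (the itemsSourceMode('categories') call, which ties legend items to an ordinal color scale’s ranges, is my assumption about the intended setup):

```javascript
// Enable the legend, sourcing its items from the categories of the
// series' ordinal color scale instead of from the single series.
chart.legend(true);
chart.legend().itemsSourceMode('categories');
```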

That’s functional, but for aesthetic reasons I want the legend to appear on the right-hand side of the chart and to be aligned vertically. To do that, I will just add a few legend settings:
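A sketch of those adjustments, assuming the same chart variable (AnyChart positions its legend through the legend API rather than page styling):

```javascript
// Dock the legend on the right and stack its items vertically.
chart.legend()
  .position('right')
  .itemsLayout('vertical')
  .align('top');
```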

Here is the result (available on AnyChart Playground with the full code):

Add bubbles that will represent the number of deaths

Inspired by the visualization created by the Center for Systems Science and Engineering (CSSE) at Johns Hopkins University, I decided to depict the number of deaths using bubbles.

First, the total number of COVID-19 deaths for each country needs to be stored in the data alongside the total number of COVID-19 cases:
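One possible data shape for that, with placeholder figures: "id" is the country code AnyChart maps use for region binding, "value" drives the choropleth color, and "deaths" will feed the bubble series.

```javascript
// Hypothetical rows; the numbers are illustrative, not real totals.
const data = [
  { id: 'US', value: 3000000, deaths: 130000 },
  { id: 'IT', value: 240000, deaths: 35000 },
  { id: 'NZ', value: 1500, deaths: 0 }
];
```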

Second, store only the countries that have at least one death in a separate array:
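A sketch of that filtering step, using hypothetical rows with id, value, and deaths fields:

```javascript
// Hypothetical input rows of the shape {id, value, deaths}.
const data = [
  { id: 'US', value: 3000000, deaths: 130000 },
  { id: 'NZ', value: 1500, deaths: 0 }
];

// Keep only countries with at least one death, and copy the death
// count into the "size" field a bubble series reads by default.
const deathsData = data
  .filter((row) => row.deaths > 0)
  .map((row) => ({ id: row.id, size: row.deaths }));
```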

Then, add the following JS code to create the bubble series on top of the choropleth series on the map.
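A hypothetical wiring sketch, assuming deathsData is the filtered array of {id, size} rows and chart is the anychart.map() instance holding the choropleth series:

```javascript
// Add a bubble series on top of the existing choropleth series.
const bubbleSeries = chart.bubble(deathsData);
bubbleSeries.geoIdField('id'); // bind bubbles to regions by country id
```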

Take a look at the map below or on AnyChart Playground:

It is a good idea to customize the size of the bubbles so that they don’t occupy so much space on the plot. To do this, use AnyChart’s maxBubbleSize() and minBubbleSize() functions. For example, like this:
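For example (the percentages are relative to the plot size; the exact limits here are a matter of taste):

```javascript
// Constrain bubble radii so large countries don't swamp the map.
chart.maxBubbleSize('5%');
chart.minBubbleSize('1%');
```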

Next, you can modify the color and stroke of the bubbles to make them more consistent with the rest of the chart:
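A hypothetical styling sketch, assuming the bubble series from the previous step is stored in bubbleSeries; the colors are placeholders:

```javascript
// Translucent red fill with a darker stroke, slightly more opaque
// in the hovered state.
bubbleSeries.normal().fill('#c0392b', 0.4);
bubbleSeries.normal().stroke('#96281b');
bubbleSeries.hovered().fill('#c0392b', 0.6);
```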

Check out the outcome after these modifications, below or on AnyChart Playground:

Configure the choropleth map tooltip

Finally, let’s configure the JS choropleth map’s tooltip so that it shows information about total cases if you hover over a country’s territory and total deaths if you hover over a bubble.

The code to achieve this can look as follows:
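One way to do it is with AnyChart’s token-based format strings, assuming series holds the choropleth series and bubbleSeries the bubble series from the earlier steps:

```javascript
// {%value} and {%size} are substituted per point at render time.
series.tooltip().format('Total cases: {%value}');
bubbleSeries.tooltip().format('Total deaths: {%size}');
```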

Look at the result — you can also find it on AnyChart Playground with the full code:

Just in case, here’s the code of the final customized choropleth map coded in JavaScript, which can now be easily embedded into a web, mobile or standalone project:
