Coast Survey research vessel helps Coast Guard re-establish normal ship traffic in the Chesapeake

Coast Survey’s research vessel, Bay Hydro II, was diverted from its regular hydrographic mission this week to help the U.S. Coast Guard determine if there was a new danger to navigation in the Chesapeake Bay.

On June 13, 2016, U.S. Coast Guard Sector Hampton Roads was notified that the barge WEEKS 179 had lost a large portion of its cargo near the Virginia-Maryland line, in a charted traffic scheme area that routes ships around Smith Point in the Chesapeake Bay. The WEEKS 179, carrying construction materials to New Jersey, lost approximately 25 concrete beams and bridge deck pieces, ranging from 10 ft. to 15 ft. long. While the Coast Guard diverted ship traffic around the area, Bay Hydro II deployed to the site to establish the cargo’s exact position and determine whether it posed a hazard to navigation.

By early the next morning, Bay Hydro II was conducting the search. The survey technicians used side scan sonar to locate the sunken cargo, then followed up with their multibeam echo sounder to collect bathymetric data over the debris field. (While side scan sonar is typically the better search tool for locating objects over large areas, the multibeam is best for obtaining precise positions and depths over the items, so the hydrographers can determine whether dangers to navigation exist.)

Within hours, the Coast Survey vessel had located the cargo and, even better, had determined that the beams were so deep that they did not pose a danger. The Coast Guard was able to use Bay Hydro II’s information to quickly re-establish normal shipping patterns through the area.

[Image: chartlet of the Smith Point area]

Beta test of crowdsourced bathymetry holds promise for improving U.S. nautical charts

We are on the verge of acquiring a significant new source of data to improve NOAA nautical charts, thanks to an enthusiastic industry and mariners equipped with new technology.

By Lt. Adam Reed, Integrated Ocean and Coastal Mapping (IOCM) Assistant Coordinator

The United States has about 3,400,000 square nautical miles of water within its coastal and Great Lakes jurisdiction. Coast Survey, which is responsible for charting that vast area, averages about 3,000 square nautical miles of hydrographic surveying each year. Data from those surveys update more than a thousand NOAA charts. However, hydrographic surveys are expensive and laborious, so Coast Survey directs them toward the highest-priority sites, which leaves many coastal areas without updates for many years.
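To put those two figures in perspective, a back-of-the-envelope calculation using only the numbers above shows why full resurveys can never keep pace with the whole jurisdiction:

```python
# Rough scale check using only the figures cited above (illustrative, not an official estimate).
area_sq_nmi = 3_400_000              # approximate U.S. coastal and Great Lakes jurisdiction
survey_rate_sq_nmi_per_year = 3_000  # typical Coast Survey annual survey output

years_for_one_pass = area_sq_nmi / survey_rate_sq_nmi_per_year
print(f"About {years_for_one_pass:,.0f} years to survey the whole area once")  # about 1,133 years
```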

Coast Survey may soon get new sources of information, provided voluntarily by mariners, which will alert cartographers to areas where shoaling and other changes to the seafloor have made the chart inaccurate.

Rose Point Navigation System beta tests new crowdsourcing database

Technology has reached the point where any boater can buy an echo sounder kit, add a GPS receiver, record depth measurements, and make their own geospatial observations in a common reference frame. The question for hydrographic offices (which are concerned with improving nautical charts for safe navigation) then becomes “how do we take advantage of that?”

Rose Point Navigation Systems is working with system developers at NOAA’s National Centers for Environmental Information (NCEI), with hydrographic experts at Coast Survey, and with others who are collaborating on an international effort to maintain crowdsourced bathymetry. In a beta test released on May 13, 2016, Rose Point added a new feature to Coastal Explorer that gives users the option to send anonymous GPS position and sounding data to a new international database managed by NCEI. With the user’s permission, the software generates log files of positions, depths, and times, and automatically transmits the files to the data center, where Coast Survey can pull the data and compare it to nautical charts.
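As a rough illustration of what such a log might contain, here is a minimal Python sketch that reads a hypothetical CSV log of time, position, and depth. The column names and file layout are assumptions for illustration only; they are not the actual Rose Point or NCEI format.

```python
import csv
from datetime import datetime

def read_soundings_log(path):
    """Read a hypothetical crowdsourced bathymetry log.

    Assumes one CSV record per sounding with columns 'time' (ISO 8601),
    'lat' and 'lon' (decimal degrees, WGS 84), and 'depth_m' (meters).
    The actual Rose Point / NCEI log layout may differ.
    """
    soundings = []
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            soundings.append({
                "time": datetime.fromisoformat(row["time"]),
                "lat": float(row["lat"]),
                "lon": float(row["lon"]),
                "depth_m": float(row["depth_m"]),
            })
    return soundings
```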

Crowdsourced bathymetry is an international project

Using data from private sources is not new for Coast Survey. Private interactive cruising guides and other internet-based enterprises have set up services that allow commercial mariners and recreational boaters to share information about navigation hazards they see (or experience) while on the water. The United States Power Squadrons and the U.S. Coast Guard Auxiliary have a decades-long tradition of sharing updates through our cooperative charting programs. But the lack of appropriate software and integration between sources has hampered efforts to use the information to its full potential.

Hydrographic offices around the world are re-thinking crowdsourced bathymetry. In October 2014, Coast Survey led the U.S. delegation to the Fifth Extraordinary International Hydrographic Conference, with Rear Admiral Gerd Glang at the helm. At this meeting, the U.S. and France jointly proposed an initiative (see Proposal No. 4) that introduced crowdsourced bathymetry as a recognized source of data for nautical charts. One of the results of that initiative was the formation of the IHO Crowdsourced Bathymetry Working Group (IHO CSBWG) that set out to develop crowdsourcing principles and guidelines, and then offer a platform for sharing best practices around the world.

Working hand-in-hand with NCEI, the working group has developed a database that can receive volunteered bathymetric data. Data can come from anyone in the world, and everyone can access it.

Coast Survey will use crowdsourced bathymetry to assess chart accuracy

Crowdsourced reports serve an important role in focusing attention on trouble areas. The data helps cartographers determine whether a charted area needs to be re-surveyed, or whether they can make changes based on the information at hand. Even with very sparse data, cartographers can make improvements to nautical charts.

Agreeing in principle to use crowdsourced data is quite different from subjecting a system to the rigors of data transmission from moving vessels, however, so Coast Survey experts contributed hydrographic expertise and system testing. Using Rose Point’s Coastal Explorer, Coast Survey Research Vessel Bay Hydro II transmitted “crowdsourced” data using log files that were automatically produced by the electronic charting system software. (Bay Hydro II is Coast Survey’s primary platform for testing and evaluating new hydrographic survey technologies.)


Coast Survey Research Vessel Bay Hydro II collected about 123,000 soundings over 12 days to pre-test the efficacy of the Rose Point beta test for bathymetric crowdsourcing.

“When you aggregate crowdsourced data, we can expect to see trends develop where the seafloor has likely changed from charted data,” explains Lt. Anthony Klemm. “Using Bay Hydro II data transmissions, we saw such trends indicating shoaling near the Patuxent River entrance. Similarly, in the approach to Solomons harbor, trends displayed depths deeper than charted.”
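A minimal sketch of that kind of trend detection, assuming we already have a pile of (lat, lon, depth) soundings and some way to look up the charted depth at a point; all function and parameter names here are hypothetical, not Coast Survey’s actual processing chain:

```python
import math
from collections import defaultdict

def flag_possible_changes(soundings, charted_depth, cell_deg=0.001,
                          threshold_m=1.0, min_obs=20):
    """Bin (lat, lon, depth_m) soundings into small geographic cells and flag
    cells whose mean crowdsourced depth differs from the charted depth by more
    than a threshold. A negative difference suggests shoaling; a positive one,
    deeper water than charted. Illustrative only."""
    cells = defaultdict(list)
    for lat, lon, depth in soundings:
        key = (math.floor(lat / cell_deg), math.floor(lon / cell_deg))
        cells[key].append(depth)

    flagged = []
    for (i, j), depths in cells.items():
        if len(depths) < min_obs:
            continue  # too few observations to call it a trend
        lat_c, lon_c = (i + 0.5) * cell_deg, (j + 0.5) * cell_deg
        diff = sum(depths) / len(depths) - charted_depth(lat_c, lon_c)
        if abs(diff) >= threshold_m:
            flagged.append(((lat_c, lon_c), diff, len(depths)))
    return flagged
```

Cells with persistently negative differences are the kind of shoaling signal described above; persistently positive differences correspond to the deeper-than-charted trends seen in the approach to Solomons harbor.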

It is important to emphasize that Coast Survey does not necessarily make changes to any significant charted feature based on crowdsourced data alone. That data, however, is about to become a major factor in making charts better.

How NOAA updates nautical charts with high-tech tools

From a NOAA National Ocean Service podcast…

Boaters rely on NOAA’s nautical charts for depth measurements so they don’t accidentally ground on sandbars or other underwater obstructions. See how NOAA updates nautical charts with high-tech tools, including new experimental ocean “robots” that are small enough to survey the nation’s shallowest coastal areas.

 

Transcript available.

Be sure to get the latest nautical charts!

Hope springs eternal. Or maybe, given the cold and dreary May (at least on the East Coast), the adage should be re-phrased a bit, as we have “eternal hopes for spring.” In any case, boaters still have time to get their updated charts as boating season starts in earnest.

Coast Survey is constantly improving the nation’s nautical charts, giving customers a greater range of products and services. Which products work best for you?

If you use a paper chart, you have a couple of options.

Using a paper chart

Paper nautical charts are the original risk management tools at sea. Photo credit: Tim Osborn, NOAA

For official NOAA charts, updated right up to the time they are printed on demand, check out the list of 19 NOAA-certified printers. To ensure the integrity, authenticity, and responsiveness of NOAA chart distributors, NOAA certifies these agents to sell official, up-to-date NOAA charts. (NOTE: NOAA will revoke distributor status when an agent does not perform up to its terms of service. Effective April 26, 2016, NOAA revoked the authorized distributor status of Bellingham Chart Printers.)

If you like to keep the latest chart handy for recreational use, and you want to print it at home, check out the NOAA BookletChart, in 8 ½ x 11” PDF format. Select which BookletChart you want, then download, print, and keep it on your boat. (We hear that some boaters put the pages into notebook sleeve protectors, to keep the pages dry.)

NOAA charts are also available as free PDFs. These PDFs are the same size as the traditional charts, so you will need to arrange for professional printing, or you can crop, save, and print only the section you need. (More information on the introduction of the PDF versions is at NOAA nautical charts now available as free PDFs.)

If you use electronic systems…

NOAA has two types of charts that are popularly referred to as “digital.” Raster charts are geo-referenced digital images of the traditional chart. They are static. Electronic navigational charts, or ENCs, are produced from a vector database and are much more intelligent, with layers of data and interactive features.


Mobile apps using NOAA raster charts are increasingly popular with recreational boaters.

NOAA raster navigational charts (NOAA RNC®) are available for free download from the Coast Survey website. Chart plotters and computer-based navigation systems often use commercial charts derived from official NOAA RNCs; in recent years, a number of the larger manufacturers have switched to the NOAA raster charts themselves. These navigation systems, which also include tablets and mobile devices (at last count, there are over 60 mobile apps) as well as web mapping applications, increasingly use Coast Survey’s new chart tile services to update their data. The tile sets are updated weekly with the latest Notice to Mariners corrections, as well as any other changes made to the charts that week.

NOAA electronic navigational charts (NOAA ENC®), produced from a vector database of features, support real-time navigation as well as collision and grounding avoidance. ENCs are used by ECDIS and other electronic navigation systems. ENCs are available for free download from the Coast Survey website, or the data may be viewed as a single, continuous webmap at ENC Online.

For more information…

…and to find the right chart in the format you need, bookmark Coast Survey’s interactive chart catalog.

Olympic Coast survey provides data for multiple uses

Coastal planners, fishery managers, and oceanographic researchers will soon reap important seafloor and water column data from the coast of Washington, when NOAA Ship Rainier undertakes a special project in the waters within and near the Olympic Coast National Marine Sanctuary in May.


The blue lines indicate NOAA Ship Rainier’s survey project areas. From north to south, the project encompasses Juan de Fuca Canyon (65 square nautical miles), Quinault Canyon (378 square nautical miles), and Willapa Canyon (189 square nautical miles). The teal dots in Quinault and Willapa canyons are the locations of deep underwater natural methane gas seeps being investigated in a University of Washington research project. The green shaded area is the extent of the Olympic Coast National Marine Sanctuary.

The project, which is being managed by NOAA’s Integrated Ocean and Coastal Mapping program, grew from a seafloor mapping prioritization exercise that NOAA’s National Centers for Coastal Ocean Science conducted among coastal stakeholders from federal and state (Oregon and Washington) agencies, tribes, and academia. The group determined that one of the biggest needs across most of the organizations was a better understanding of canyon depths, seafloor character, and habitat.

A scientific team of experts from the College of Charleston, University of Washington, and Oregon State University will contribute to the NOAA-led multi-disciplinary survey project, gathering data for a host of research projects and ocean management activities. In general, the survey will collect swath bathymetry, acoustic backscatter, and water column data to:

  • inform regulatory decisions on coastal development;
  • provide benthic habitat mapping and seafloor characterization for sustainable fisheries initiatives, and to help assess fishery stocks and critical spawning aggregation locations;
  • better understand and manage shelf and canyon resources;
  • aid in resolving multiple-use conflicts;
  • advance research in determining chemical and biological contamination levels; and
  • provide a data repository for the development of ocean tourism and recreational fishing.

Some specific research projects are also planned.

  • A University of Washington scientist will analyze the water column plumes over natural methane gas seeps in the planned survey areas. The university is a leader in the study of methane hydrates.
  • Because Rainier heads to Alaska after the survey in the sanctuary, the ship will also conduct an exploratory survey to obtain seafloor imagery and data over a newly discovered mud volcano in the upper continental slope offshore of Dixon Entrance, just off the Inside Passage near Ketchikan, Alaska. California State researchers will use the data from this 40 square nautical mile survey to analyze the seafloor shape, assess the area for effects on potential tsunamis, and identify unique biological communities.

As part of her regular mission, Rainier will acquire depth measurements and other hydrographic data throughout the entire project to update NOAA nautical charts 18480 and 18500 off the coast of Washington, and chart 17400 in Alaskan waters. The corresponding electronic navigational charts (NOAA ENC®) are US3WA03M, US3AK40M, and US3AK40M.

Chris Stubbs, from the College of Charleston, will serve as the project’s chief scientist. Cmdr. Edward J. Van Den Ameele is Rainier’s commanding officer.

NOAA Ship Rainier, a 48-year-old survey vessel, is part of the NOAA fleet of ships operated, managed, and maintained by NOAA’s Office of Marine and Aviation Operations, which includes commissioned officers of the NOAA Corps, one of the seven uniformed services of the United States, and civilian wage mariners.

Four Gulf of Mexico basins named for officers who led EEZ bathymetric mapping

The U.S. Board on Geographic Names recently named four previously unknown basins in the United States Exclusive Economic Zone (EEZ) in the Gulf of Mexico, honoring retired NOAA officers who mapped the area in the late 1980s and early 1990s. The names — Armstrong Basin, Floyd Basin, Matsushige Basin and Theberge Basin — were proposed by Texas A&M University, based on their new compilation of bathymetry drawn largely from the NOAA multibeam mapping project conducted by now-decommissioned NOAA ships Whiting and Mt. Mitchell.


A basin in the Gulf of Mexico was named after retired NOAA Captain Andrew Armstrong.

 

Retired NOAA Capt. Richard P. Floyd was the commanding officer of NOAA Ship Whiting from February 1990 to March 1992; he was followed by retired Capt. Andrew A. Armstrong III, who was CO from February 1992 to January 1994. Retired NOAA Capt. Roy K. Matsushige was commanding officer of NOAA Ship Mt. Mitchell from December 1988 to January 1991, followed by retired Capt. Albert E. Theberge, who served as CO from January to November 1991. The officers led the bathymetric mapping operations under the direction of NOAA’s Office of Charting and Geodetic Services, a predecessor of today’s Office of Coast Survey.

Cartographers rely on the Board on Geographic Names, for good reason!

Since 1890, federal cartographers have relied on the decisions of the U.S. Board on Geographic Names, the 125-year-old multi-agency federal program to standardize the names of geographic features, which operates under the umbrella of the Department of the Interior.

“The Board on Geographic Names has its intellectual roots in the earliest map-making efforts,” explains Theberge. To illustrate the need for standardization in the U.S., Theberge points to a November 7, 1805, report by the famed explorer William Clark.

“Ocian in view! O! the joy… Great joy in camp we are in view of the Ocian, this great Pacific Octean which we been So long anxious to See.”

As Theberge points out: “In one sentence, Clark gives the reasons for the Board.”

EEZ mapping project achieved policy and technical objectives for U.S.

The four NOAA commanding officers led surveys for the EEZ mapping project, which was active between 1984 and 1991. The project originated from President Ronald Reagan’s 1983 proclamation establishing a U.S. Exclusive Economic Zone, which created a 200-nautical-mile-wide “belt” around the U.S. and its territories, adding over 3,000,000 square nautical miles to the nation’s jurisdiction.

In response to the EEZ proclamation, both NOAA and the United States Geological Survey embarked on mapping programs. The USGS used a deep-water, very-wide-swath side scan sonar system called GLORIA, which gave a qualitative picture of the seafloor somewhat akin to aerial photography; NOAA used both medium-depth multibeam sounding systems (150 meters to 1,000 meters) and deep-water systems (1,000 meters to full oceanic depth), which gave quantitative depth values. As opposed to widely spaced single-beam tracklines in deep water areas, NOAA’s program attained 100% bottom coverage with the then-new (to the civil community) multibeam systems.

The Gulf of Mexico was one region of the mapping program; maps were produced for waters of the East Coast, Gulf of Mexico, West Coast, Alaska, and Hawaii. In a paper presented at the 1988 Exclusive Economic Zone Symposium, the goals of mapping in the Gulf of Mexico (applicable, in fact, to all EEZ regions) were set out:

  • Build the foundation of a marine environmental geographic information system for solving global and regional change problems.
  • Improve targeting of scientific and engineering efforts involving higher-cost, manned, submersible investigations and remotely-operated vehicle operations.
  • Better manage the living and mineral resources of the EEZ.
  • Better model the physical oceanography of the Gulf of Mexico, including factors affecting water mass movements, acoustic propagation paths, and sediment transport regimes.
  • Model geological and geophysical hazards affecting coastal regions and offshore construction.
  • Discover and/or define unique or previously unknown marine environments for designation as marine sanctuaries or protected areas.
  • Improve and enhance nautical charts and bathymetric maps.

This early multibeam mapping effort helped develop many concepts that Coast Survey later built on in shallow water multibeam charting, such as methods for correcting and calibrating beam pointing errors, use of GPS, ray-bending algorithms to account for refraction of beams, etc. Philosophically, the project also helped pave the way for the era of digital paperless survey data acquisition and processing, as EEZ survey operations significantly reduced the vast amounts of paper fathograms, printouts, and other products that accompanied classical hydrographic survey operations.

In 1992, a report by the Marine Board of the National Research Council addressed the needs of mapping the EEZ. It noted:

“EEZ mapping and survey activities of the USGS and NOAA have been impressive, especially given the limits on funding, assets, and human resources. …The current activities depend on individual efforts and assets that are, in many instances, borrowed or diverted from other projects.”

By the time the report was written, circumstances, including the grounding of the Queen Elizabeth 2 in Vineyard Sound off Martha’s Vineyard, dictated that NOAA devote more resources to inshore charting. The EEZ project was terminated, but it left a legacy of new and improved methods, as well as a gentle nudge toward a paradigm shift from primarily paper data acquisition to digital data acquisition.

We still use the digital data gathered by the EEZ mapping project. During the monitoring of the Deepwater Horizon oil spill, NOAA used the data as its underlying bathymetric dataset. The spill was near Whiting Dome and Mitchell Dome, features named for the NOAA ships Whiting and Mt. Mitchell, which discovered them during the EEZ project.

New project picks up where the EEZ project left off

Today, a new national deep-water bathymetric mapping project is underway, picking up where the EEZ project left off. The Office of Coast Survey’s Joint Hydrographic Center at the University of New Hampshire, along with NOAA’s Office of Ocean Exploration, is leading the bathymetric mapping work of the interagency U.S. Extended Continental Shelf (ECS) Project. Using today’s high-resolution descendants of the multibeam systems aboard Whiting and Mt. Mitchell, the ECS Project is mapping the continental slope in several regions, including the Gulf of Mexico, to establish the outer limits of the U.S. continental shelf in areas beyond the 200 nautical mile EEZ. Andy Armstrong, of recently named Armstrong Basin fame, continues to use his bathymetric mapping expertise, now conducting mapping operations for the ECS Project.

How accurate are nautical charts?

Charts will provide more information on “zone of confidence”

It is a major challenge, some might say an impossibility, to keep all of the roughly one thousand U.S. nautical charts up to date. But exactly how out of date is the chart data? Chart users will get a better idea now that Coast Survey is gradually rolling out a new chart feature called the zone of confidence, or “ZOC,” box. It will replace the source diagram that is currently on large-scale charts. Source diagrams, and now the improved ZOC, help mariners assess hydrographic survey data and the associated level of risk of navigating in a particular area.

The first charts to show the new ZOC box are 18622, 18682, 18754, and 11328. They were released on April 7.

Both source diagrams and ZOC diagrams consist of a graphic representation of the extents of hydrographic surveys within the chart and an accompanying table of related survey quality categories. Where the old source diagrams were based on inexact and sometimes subjective parameters, however, the new ZOC classifications are derived more consistently, using a combination of survey date, position accuracy, depth accuracy, and seafloor coverage (the survey’s ability to detect objects on the seafloor).

To see the zones of confidence on charts, look for the chart markings (A1, A2, B, C, and D) on the chart itself. Check the ZOC box (located on non-water portions of the chart) for the date of the data acquisition, the position accuracy, the depth accuracy, and characterization of the seafloor for each particular zone.

[Image: ZOC category table]
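For readers without the chart in hand, here is a rough sketch of how the quantitative part of those categories works. The fixed-plus-percentage values below follow the commonly published IHO CATZOC table, but they are illustrative only; the ZOC box printed on each chart is the authoritative reference.

```python
def zoc_depth_accuracy(category, depth_m):
    """Approximate allowable depth error (meters) for a ZOC category, using the
    'fixed + factor * depth' form of the commonly published IHO CATZOC table.
    Illustrative only; the ZOC box on the chart is the authoritative reference."""
    table = {
        "A1": (0.5, 0.01),  # full seafloor coverage; position accuracy about 5 m + 5% of depth
        "A2": (1.0, 0.02),  # full seafloor coverage; position accuracy about 20 m
        "B":  (1.0, 0.02),  # full coverage not achieved; position accuracy about 50 m
        "C":  (2.0, 0.05),  # full coverage not achieved; position accuracy about 500 m
    }
    if category not in table:
        raise ValueError("ZOC D and U carry no quantified accuracy")
    fixed, factor = table[category]
    return fixed + factor * depth_m

print(zoc_depth_accuracy("B", 30.0))  # 1.6 meters of allowable depth error in 30 m of water
```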

Why do users need a “zone of confidence”?

The age and accuracy of data on nautical charts can vary. Depth information on nautical charts, paper or digital, is based on data from the latest available hydrographic survey, which in many cases may be quite old. In too many cases, the data is more than 150 years old. Sometimes, particularly in Alaska, the depth measurements are so old that they may have originated from Captain Cook in 1778.

Mariners need to know if data is old. They need to understand the capabilities and the limitations of the chart. In particular, the mariner should understand that nautical chart data, especially when it is displayed on navigation systems and mobile apps, possesses inherent accuracy limitations.

Before the advent of GPS, the position accuracy of features on a paper chart was more than adequate to serve the mariner’s needs. Twenty years ago, mariners were typically obtaining position fixes using radar ranges, visual bearings, or Loran-C. Generally, these positioning methods were an order of magnitude less accurate than the horizontal accuracy of the survey information portrayed on the chart. Back then, Coast Survey cartographers were satisfied when a fix plotted from three lines of position formed an equilateral triangle whose sides were two millimeters long at a chart scale of 1:20,000. In real-world coordinates, the triangle would have 40-meter sides. Close enough!
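That 40-meter figure is simply the chart scale applied to the plotted distance; a quick sketch of the arithmetic (the function name is ours, for illustration):

```python
def chart_mm_to_meters(length_mm, scale_denominator):
    """Convert a distance measured on the chart (millimeters) to the
    corresponding ground distance (meters) at the given chart scale."""
    return length_mm / 1000.0 * scale_denominator

print(chart_mm_to_meters(2, 20_000))  # 40.0 -- the 2 mm triangle becomes 40 m on the ground
```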

Now, with GPS, charted locations that are off by 10 or 15 meters are not nearly close enough. Mariners expect now, just as they did 30 years ago, that their charts will be at least as accurate as the positioning system available to them. Unfortunately, charts based on data acquired with old survey technologies will never meet that expectation.

Source data is deficient by today’s standards

The overall accuracy of data portrayed on paper charts is a combination of the accuracy of the underlying source data and the accuracy of the chart compilation process. Most nautical charts are made up of survey data collected by various sources over a long time. A given chart might encompass one area that is based on a lead line and sextant hydrographic survey conducted in 1890, while another area of the same chart might have been surveyed in the year 2000 with a full-coverage shallow-water multibeam echo sounder.

In general, federal hydrographic surveys have used the highest standards, with the most accurate hydrographic survey instrumentation available at the time. On a 1:20,000-scale chart, for example, the survey data was required to be accurate to 15 meters. Features whose positions originate in the Local Notice to Mariners, reported by an unknown source, are usually charted with qualifying notations like position approximate (PA) or position doubtful (PD). The charted positions of these features, if they do exist, may be in error by miles.

Similarly, the shoreline found on most NOAA charts is based on photogrammetric or plane table surveys that are more than 30 years old.

Another component of chart accuracy involves the chart compilation process. Before NOAA’s suite of charts was scanned into raster format in 1994, all chart compilation was performed manually. Cartographers drew projection lines by hand and plotted features relative to these lines. They graphically reduced large-scale (high-detail) surveys or engineering drawings to chart scale. Very often, they referenced these drawings to state or local coordinate systems. The data would then be converted to the horizontal datum of the chart, e.g., the North American Datum of 1927 (NAD27) or the North American Datum of 1983 (NAD83). In the late 1980s and early 1990s, NOAA converted all of its charts to NAD83, using averaging techniques and re-drawing all of the projection lines manually.
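For a sense of what such a datum shift means in practice, here is a minimal modern sketch using the open-source pyproj library (not the tooling NOAA used at the time); the coordinates are illustrative, and the achievable accuracy depends on which transformation grids are installed:

```python
from pyproj import Transformer

# Hypothetical modern equivalent of the datum shift described above:
# convert a position from NAD27 (EPSG:4267) to NAD83 (EPSG:4269).
# Accuracy depends on the transformation grids pyproj has available.
to_nad83 = Transformer.from_crs("EPSG:4267", "EPSG:4269", always_xy=True)
lon83, lat83 = to_nad83.transform(-76.45, 38.32)  # illustrative point in the Chesapeake Bay
print(lat83, lon83)
```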

When NOAA scanned its charts and moved its cartographic production into a computer environment, cartographers noted variations between manually constructed projection lines and those that were computer-generated. They adjusted all of the raster charts so that the manual projection lines conformed to the computer-generated projection.

Many electronic chart positional discrepancies that are observed today originate from the past graphical chart compilation techniques. The manual application of survey data of varying scales to the fixed chart scale was a source of error that often introduced biases. Unfortunately, on any given chart, the magnitude and the direction of these discrepancies will vary in different areas of the chart. Therefore, no systematic adjustment can automatically improve chart accuracy.

Coast Survey is addressing the accuracy problem

NOAA’s suite of over a thousand nautical charts covers 95,000 miles of U.S. coastline and includes 3.4 million square nautical miles of U.S. jurisdiction within the Exclusive Economic Zone (an area that extends 200 nautical miles from shore). About half of the depth information found on NOAA charts is based on hydrographic surveys conducted before 1940. Surveys conducted with lead lines or single-beam echo sounders sampled only a small percentage of the ocean bottom. Due to technological constraints, hydrographers were unable to see between the sounding lines. Depending on the water depth, these lines may have been spaced at 50, 100, 200, or 400 meters. Today, as NOAA and its contractors re-survey areas and obtain full-bottom coverage, we routinely discover previously uncharted features (some of which are dangers to navigation). These features were either: 1) not detected on prior surveys; 2) man-made objects, like wrecks and obstructions, that have appeared on the ocean bottom since the prior survey; or 3) the result of natural changes that have occurred since the prior survey.
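To see why widely spaced single-beam lines cover so little of the bottom, consider a back-of-the-envelope estimate; the 8-degree beam width and the footprint formula are our assumptions for illustration, not a survey specification:

```python
import math

def singlebeam_coverage_fraction(depth_m, line_spacing_m, beam_width_deg=8.0):
    """Rough fraction of the bottom ensonified by parallel single-beam lines.
    Footprint diameter is approximated as 2 * depth * tan(beam_width / 2);
    this is a back-of-the-envelope illustration, not a survey specification."""
    footprint_m = 2.0 * depth_m * math.tan(math.radians(beam_width_deg) / 2.0)
    return min(footprint_m / line_spacing_m, 1.0)

# A 20 m deep area surveyed on 100 m line spacing with an 8-degree beam:
print(f"{singlebeam_coverage_fraction(20, 100):.1%}")  # roughly 3% of the bottom
```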

Coast Survey is also improving our chart production system. As NOAA developed its charts over the centuries, cartographers came to rely on separate sets of data: one set for traditional paper charts, and another for the modern electronic navigational charts. We are currently integrating a new charting system that will use one central database to produce all NOAA chart products. The new system streamlines production while improving performance, speeding new data and updates to every chart version of the same charted area and removing inconsistencies.

As always, NOAA asks chart users to let us know when you find an error on a NOAA chart. Just go to the discrepancy reporting system, give us your observation, and we will take it from there.

Posted April 8, 2016 by NOAA Office of Coast Survey in Cartography, Nautical charts
