TBD
## Overview
The production of the products listed here is composed of a set of steps, involving not only
the final products but also their ancillary data, their storage, and their visualization.
The final products are the last stage of a series of (pre-)processing steps applied to the
original data.
Original data must first be downloaded and stored; then we may have to adjust their signal
(_i.e._, photometry) and spatial distribution (_i.e._, projection).
Each processing stage the data sit at -- a data _level_ -- has an appropriate storage area
where all data alike are placed together.
At each stage, the metadata describing the state of the data is committed to a persistent database.
The internal _database_ -- not to be confused with the data _archive_ -- provides the
interface for querying data products.
The database provides a spatial index for efficient spatial queries and data access.
Visualization tools, for instance, query the database to discover the data products
matching criteria like spatial intersections or product attributes.

### Workflow
Our internal, long-term data store is meant to host the "science-ready" data, ready for
the production of final products: mosaics, terrain analysis (landing sites), and
annotations (map-making).
The processing applied to downloaded data to make it ready for the final products is
considered _pre-processing_.
We will use _processing_ -- in terms of the general workflow -- to refer to the production
of the final products from the (pre-processed) science-ready data.

_Search_ and _Download_ of source data are combined in one "source data" processing
routine.
The goal of this step is to download data products and their metadata, copy the data products
(images and cubes) to an "upstream" storage space, and insert the relevant metadata -- _e.g._,
product ID, local storage path -- into a persistent (spatial) database.

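A minimal sketch of what such a routine could look like is shown below; it assumes a
PostGIS-backed database, and the storage root, table name, and function are illustrative
assumptions, not the actual service code.

```python
# Hypothetical sketch of the "source data" routine; paths, table and function
# names are assumptions, not the actual NEANIAS service code.
from pathlib import Path

import psycopg2
import requests

UPSTREAM = Path("/data/upstream")  # assumed "upstream" storage root


def ingest(product_id: str, url: str, footprint_wkt: str, dsn: str) -> Path:
    """Download one source product, store it upstream, and register its metadata."""
    local_path = UPSTREAM / f"{product_id}.img"
    with requests.get(url, stream=True, timeout=60) as resp:
        resp.raise_for_status()
        with open(local_path, "wb") as fh:
            for chunk in resp.iter_content(chunk_size=1 << 20):
                fh.write(chunk)

    # Insert the relevant metadata into the persistent (spatial) database.
    with psycopg2.connect(dsn) as conn, conn.cursor() as cur:
        cur.execute(
            """INSERT INTO products (product_id, footprint, local_path, source_url)
               VALUES (%s, ST_GeomFromText(%s, 0), %s, %s)""",
            (product_id, footprint_wkt, str(local_path), url),
        )
    return local_path
```
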
Once the data is downloaded, _Pre-processing_ performs all the steps necessary to make the
data ready for further analysis/products (where the _NEANIAS services_ really are).
Typically, the data is radiometrically calibrated, map-projected, and converted to the target
file formats.
At the end of this step, the data is archived in a "reduced" storage space and the internal
database is updated with the current data product information
(projection, path-isis, path-tiff, processing logfile).
At this point, the (reduced) image data can be loaded and visualized in GIS tools,
for instance the MEEO/ADAM platform (eventually, we will make them public for download).

> GeoTIFF is one of the file formats in which data is archived at this level.
> The other file format is ISIS cubes, used internally for further processing (products).
>
> * Science-ready products are publishable.

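As an illustration only, a pre-processing run for a single image could be scripted along
these lines, assuming USGS ISIS and GDAL are available and the input has already been
imported as an ISIS cube; the exact programs and parameters are instrument-dependent and
are not prescribed by this document.

```python
# Illustrative pre-processing chain for a single, already-ingested ISIS cube.
# The ISIS programs below are examples (ctxcal is MRO/CTX-specific); the real
# chain depends on the instrument.
import subprocess
from pathlib import Path

REDUCED = Path("/data/reduced")  # assumed "reduced" storage root


def preprocess(raw_cube: Path) -> dict:
    """Radiometric calibration, map projection, and export to GeoTIFF."""
    cal = raw_cube.parent / (raw_cube.stem + ".cal.cub")
    proj = raw_cube.parent / (raw_cube.stem + ".map.cub")
    tiff = REDUCED / (raw_cube.stem + ".tif")

    subprocess.run(["spiceinit", f"from={raw_cube}"], check=True)
    subprocess.run(["ctxcal", f"from={raw_cube}", f"to={cal}"], check=True)
    subprocess.run(["cam2map", f"from={cal}", f"to={proj}"], check=True)
    # GDAL reads ISIS cubes directly; export a GeoTIFF for GIS tools.
    subprocess.run(["gdal_translate", "-of", "GTiff", str(proj), str(tiff)], check=True)

    # Information to be written back to the internal database.
    return {"path-isis": str(proj), "path-tiff": str(tiff), "logfile": f"{raw_cube}.log"}
```
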
The final _Processing_ makes use of the _reduced_ data: fundamentally, the Planetary services
offer users the base of reduced data for selection and further processing
(see [Products](#products)).
Final products are archived in a "products" storage space together with the corresponding
metadata (product-id, paths, logfile, source data, date, user/ip).
Eventually, this workflow ends at a data publication interface.
All user-level products (final products or reduced images plus their metadata) are publishable,
either by direct download of the GeoTIFF products or through access via OGC services.

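For the OGC route, client-side access could look like the following OWSLib sketch; the
endpoint, layer name, and output format are placeholders, not actual NEANIAS addresses.

```python
# Sketch of OGC (WMS) access to published products; the endpoint, layer name
# and output format are placeholders, not actual NEANIAS addresses.
from owslib.wms import WebMapService

wms = WebMapService("https://example.org/ogc/wms", version="1.3.0")  # placeholder URL
img = wms.getmap(
    layers=["mars_ctx_mosaic"],      # placeholder layer name
    styles=[""],                     # default style
    srs="EPSG:4326",
    bbox=(-5.0, 20.0, -4.0, 21.0),   # lon/lat region of interest
    size=(1024, 1024),
    format="image/geotiff",          # depends on what the server offers
)
with open("subset.tif", "wb") as fh:
    fh.write(img.read())
```
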
- - -
Metadata (DB):
* product ID
* footprint (geometry)
* local path
* source URL
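
As a sketch only, the corresponding database table could be shaped as follows (PostGIS
assumed; the table and column names are illustrative, derived from the fields above):

```python
# Illustrative DDL for the metadata table (PostGIS assumed; names are examples
# derived from the fields listed above, not the project's actual schema).
import psycopg2

DDL = """
CREATE TABLE IF NOT EXISTS products (
    product_id  text PRIMARY KEY,
    footprint   geometry(Polygon),  -- product footprint; use an SRID suited to the target body
    local_path  text,               -- path inside the upstream/reduced storage
    source_url  text                -- where the original product was downloaded from
);
-- Spatial (GiST) index for efficient footprint-intersection queries.
CREATE INDEX IF NOT EXISTS products_footprint_gist ON products USING GIST (footprint);
"""

with psycopg2.connect("dbname=neanias") as conn, conn.cursor() as cur:  # placeholder DSN
    cur.execute(DDL)
```
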
- - -
* Search data products
* Download data products
* Store in "`upstream`" path
* Store GeoTIFF products
* Store data/metadata
* Write DB
* Visualize
#### Search
Search for data products intersecting a bounding-box.
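
A hedged sketch of such a query, reusing the illustrative `products` table shown above
(PostGIS assumed):

```python
# Sketch of a bounding-box search against the internal (spatial) database;
# it assumes the illustrative `products` table shown earlier.
import psycopg2

def search_bbox(minx, miny, maxx, maxy, dsn):
    """Return (product_id, local_path) of products intersecting the bounding box."""
    with psycopg2.connect(dsn) as conn, conn.cursor() as cur:
        cur.execute(
            """SELECT product_id, local_path
                 FROM products
                WHERE ST_Intersects(footprint, ST_MakeEnvelope(%s, %s, %s, %s, 0))""",
            (minx, miny, maxx, maxy),
        )
        return cur.fetchall()
```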