<?xml version="1.0"?>
<feed xmlns="http://www.w3.org/2005/Atom" xml:lang="en">
		<id>https://www.organicdatascience.org/gpf/api.php?action=feedcontributions&amp;feedformat=atom&amp;user=Suzanne</id>
		<title>Geoscience Paper of the Future - User contributions [en]</title>
		<link rel="self" type="application/atom+xml" href="https://www.organicdatascience.org/gpf/api.php?action=feedcontributions&amp;feedformat=atom&amp;user=Suzanne"/>
		<link rel="alternate" type="text/html" href="https://www.organicdatascience.org/gpf/index.php/Special:Contributions/Suzanne"/>
		<updated>2026-04-05T16:30:03Z</updated>
		<subtitle>User contributions</subtitle>
		<generator>MediaWiki 1.24.1</generator>

	<entry>
		<id>https://www.organicdatascience.org/gpf/index.php?title=Main_Page&amp;diff=11716</id>
		<title>Main Page</title>
		<link rel="alternate" type="text/html" href="https://www.organicdatascience.org/gpf/index.php?title=Main_Page&amp;diff=11716"/>
				<updated>2015-04-03T18:51:12Z</updated>
		
		<summary type="html">&lt;p&gt;Suzanne: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&lt;br /&gt;
The '''Geoscience Paper of the Future (GPF)''' activity aims to demonstrate how papers will be published in the future, going beyond a PDF format and including software, datasets, and workflow all published in open and accessible ways that make the paper transparent, reproducible, and machine indexable.  We refer to such a paper as a geoscience paper of the future, or GPF for short.&lt;br /&gt;
&lt;br /&gt;
== Quick Links ==&lt;br /&gt;
* [[Hold_regular_telecons#Call_Time_and_Access_Codes | Telecon information]]&lt;br /&gt;
* [[Document_GPF_activities | Task descriptions and training materials]]&lt;br /&gt;
* [[Plan_overall_timeline | Overall timeline]]&lt;br /&gt;
&lt;br /&gt;
{| class=&amp;quot;wikitable&amp;quot; style=&amp;quot;color:black; background-color:#ffffcc;&amp;quot; cellpadding=&amp;quot;10&amp;quot;&lt;br /&gt;
|style=&amp;quot;width: 10%&amp;quot; |'''Name'''&lt;br /&gt;
|style=&amp;quot;width: 20%&amp;quot; |'''Affiliation'''&lt;br /&gt;
|style=&amp;quot;width: 20%&amp;quot; |'''Research Area'''&lt;br /&gt;
|style=&amp;quot;width: 50%&amp;quot; |'''Topic of the paper'''&lt;br /&gt;
|-&lt;br /&gt;
| Cedric David&lt;br /&gt;
| NASA Jet Propulsion Laboratory&lt;br /&gt;
| Hydrology and river modeling&lt;br /&gt;
| [[Document_GPF_activities_by_Cedric_David | River Network Routing on the NHDPlus Dataset]]&lt;br /&gt;
|-&lt;br /&gt;
| Ibrahim Demir&lt;br /&gt;
| University of Iowa&lt;br /&gt;
| Hydrology&lt;br /&gt;
| [[Document_GPF_activities_by_Ibrahim_Demir | Optimization and evaluation of hydrological network representation techniques for fast access and query in web-based system]]&lt;br /&gt;
|-&lt;br /&gt;
| Wally Fulweiler&lt;br /&gt;
| Boston University&lt;br /&gt;
| Coastal marine ecosystems and biogeochemistry&lt;br /&gt;
| [[Document_GPF_activities_by_Wally_Fulweiler | A long-term data set of direct sediment N2 fluxes in a temperate estuary in Rhode Island]]&lt;br /&gt;
|-&lt;br /&gt;
|Leif Karlstrom &amp;amp; Lay Kuan Loh&lt;br /&gt;
| University of Oregon &amp;amp; Carnegie Mellon University&lt;br /&gt;
| Volcanology and fluid mechanics&lt;br /&gt;
| [[Document_GPF_activities_by_Leif_Karlstrom | Spectral clustering of spatial point clouds for volcanic vents and associated attributes]]&lt;br /&gt;
|-&lt;br /&gt;
| Kyo Lee&lt;br /&gt;
| NASA Jet Propulsion Laboratory&lt;br /&gt;
| Climate model evaluation&lt;br /&gt;
| [[Document_GPF_activities_by_Kyo_Lee | Evaluation of simulated temperature, precipitation, cloud fraction and insolation over the conterminous United States using Regional Climate Model Evaluation System]]&lt;br /&gt;
|-&lt;br /&gt;
| Kim Miller&lt;br /&gt;
| Columbia University/Lamont Observatory&lt;br /&gt;
| Earth surface processes&lt;br /&gt;
| [[Document_GPF_activities_by_Kim_Miller | Effects of intermittency on delta dynamics]]&lt;br /&gt;
|-&lt;br /&gt;
| Heath Mills&lt;br /&gt;
| University of Houston Clear Lake&lt;br /&gt;
| Marine geomicrobiology&lt;br /&gt;
| [[Document_GPF_activities_by_Heath_Mills | Molecular Characterization of Water Column Microbial Populations within the Northern Gulf of Mexico Hypoxic Zone]]&lt;br /&gt;
|-&lt;br /&gt;
| Ji-Hyun Oh&lt;br /&gt;
| NASA Jet Propulsion Laboratory&lt;br /&gt;
| Tropical meteorology&lt;br /&gt;
| [[Document_GPF_activities_by_Ji-Hyun_Oh | Tools for computing momentum budget for the westerly wind event associated with the Madden-Julian Oscillation]]&lt;br /&gt;
|-&lt;br /&gt;
| Suzanne Pierce&lt;br /&gt;
| Texas Advanced Computing Center, The University of Texas at Austin&lt;br /&gt;
| Energy and Earth Resources; Groundwater&lt;br /&gt;
| [[Document_GPF_activities_by_Suzanne_Pierce | MCSDSS: A data fusion and integration platform for dynamic decision support and interactive science visualization ]]&lt;br /&gt;
|-&lt;br /&gt;
| Allen Pope&lt;br /&gt;
| NSIDC&lt;br /&gt;
| Polar sciences&lt;br /&gt;
| [[Document_GPF_activities_by_Allen_Pope | Source Data and MATLAB Code for Estimating and Evaluating Supraglacial Lake Depth With Landsat 8 and other Multispectral Sensors]]&lt;br /&gt;
|-&lt;br /&gt;
| Jordan Read &amp;amp; Luke Winslow&lt;br /&gt;
| US Geological Survey&lt;br /&gt;
| Ecology and physical limnology&lt;br /&gt;
| [[Document_GPF_activities_by_Jordan_Read | Lake catchment modeling]]&lt;br /&gt;
|-&lt;br /&gt;
| Mimi Tzeng&lt;br /&gt;
| Dauphin Island Sea Lab&lt;br /&gt;
| Physical Oceanography&lt;br /&gt;
| [[Document_GPF_activities_by_Mimi_Tzeng | Fisheries Oceanography of Coastal Alabama (FOCAL): A Subset of a Time-Series of Hydrographic and Current Data from a Permanent Moored Station Outside Mobile Bay (13 Apr to 18 May 2011)]]&lt;br /&gt;
|-&lt;br /&gt;
| Sandra Villamizar&lt;br /&gt;
| University of California Merced&lt;br /&gt;
| River ecohydrology&lt;br /&gt;
| [[Document_GPF_activities_by_Sandra_Villamizar | Using whole stream metabolism to assess the response of river ecosystems to flow disturbance events - The case of the San Joaquin River restoration effort]]&lt;br /&gt;
|-&lt;br /&gt;
| Xuan Yu&lt;br /&gt;
| University of Delaware&lt;br /&gt;
| Hydrogeology&lt;br /&gt;
| [[Document_GPF_activities_by_Xuan_Yu | A semidiscrete finite volume formulation for multiprocess watershed simulation]]&lt;br /&gt;
|-&lt;br /&gt;
| Jon Goodall and Bakinam Essawy&lt;br /&gt;
| University of Virginia&lt;br /&gt;
| Hydrology&lt;br /&gt;
| [[Document_GPF_activities_by_Jon_Goodall | Hydrology data preprocessing workflows using data grids and Docker]]&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
== Using this Wiki ==&lt;br /&gt;
'''Learn the basics of how wikis work''': &lt;br /&gt;
&lt;br /&gt;
* [[Quick_Guide_to_Using_a_Wiki | Quick Guide to Using a Wiki]]&lt;br /&gt;
* A detailed [//meta.wikimedia.org/wiki/Help:Contents User's Guide].&lt;br /&gt;
&lt;br /&gt;
'''Some tips''':&lt;br /&gt;
* You must be logged in before you can edit&lt;br /&gt;
* You can preview your changes before saving, look for the button at the bottom of the edit page&lt;br /&gt;
* Add text or edit only above the line that says &lt;br /&gt;
&lt;br /&gt;
  &amp;lt;nowiki&amp;gt;&amp;lt;!-- Add any wiki Text above this Line --&amp;gt;&lt;br /&gt;
&lt;br /&gt;
  &amp;lt;!-- Do NOT Edit below this Line --&amp;gt;&amp;lt;/nowiki&amp;gt;&lt;br /&gt;
&lt;br /&gt;
* You can see all the prior versions of a page in the &amp;quot;history&amp;quot; tab at the top right of the page&lt;br /&gt;
&lt;br /&gt;
== Acknowledgments ==&lt;br /&gt;
&lt;br /&gt;
This activity is organized by the [http://www.geosoft-earthcube.org GeoSoft project] as part of the [http://www.earthcube.org EarthCube initiative] of the US National Science Foundation.&lt;/div&gt;</summary>
		<author><name>Suzanne</name></author>	</entry>

	<entry>
		<id>https://www.organicdatascience.org/gpf/index.php?title=Main_Page&amp;diff=11715</id>
		<title>Main Page</title>
		<link rel="alternate" type="text/html" href="https://www.organicdatascience.org/gpf/index.php?title=Main_Page&amp;diff=11715"/>
				<updated>2015-04-03T18:49:48Z</updated>
		
		<summary type="html">&lt;p&gt;Suzanne: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&lt;br /&gt;
The '''Geoscience Paper of the Future (GPF)''' activity aims to demonstrate how papers will be published in the future, going beyond a PDF format and including software, datasets, and workflow all published in open and accessible ways that make the paper transparent, reproducible, and machine indexable.  We refer to such a paper as a geoscience paper of the future, or GPF for short.&lt;br /&gt;
&lt;br /&gt;
== Quick Links ==&lt;br /&gt;
* [[Hold_regular_telecons#Call_Time_and_Access_Codes | Telecon information]]&lt;br /&gt;
* [[Document_GPF_activities | Task descriptions and training materials]]&lt;br /&gt;
* [[Plan_overall_timeline | Overall timeline]]&lt;br /&gt;
&lt;br /&gt;
{| class=&amp;quot;wikitable&amp;quot; style=&amp;quot;color:black; background-color:#ffffcc;&amp;quot; cellpadding=&amp;quot;10&amp;quot;&lt;br /&gt;
|style=&amp;quot;width: 10%&amp;quot; |'''Name'''&lt;br /&gt;
|style=&amp;quot;width: 20%&amp;quot; |'''Affiliation'''&lt;br /&gt;
|style=&amp;quot;width: 20%&amp;quot; |'''Research Area'''&lt;br /&gt;
|style=&amp;quot;width: 50%&amp;quot; |'''Topic of the paper'''&lt;br /&gt;
|-&lt;br /&gt;
| Cedric David&lt;br /&gt;
| NASA Jet Propulsion Laboratory&lt;br /&gt;
| Hydrology and river modeling&lt;br /&gt;
| [[Document_GPF_activities_by_Cedric_David | River Network Routing on the NHDPlus Dataset]]&lt;br /&gt;
|-&lt;br /&gt;
| Ibrahim Demir&lt;br /&gt;
| University of Iowa&lt;br /&gt;
| Hydrology&lt;br /&gt;
| [[Document_GPF_activities_by_Ibrahim_Demir | Optimization and evaluation of hydrological network representation techniques for fast access and query in web-based system]]&lt;br /&gt;
|-&lt;br /&gt;
| Wally Fulweiler&lt;br /&gt;
| Boston University&lt;br /&gt;
| Coastal marine ecosystems and biogeochemistry&lt;br /&gt;
| [[Document_GPF_activities_by_Wally_Fulweiler | A long-term data set of direct sediment N2 fluxes in a temperate estuary in Rhode Island]]&lt;br /&gt;
|-&lt;br /&gt;
|Leif Karlstrom &amp;amp; Lay Kuan Loh&lt;br /&gt;
| University of Oregon &amp;amp; Carnegie Mellon University&lt;br /&gt;
| Volcanology and fluid mechanics&lt;br /&gt;
| [[Document_GPF_activities_by_Leif_Karlstrom | Spectral clustering of spatial point clouds for volcanic vents and associated attributes]]&lt;br /&gt;
|-&lt;br /&gt;
| Kyo Lee&lt;br /&gt;
| NASA Jet Propulsion Laboratory&lt;br /&gt;
| Climate model evaluation&lt;br /&gt;
| [[Document_GPF_activities_by_Kyo_Lee | Evaluation of simulated temperature, precipitation, cloud fraction and insolation over the conterminous United States using Regional Climate Model Evaluation System]]&lt;br /&gt;
|-&lt;br /&gt;
| Kim Miller&lt;br /&gt;
| Columbia University/Lamont Observatory&lt;br /&gt;
| Earth surface processes&lt;br /&gt;
| [[Document_GPF_activities_by_Kim_Miller | Effects of intermittency on delta dynamics]]&lt;br /&gt;
|-&lt;br /&gt;
| Heath Mills&lt;br /&gt;
| University of Houston Clear Lake&lt;br /&gt;
| Marine geomicrobiology&lt;br /&gt;
| [[Document_GPF_activities_by_Heath_Mills | Molecular Characterization of Water Column Microbial Populations within the Northern Gulf of Mexico Hypoxic Zone]]&lt;br /&gt;
|-&lt;br /&gt;
| Ji-Hyun Oh&lt;br /&gt;
| NASA Jet Propulsion Laboratory&lt;br /&gt;
| Tropical meteorology&lt;br /&gt;
| [[Document_GPF_activities_by_Ji-Hyun_Oh | Tools for computing momentum budget for the westerly wind event associated with the Madden-Julian Oscillation]]&lt;br /&gt;
|-&lt;br /&gt;
| Suzanne Pierce&lt;br /&gt;
| The University of Texas at Austin&lt;br /&gt;
| Hydrogeology&lt;br /&gt;
| [[Document_GPF_activities_by_Suzanne_Pierce | MCSDSS: A data fusion and integration platform for dynamic decision support and interactive science visualization ]]&lt;br /&gt;
|-&lt;br /&gt;
| Allen Pope&lt;br /&gt;
| NSIDC&lt;br /&gt;
| Polar sciences&lt;br /&gt;
| [[Document_GPF_activities_by_Allen_Pope | Source Data and MATLAB Code for Estimating and Evaluating Supraglacial Lake Depth With Landsat 8 and other Multispectral Sensors]]&lt;br /&gt;
|-&lt;br /&gt;
| Jordan Read &amp;amp; Luke Winslow&lt;br /&gt;
| US Geological Survey&lt;br /&gt;
| Ecology and physical limnology&lt;br /&gt;
| [[Document_GPF_activities_by_Jordan_Read | Lake catchment modeling]]&lt;br /&gt;
|-&lt;br /&gt;
| Mimi Tzeng&lt;br /&gt;
| Dauphin Island Sea Lab&lt;br /&gt;
| Physical Oceanography&lt;br /&gt;
| [[Document_GPF_activities_by_Mimi_Tzeng | Fisheries Oceanography of Coastal Alabama (FOCAL): A Subset of a Time-Series of Hydrographic and Current Data from a Permanent Moored Station Outside Mobile Bay (13 Apr to 18 May 2011)]]&lt;br /&gt;
|-&lt;br /&gt;
| Sandra Villamizar&lt;br /&gt;
| University of California Merced&lt;br /&gt;
| River ecohydrology&lt;br /&gt;
| [[Document_GPF_activities_by_Sandra_Villamizar | Using whole stream metabolism to assess the response of river ecosystems to flow disturbance events - The case of the San Joaquin River restoration effort]]&lt;br /&gt;
|-&lt;br /&gt;
| Xuan Yu&lt;br /&gt;
| University of Delaware&lt;br /&gt;
| Hydrogeology&lt;br /&gt;
| [[Document_GPF_activities_by_Xuan_Yu | A semidiscrete finite volume formulation for multiprocess watershed simulation]]&lt;br /&gt;
|-&lt;br /&gt;
| Jon Goodall and Bakinam Essawy&lt;br /&gt;
| University of Virginia&lt;br /&gt;
| Hydrology&lt;br /&gt;
| [[Document_GPF_activities_by_Jon_Goodall | Hydrology data preprocessing workflows using data grids and Docker]]&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
== Using this Wiki ==&lt;br /&gt;
'''Learn the basics of how wikis work''': &lt;br /&gt;
&lt;br /&gt;
* [[Quick_Guide_to_Using_a_Wiki | Quick Guide to Using a Wiki]]&lt;br /&gt;
* A detailed [//meta.wikimedia.org/wiki/Help:Contents User's Guide].&lt;br /&gt;
&lt;br /&gt;
'''Some tips''':&lt;br /&gt;
* You must be logged in before you can edit&lt;br /&gt;
* You can preview your changes before saving, look for the button at the bottom of the edit page&lt;br /&gt;
* Add text or edit only above the line that says &lt;br /&gt;
&lt;br /&gt;
  &amp;lt;nowiki&amp;gt;&amp;lt;!-- Add any wiki Text above this Line --&amp;gt;&lt;br /&gt;
&lt;br /&gt;
  &amp;lt;!-- Do NOT Edit below this Line --&amp;gt;&amp;lt;/nowiki&amp;gt;&lt;br /&gt;
&lt;br /&gt;
* You can see all the prior versions of a page in the &amp;quot;history&amp;quot; tab at the top right of the page&lt;br /&gt;
&lt;br /&gt;
== Acknowledgments ==&lt;br /&gt;
&lt;br /&gt;
This activity is organized by the [http://www.geosoft-earthcube.org GeoSoft project] as part of the [http://www.earthcube.org EarthCube initiative] of the US National Science Foundation.&lt;/div&gt;</summary>
		<author><name>Suzanne</name></author>	</entry>

	<entry>
		<id>https://www.organicdatascience.org/gpf/index.php?title=Suzanne_Pierce_should_make_software_executable_by_others&amp;diff=11714</id>
		<title>Suzanne Pierce should make software executable by others</title>
		<link rel="alternate" type="text/html" href="https://www.organicdatascience.org/gpf/index.php?title=Suzanne_Pierce_should_make_software_executable_by_others&amp;diff=11714"/>
				<updated>2015-04-03T18:44:39Z</updated>
		
		<summary type="html">&lt;p&gt;Suzanne: Set PropertyValue: Progress = 80&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;[[Category:Task]]&lt;br /&gt;
&amp;lt;br/&amp;gt;&amp;lt;b&amp;gt;Details on how to do this task:&amp;lt;/b&amp;gt; [[Make software executable by others]]&amp;lt;br/&amp;gt;&amp;lt;br/&amp;gt;&lt;br /&gt;
Installing, Using, and Extending MCSDSS&lt;br /&gt;
&lt;br /&gt;
The MCSDSS has been developed so that it is relatively easy to update the data used by the current data-interactive components and the visual content. Extending the application, however, by adding new data to the map or wiring up additional map tile services, is a more involved procedure that requires some knowledge of JavaScript and current web development workflows. Familiarity with the AngularJS framework is beneficial but not strictly required. The MCSDSS application must be recompiled after any changes are made to the code base. This document gives step-by-step instructions for extending the mapping services and adding additional geodata, and points to third-party resources that describe further modifications to the map capabilities in detail. To get started, launch your IDE or text editor of choice and open the file located at //eaa-aquiferium/app/scripts/directives/leaflet-directive.js.&lt;br /&gt;
&lt;br /&gt;
This file contains the code for the entirety of the mapping capability within Aquiferium. It is an Angular directive, which is essentially a way for JavaScript developers to create new or customized components that can be added to a web application using simple HTML tag syntax. In this case, if you open the file located at //eaa-aquiferium/app/views/geography.html and look at line 3, you will see the following HTML code:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;leaflet-map id=&amp;quot;map&amp;quot; class=&amp;quot;div-fixed z-100&amp;quot; display-recharge-panel=&amp;quot;displayRechargePanel()&amp;quot; display-wells-panel=&amp;quot;displayWellsPanel()&amp;quot; display-springs-panel=&amp;quot;displaySpringsPanel()&amp;quot;&amp;gt;&amp;lt;/leaflet-map&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Through the use of the Angular directive, any web app developer with access to your directive code file can use it in their project: simply include it in the framework as a dependency and use the above HTML code snippet alongside the appropriate CSS code. Most of the attributes in the snippet tell the Angular framework how to respond to links specific to the Aquiferium. A developer who only wants the mapping capability can therefore use the following simplified code snippet:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;leaflet-map id=&amp;quot;map&amp;quot;&amp;gt;&amp;lt;/leaflet-map&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Note: All the above code snippets require the CSS code below to also be present within the web application (the web developer can customize the CSS to their needs):&lt;br /&gt;
&lt;br /&gt;
#map {&lt;br /&gt;
 width: 100%;&lt;br /&gt;
 height: 100%;&lt;br /&gt;
 top: 5%; /* Offsets navigation bar */&lt;br /&gt;
 right: 0;&lt;br /&gt;
 bottom: 0;&lt;br /&gt;
 left: 0;&lt;br /&gt;
}&lt;br /&gt;
&lt;br /&gt;
The inclusion of these three pieces (the directive file, the HTML code, and the corresponding CSS code) is all that is required to repurpose the mapping capabilities in Aquiferium into any other AngularJS application.&lt;br /&gt;
&lt;br /&gt;
'''Preparing the Codebase for Development'''&lt;br /&gt;
The process for recompiling the MCSDSS codebase is relatively straightforward, but it requires that the entire codebase be downloaded, including all assets used by the system (these are currently bundled into the code base) plus any new assets, such as new geodata, required by the extensions being made. You will also need to set up the appropriate development environment to compile the application. The Aquiferium codebase requires the installation of Ruby, NodeJS, Compass, and Yeoman, as well as the project dependencies that will be installed during the compilation process. The following steps detail how to configure the development environment for compilation of the code base.&lt;br /&gt;
&lt;br /&gt;
Step 1: Install Ruby, Compass, NodeJS, and Yeoman&lt;br /&gt;
&lt;br /&gt;
Links to the necessary installation files and instructions for these resources are included in the MCSDSS Technology Stack. NodeJS must be installed before Compass or Yeoman, because the npm package-management utility used during their installation is included with NodeJS.&lt;br /&gt;
&lt;br /&gt;
The recommended installation sequence is as follows: &lt;br /&gt;
&lt;br /&gt;
1. Ruby&lt;br /&gt;
&lt;br /&gt;
2. NodeJS + NPM&lt;br /&gt;
&lt;br /&gt;
3. Compass&lt;br /&gt;
&lt;br /&gt;
4. Yeoman&lt;br /&gt;
&lt;br /&gt;
Both Ruby and NodeJS + NPM provide straightforward installers that handle the process for you. During the installation of Ruby, it is recommended that you select the 'add Ruby to your System Path' option. Once both Ruby and NodeJS are installed, open a command prompt and execute the following commands to prepare the system and install Compass and Yeoman:&lt;br /&gt;
&lt;br /&gt;
&amp;gt; gem update --system&lt;br /&gt;
&lt;br /&gt;
&amp;gt; gem install compass&lt;br /&gt;
&lt;br /&gt;
&amp;gt; npm install -g yo&lt;br /&gt;
&lt;br /&gt;
Note: There is a known SSL issue with updating RubyGems that can prevent the compass gem from installing properly. Should you encounter that error, see this guide for the solution:&lt;br /&gt;
&lt;br /&gt;
https://gist.github.com/luislavena/f064211759ee0f806c88&lt;br /&gt;
&lt;br /&gt;
Step 2: Install Code Dependencies and Packages&lt;br /&gt;
&lt;br /&gt;
Once you have completed the installation of the utilities required for the development workflow, you need to clone a copy of the online repository to your local system. The clone should result in a directory named eaa-aquiferium wherever you chose to clone the repo. You should then navigate to the root directory of the code base (../eaa-aquiferium) and open a command prompt in this location.&lt;br /&gt;
&lt;br /&gt;
On the command line, type the following command to install all Node packages required by the Aquiferium:&lt;br /&gt;
&lt;br /&gt;
&amp;gt; npm install&lt;br /&gt;
&lt;br /&gt;
Note: This process may take several minutes, as Aquiferium has a large number of package dependencies to download and install. It is also a good idea to update all the packages required by the project to ensure they are at the latest or specified versions. To do this, run the following command:&lt;br /&gt;
&lt;br /&gt;
&amp;gt; npm update&lt;br /&gt;
&lt;br /&gt;
Now we need to install all the library dependencies required by MCSDSS. To do this, type the following command:&lt;br /&gt;
&lt;br /&gt;
&amp;gt; bower install&lt;br /&gt;
&lt;br /&gt;
If the bower install command exits without creating and installing the dependencies in the folder located at ./eaa-aquiferium/app/bower_components, check the console output to see whether you received an ENOENT error for any of the packages you were trying to install; if there is an error, it will most likely be the last one listed in the console. If so, open the file located at ./eaa-aquiferium/package.json and look for the entry matching the package that gave the ENOENT error. Delete it from the file, ensuring that the empty line is also removed and that any trailing comma is removed if this changes the last entry in the &amp;quot;dependencies&amp;quot;: {} object. Take note of which dependency had to be removed, in case it needs to be reinstalled with npm or bower at a later time. Once you have finished installing the bower dependencies, run the following command from inside ./eaa-aquiferium to execute the gruntfile.js taskrunner and ensure the project is fully set up and ready for active local development:&lt;br /&gt;
&lt;br /&gt;
&amp;gt; grunt serve&lt;br /&gt;
&lt;br /&gt;
The gruntfile.js taskrunner completes a myriad of operations on the code base, prepares it for deployment, and launches a local web server with the application running inside it for local development and testing. At this point, you should be able to use the Aquiferium exactly like the deployed online version. You can now begin updating or extending the application as described in the following sections.&lt;br /&gt;
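The package.json clean-up described above can also be scripted. The following is a minimal sketch, not part of the Aquiferium codebase, that removes a failing package from the parsed manifest; because JSON.parse/JSON.stringify re-serialize the whole file, trailing commas and stray blank lines take care of themselves (the manifest contents below are illustrative only):

```javascript
// Sketch: remove a failing npm/bower dependency from package.json text.
// Re-serializing with JSON.stringify means trailing commas and blank
// lines are handled automatically.
function removeDependency(manifestText, pkgName) {
  const manifest = JSON.parse(manifestText);
  for (const section of ['dependencies', 'devDependencies']) {
    // delete is a no-op if the section or key is absent
    if (manifest[section]) delete manifest[section][pkgName];
  }
  return JSON.stringify(manifest, null, 2) + '\n';
}

// Hypothetical manifest; package names and versions are examples only.
const before = JSON.stringify({
  name: 'eaa-aquiferium',
  dependencies: { 'broken-pkg': '^1.0.0', leaflet: '^0.7.3' }
});
const after = removeDependency(before, 'broken-pkg');
console.log(Object.keys(JSON.parse(after).dependencies)); // → [ 'leaflet' ]
```

Remember to note which dependency you removed, exactly as described above, so it can be reinstalled later.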
&lt;br /&gt;
[[Adding New Map Tile Services]]&lt;br /&gt;
&lt;br /&gt;
Aquiferium already includes the most popular map tiling services, with the exception of Google Maps. Due to the complexity of integrating Google Maps into the Aquiferium application, and because its functionality as a tile service is duplicated by Open Street Maps, it was determined that the integration was not worth the effort.&lt;br /&gt;
&lt;br /&gt;
Should the inclusion of an additional map tiling service be necessary, the following code examples demonstrate how to extend the map tiling services available within Aquiferium.&lt;br /&gt;
&lt;br /&gt;
On line 85, insert the following code:&lt;br /&gt;
&lt;br /&gt;
var newMapService_Link = '&amp;lt;a href=&amp;quot;http://www.newmapservice.com/&amp;quot;&amp;gt;New Map Service&amp;lt;/a&amp;gt;';&lt;br /&gt;
&lt;br /&gt;
var newMapService_Url = 'http://newMapTileServer/tile/{z}/{y}/{x}';&lt;br /&gt;
&lt;br /&gt;
var newMapService_Attrib = '&amp;amp;copy; ' + newMapService_Link;&lt;br /&gt;
&lt;br /&gt;
var newMapService_Map = L.tileLayer(newMapService_Url, {&lt;br /&gt;
&lt;br /&gt;
 attribution: newMapService_Attrib&lt;br /&gt;
&lt;br /&gt;
});&lt;br /&gt;
&lt;br /&gt;
On line 520 the baseLayers object for the map is defined. Insert the following code inside this object:&lt;br /&gt;
&lt;br /&gt;
'newMapService_Map': newMapService_Map,&lt;br /&gt;
&lt;br /&gt;
Note: Take care to include a trailing comma after this snippet if it is not the last map service listed in the baseLayers object, and to omit the trailing comma if it is the last one.&lt;br /&gt;
&lt;br /&gt;
Note: The order in which the services are listed here is the order they will appear on the map for selection.&lt;br /&gt;
&lt;br /&gt;
Note: You should replace all instances of 'newMapService' in the code above with a unique name that describes the map service in some identifiable way (e.g. customMapboxTiles).&lt;br /&gt;
&lt;br /&gt;
That concludes the code changes required to extend the map tiling services available for display in Aquiferium. The application must be recompiled to reflect the changes and complete the process. The procedure for recompiling the application is detailed later in this document.&lt;br /&gt;
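As an illustration, here are the variables above filled in for a concrete service. OpenTopoMap is used purely as an example, the link string is kept as plain text rather than an HTML anchor, and a one-line stand-in for Leaflet's L.tileLayer lets the sketch run outside the browser (inside Aquiferium the real Leaflet global replaces it):

```javascript
// Stand-in for Leaflet's L.tileLayer so this sketch runs standalone;
// it just records the URL template and options it was given.
const L = { tileLayer: (url, opts) => ({ url, opts }) };

// The four variables from the directive, filled in for an example service.
var openTopo_Link = 'OpenTopoMap'; // normally an HTML anchor string
var openTopo_Url = 'https://tile.opentopomap.org/{z}/{x}/{y}.png';
var openTopo_Attrib = 'Map data: ' + openTopo_Link;
var openTopo_Map = L.tileLayer(openTopo_Url, {
  attribution: openTopo_Attrib
});

// The corresponding entry inside the baseLayers object (around line 520);
// the display name key is what appears in the layer selector.
var baseLayers = {
  'OpenTopoMap': openTopo_Map
};
console.log(baseLayers['OpenTopoMap'].opts.attribution);
```

The key string ('OpenTopoMap' here) is the label shown in the map's layer-selection control, which is why the listing order in baseLayers controls the display order noted above.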
&lt;br /&gt;
[[Adding New Geodata]]&lt;br /&gt;
&lt;br /&gt;
In order to add additional geodata layers (in this case geojson data), you will need to duplicate and modify code within the directive in several places. Additionally, you will need to already have the geojson data you want to include on the map, preferably optimized for web use and converted to the WGS84 projection.&lt;br /&gt;
&lt;br /&gt;
[[Instructions for the conversion process are as follows.]]&lt;br /&gt;
&lt;br /&gt;
Note: If additional shapefile optimization is required, that process can be found described at this URL: &lt;br /&gt;
&lt;br /&gt;
http://blog.thematicmapping.org/2012/10/mapping-regions-of-new-zealand-with.html&lt;br /&gt;
&lt;br /&gt;
'''Step 1: GDAL Installation'''&lt;br /&gt;
&lt;br /&gt;
In order to convert most geodata files into geoJSON you will need the GDAL utility. GDAL stands for Geospatial Data Abstraction Library; it is a suite of tools for manipulating, editing, and transforming geodata into and between almost any format used by GIS systems today. More information about GDAL's capabilities can be found at http://www.gdal.org/index.html, and GDAL itself can be downloaded from http://trac.osgeo.org/gdal/wiki/DownloadingGdalBinaries. If you plan on doing regular manipulation of geodata files and are working in a Windows environment, it is recommended instead that you download and install the OSGeo4W suite of tools, which includes GDAL, from http://trac.osgeo.org/osgeo4w/. Regardless of which utility you select, complete the installation instructions that accompany the application on the vendor's website; when you are done, resume this process.&lt;br /&gt;
&lt;br /&gt;
'''Step 2: Convert Shapefiles to GeoJSON WGS84'''&lt;br /&gt;
&lt;br /&gt;
While there are a great many GIS data formats available, the most common is the ESRI Shapefile (known by the *.shp extension). In reality, this is not a single file but an archived collection of related files that together describe a specific geographic space and include all the associated data the shapefile represents. To convert the shapefile, launch the GDAL command prompt, navigate to the directory your shapefile is located in, and execute the following command:&lt;br /&gt;
&lt;br /&gt;
&amp;gt; ogr2ogr -f &amp;quot;GeoJSON&amp;quot; -t_srs &amp;quot;WGS84&amp;quot; OUTPUT_FILENAME.json INPUT_FILENAME.shp&lt;br /&gt;
&lt;br /&gt;
This single command converts the data into geoJSON format and alters the map projection to WGS84 in one pass. If your shapefile has more detailed resolution than is necessary for the map zoom level at which you are viewing it, you can alter the GDAL command to limit the coordinate precision included in the geoJSON file as follows:&lt;br /&gt;
&lt;br /&gt;
&amp;gt; ogr2ogr -f &amp;quot;GeoJSON&amp;quot; -lco COORDINATE_PRECISION=3 -t_srs &amp;quot;WGS84&amp;quot; OUTPUT_FILENAME.json INPUT_FILENAME.shp&lt;br /&gt;
&lt;br /&gt;
Note: This example limits coordinate output to 3 decimal places, but any whole number can be used here.&lt;br /&gt;
&lt;br /&gt;
This can have dramatic effects on both the output file size and the visual appearance of the geodata on the map, so some testing may be required to find the desired balance between optimization and display quality.&lt;br /&gt;
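What COORDINATE_PRECISION does can be approximated in a few lines of plain JavaScript. This sketch (the helper name and sample coordinates are ours, not part of GDAL or Aquiferium) rounds every coordinate in a nested GeoJSON coordinate array to 3 decimal places:

```javascript
// Round every number in a (possibly nested) GeoJSON coordinate array to
// a fixed number of decimal places, mimicking ogr2ogr's
// -lco COORDINATE_PRECISION layer-creation option.
function roundCoords(coords, places) {
  const f = Math.pow(10, places);
  return coords.map(c =>
    Array.isArray(c) ? roundCoords(c, places) : Math.round(c * f) / f
  );
}

// Sample [lon, lat] pairs (illustrative values only).
const ring = [[-98.123456, 29.654321], [-98.120001, 29.650009]];
console.log(roundCoords(ring, 3)); // → [ [ -98.123, 29.654 ], [ -98.12, 29.65 ] ]
```

Three decimal places of longitude/latitude is roughly 100 m of precision at the equator, which is why the trade-off between file size and display quality noted above depends on the zoom level in use.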
&lt;br /&gt;
'''Step 3: Add New GeoJSON to the Application'''&lt;br /&gt;
&lt;br /&gt;
To ensure that the new geoJSON is available to the MCSDSS for display, a copy of the geoJSON file needs to be placed in the following directory: &lt;br /&gt;
&lt;br /&gt;
//eaa-aquiferium/app/data/geojson/&amp;lt;optional_subfolder&amp;gt;/NEW_GEOJSON_FILE.json&lt;br /&gt;
&lt;br /&gt;
'''Step 4: Add New GeoJSON to the Map Directive'''&lt;br /&gt;
&lt;br /&gt;
To make the new GeoJSON file accessible to the users of the MCSDSS, it must now be added into the mapping Directive’s codebase. This will require the insertion of several new lines of code in specific locations as detailed in the following section.&lt;br /&gt;
&lt;br /&gt;
Note: Any hexadecimal colors specified can be changed to the developer’s preference.&lt;br /&gt;
&lt;br /&gt;
On line 150 insert the following code:&lt;br /&gt;
&lt;br /&gt;
var newGeojson_Style = { 'fillColor': '#F3F' };&lt;br /&gt;
&lt;br /&gt;
var newGeojson_StyleHover = { 'fillColor': '#3F3' };&lt;br /&gt;
&lt;br /&gt;
On line 185 insert the following code:&lt;br /&gt;
&lt;br /&gt;
var newGeojson_Geojson = './data/geojson/&amp;lt;optional_subfolder&amp;gt;/NEW_GEOJSON_FILE.json';&lt;br /&gt;
&lt;br /&gt;
On line 211 insert the following code: &lt;br /&gt;
&lt;br /&gt;
var newGeojson_Layer = new L.LayerGroup();&lt;br /&gt;
&lt;br /&gt;
On line 328 insert the following code:&lt;br /&gt;
&lt;br /&gt;
$.getJSON(newGeojson_Geojson, function(data) {&lt;br /&gt;
&lt;br /&gt;
 processGeojson(data, newGeojson_Layer, newGeojson_Style, newGeojson_StyleHover);&lt;br /&gt;
&lt;br /&gt;
});&lt;br /&gt;
&lt;br /&gt;
Note: Additional interactivity can be added to the GeoJSON layer in the above function, but that process is beyond the scope of this document.&lt;br /&gt;
&lt;br /&gt;
On line 541 insert the following code:&lt;br /&gt;
&lt;br /&gt;
'New GeoJSON': newGeojson_Layer,&lt;br /&gt;
&lt;br /&gt;
Note: You should replace all instances of 'newGeojson' in the code above with a unique name that describes your geodata in some identifiable way (e.g. supportRegions). That concludes the code changes required to extend the geodata available for display in Aquiferium. To complete this process, the application will need to be recompiled; the procedure for recompiling the application is detailed later in this document.&lt;br /&gt;
&lt;br /&gt;
[[Adding New Map Markers]]&lt;br /&gt;
&lt;br /&gt;
If additional markers are desired on the map interface, they can easily be added to the Directive by inserting the following code at line 456:&lt;br /&gt;
&lt;br /&gt;
var newMarkerLocation = L.latLng(LAT_VAL, LON_VAL);&lt;br /&gt;
&lt;br /&gt;
 var newMarkerOptions = { title: 'New Marker Title' };&lt;br /&gt;
&lt;br /&gt;
 var newMarker = L.marker(newMarkerLocation, newMarkerOptions);&lt;br /&gt;
&lt;br /&gt;
 newMarker.addTo(allMarkersLayer); &lt;br /&gt;
&lt;br /&gt;
 var newMarkerPopupContent = '&amp;lt;h2&amp;gt;New Marker Title&amp;lt;/h2&amp;gt;';&lt;br /&gt;
&lt;br /&gt;
 var newMarkerContentContainer = $('&amp;lt;div /&amp;gt;');&lt;br /&gt;
&lt;br /&gt;
 newMarkerContentContainer.html(newMarkerPopupContent);&lt;br /&gt;
&lt;br /&gt;
 newMarker.bindPopup(newMarkerContentContainer[0]);&lt;br /&gt;
&lt;br /&gt;
Note: You should replace all instances of 'newMarker' in the code above with a unique name that describes your marker in some identifiable way (e.g. hqOfficesMarker).&lt;br /&gt;
&lt;br /&gt;
That concludes the code changes required to extend the map markers available for display in MCSDSS. To complete this process, the application will now need to be recompiled. The procedure for recompiling the application will be detailed later in this document.&lt;br /&gt;
&lt;br /&gt;
[[Recompiling the Codebase]]&lt;br /&gt;
&lt;br /&gt;
The process of compiling the Aquiferium is very similar to the process for running it during development. The complexities are again handled by the gruntfile.js taskrunner, which prepares the codebase for live deployment rather than local development. Execute the following command from the ./eaa-aquiferium/ directory:&lt;br /&gt;
&lt;br /&gt;
&amp;gt; grunt build --force&lt;br /&gt;
&lt;br /&gt;
This will initiate the build process. The output will be logged to the console window for observation, or for debugging should there be any errors during the process. Once the build process is complete, you will find the final version of the project (the set of files that needs to be uploaded to your web server or copied into your local web server for offline use) in the folder located at ./eaa-aquiferium/dist. The build process has several known bugs that must be corrected manually, as described in the following steps.&lt;br /&gt;
&lt;br /&gt;
'''Step 1: Fix Missing Dependencies''' &lt;br /&gt;
&lt;br /&gt;
Open the index file for the distribution build (located at ./eaa-aquiferium/dist/index.html) and insert the following code into the HTML right after the comment reading '&amp;lt;!-- Place favicon.ico and apple-touch-icon.png in the root directory. --&amp;gt;' but before the subsequent opening &amp;lt;link&amp;gt; tag:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;link href='http://fonts.googleapis.com/css?family=Open+Sans:400,300,700' rel='stylesheet' type='text/css'&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;link href='http://fonts.googleapis.com/css?family=Open+Sans+Condensed:300,700' rel='stylesheet' type='text/css'&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;link href='http://fonts.googleapis.com/css?family=Ubuntu:400,500,700' rel='stylesheet' type='text/css'&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;link href='http://fonts.googleapis.com/css?family=Ubuntu+Condensed' rel='stylesheet' type='text/css'&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;!-- Latest compiled and minified CSS --&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;link rel=&amp;quot;stylesheet&amp;quot; href=&amp;quot;https://maxcdn.bootstrapcdn.com/bootstrap/3.3.1/css/bootstrap.min.css&amp;quot;&amp;gt;&lt;br /&gt;
&lt;br /&gt;
'''Step 2: Fix Missing CSS'''&lt;br /&gt;
&lt;br /&gt;
Ensure that the following style is appended to the very end of the file located at ./eaa-aquiferium/dist/styles/{hash}.main.css:&lt;br /&gt;
&lt;br /&gt;
 .glyph-spacing-right{margin-right:0.5rem;}&lt;br /&gt;
&lt;br /&gt;
'''Step 3: Replace Missing Images'''&lt;br /&gt;
&lt;br /&gt;
In the ./eaa-aquiferium/dist/styles folder, create a new folder named images. Navigate to the folder ./eaa-aquiferium/app/bower_components/leaflet/dist/images. Copy all image files from this location into the images folder you previously created (./eaa-aquiferium/dist/styles/images).&lt;br /&gt;
&lt;br /&gt;
'''Step 4: Replace Missing Glyphs'''&lt;br /&gt;
&lt;br /&gt;
In the folder ./eaa-aquiferium/dist create the following directory structure: &lt;br /&gt;
&lt;br /&gt;
./eaa-aquiferium/dist/bower_components/bootstrap-sass-official/vendor/assets/fonts/bootstrap/&lt;br /&gt;
&lt;br /&gt;
Now navigate to this directory: ./eaa-aquiferium/app/bower_components/bootstrap-sass-official/vendor/assets/fonts/bootstrap/ and copy all the files inside (all named glyphicons-halflings-regular.*) into the last folder you just created in the above directory structure (bootstrap).&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
'''Step 5: Update Broken Paths'''&lt;br /&gt;
&lt;br /&gt;
The image, video, and data asset paths will be incorrect after the initial build. A bug in the way the task runner updates paths results in incorrect paths to these resources. Correct them with a manual find &amp;amp; replace on the following files using the specified values.&lt;br /&gt;
&lt;br /&gt;
[[Video &amp;amp; Image Assets]]&lt;br /&gt;
&lt;br /&gt;
Find and replace all instances of these string values:&lt;br /&gt;
&lt;br /&gt;
'../videos/' with './videos/'&lt;br /&gt;
&lt;br /&gt;
'/images/' with '../images/'&lt;br /&gt;
&lt;br /&gt;
In these HTML files:&lt;br /&gt;
&lt;br /&gt;
./eaa-aquiferium/dist/geography.html&lt;br /&gt;
&lt;br /&gt;
./eaa-aquiferium/dist/geology.html&lt;br /&gt;
&lt;br /&gt;
./eaa-aquiferium/dist/springs.html&lt;br /&gt;
&lt;br /&gt;
./eaa-aquiferium/dist/conservation.html&lt;br /&gt;
&lt;br /&gt;
And in this CSS file: ./eaa-aquiferium/dist/styles/{hash}.main.css. Additionally, you will need to copy the image and video folders (and all their contents) into the ./eaa-aquiferium/dist folder to address a bug in the renaming of files during the build process.&lt;br /&gt;
&lt;br /&gt;
[[Data Assets]]&lt;br /&gt;
&lt;br /&gt;
Find and replace all instances of the string value ‘../../data/’ with ‘./data/’ in the file ./eaa-aquiferium/dist/scripts/{hash}.scripts.js&lt;br /&gt;
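If you prefer to script these replacements rather than doing them by hand, a minimal sketch of the Step 5 edits might look like the following. The fixAssetPaths name and the sample string are illustrative only, not part of the Aquiferium build tooling:&lt;br /&gt;
&lt;br /&gt;
```javascript
// Sketch of the Step 5 find-and-replace, expressed as a function so the
// manual edits can be automated or verified. fixAssetPaths is an
// illustrative name, not part of the Aquiferium build.
function fixAssetPaths(source) {
  return source
    .split('../videos/').join('./videos/')   // video paths (HTML/CSS files)
    .split('../../data/').join('./data/')    // data paths (scripts file)
    .split('/images/').join('../images/');   // image paths (HTML/CSS files)
}

// Example on a small sample string:
var sampleCss = "url('/images/logo.png') url('../videos/intro.mp4')";
var fixedCss = fixAssetPaths(sampleCss);
```
&lt;br /&gt;
Running such a script over the listed HTML, CSS, and script files would apply all three substitutions in one pass.&lt;br /&gt;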
&lt;br /&gt;
'''Step 6: Validate Build Locally'''&lt;br /&gt;
&lt;br /&gt;
It is recommended you copy the fully updated distribution code to a local web server (Uniform Server or another solution) and test the application outside the development environment. Any missing assets or other errors will be output to the browser console for further debugging if necessary. Once you are certain that your new build is ready for deployment, upload the distribution files to your web server and test the application again online. You should now have your recompiled application running and be able to see the changes you have made or other extensions you have added.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== Additional Development ==&lt;br /&gt;
&lt;br /&gt;
This document has covered all the basic steps involved in setting up the code base, making changes to the code base, and recompiling the code for live deployment. Additional capabilities are beyond the scope of this document and should be considered as another phase in the project’s development lifecycle.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&amp;lt;!-- Add any wiki Text above this Line --&amp;gt;&lt;br /&gt;
&amp;lt;!-- Do NOT Edit below this Line --&amp;gt;&lt;br /&gt;
{{#set:&lt;br /&gt;
	Expertise=Open_science|&lt;br /&gt;
	Expertise=Geosciences|&lt;br /&gt;
	Owner=Suzanne_Pierce|&lt;br /&gt;
	Progress=80|&lt;br /&gt;
	StartDate=2015-03-21|&lt;br /&gt;
	TargetDate=2015-04-03|&lt;br /&gt;
	Type=Low}}&lt;/div&gt;</summary>
		<author><name>Suzanne</name></author>	</entry>

	<entry>
		<id>https://www.organicdatascience.org/gpf/index.php?title=Suzanne_Pierce_should_make_software_executable_by_others&amp;diff=11713</id>
		<title>Suzanne Pierce should make software executable by others</title>
		<link rel="alternate" type="text/html" href="https://www.organicdatascience.org/gpf/index.php?title=Suzanne_Pierce_should_make_software_executable_by_others&amp;diff=11713"/>
				<updated>2015-04-03T18:42:31Z</updated>
		
		<summary type="html">&lt;p&gt;Suzanne: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;[[Category:Task]]&lt;br /&gt;
&amp;lt;br/&amp;gt;&amp;lt;b&amp;gt;Details on how to do this task:&amp;lt;/b&amp;gt; [[Make software executable by others]]&amp;lt;br/&amp;gt;&amp;lt;br/&amp;gt;&lt;br /&gt;
Installing, Using, and Extending MCSDSS&lt;br /&gt;
&lt;br /&gt;
The MCSDSS has been developed such that it is relatively easy to update the data used by the current design of the data-interactive components and the visual content. However, extending the application by adding new data to the map or wiring up additional map tile services is a somewhat more complicated procedure that requires some knowledge of JavaScript and current web development workflows. Familiarity with the AngularJS framework is also beneficial but not strictly required. The MCSDSS application must be recompiled after any changes are made to the code base. This document gives step-by-step instructions for extending the mapping services and adding additional geodata, and points to third-party resources that describe further modifications to the map capabilities in detail. To get started, launch your IDE or text editor of choice and open the file located at //eaa-aquiferium/app/scripts/directives/leaflet-directive.js.&lt;br /&gt;
&lt;br /&gt;
This file contains the code for the entirety of the mapping capability within Aquiferium. It is an Angular directive, which is essentially a way for JavaScript developers to create new or customized components that can be added to a web application using simple HTML tag syntax. In this case, if you open the file located at //eaa-aquiferium/app/views/geography.html and look at line 3, you will see the following HTML code:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;leaflet-map id=&amp;quot;map&amp;quot; class=&amp;quot;div-fixed z-100&amp;quot; display-recharge-panel=&amp;quot;displayRechargePanel()&amp;quot; display-wells-panel=&amp;quot;displayWellsPanel()&amp;quot; display-springs-panel=&amp;quot;displaySpringsPanel()&amp;quot;&amp;gt;&amp;lt;/leaflet-map&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Through the use of the Angular directive, any web app developer with access to your directive code file can use it in their own project: simply include it in the framework as a dependency and use the above HTML code snippet alongside the appropriate CSS code. In fact, most of the code snippet is telling the Angular framework how to respond to links specific to the Aquiferium. Therefore, a developer who only wanted to utilize the mapping capability could use the following simplified code snippet:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;leaflet-map id=&amp;quot;map&amp;quot;&amp;gt;&amp;lt;/leaflet-map&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Note: All the above code snippets require the CSS code below to also be present within the web application (the web developer can customize the CSS to their needs):&lt;br /&gt;
&lt;br /&gt;
#map {&lt;br /&gt;
&lt;br /&gt;
 width: 100%;&lt;br /&gt;
&lt;br /&gt;
 height: 100%;&lt;br /&gt;
&lt;br /&gt;
 top: 5%; /* Offsets navigation bar */&lt;br /&gt;
&lt;br /&gt;
 right: 0;&lt;br /&gt;
&lt;br /&gt;
 bottom: 0;&lt;br /&gt;
&lt;br /&gt;
 left: 0;&lt;br /&gt;
&lt;br /&gt;
}&lt;br /&gt;
&lt;br /&gt;
The inclusion of these three pieces (the directive file, the HTML code, and the corresponding CSS code) is all that is required to repurpose the mapping capabilities of Aquiferium in any other AngularJS application.&lt;br /&gt;
&lt;br /&gt;
'''Preparing the Codebase for Development'''&lt;br /&gt;
&lt;br /&gt;
The process for recompiling the MCSDSS codebase is relatively straightforward, but it requires downloading the entire codebase, including all assets used by the system (these are currently bundled into the code base) and any new or bundled assets (in the case of new geodata) required by the extensions being made. You will also need to set up the appropriate development environment to compile the application. The Aquiferium codebase requires the installation of Ruby, NodeJS, Compass, and Yeoman, as well as the project dependencies that we will install during the compilation process. The following steps detail how to configure the development environment for compilation of the code base.&lt;br /&gt;
&lt;br /&gt;
'''Step 1: Install Ruby, Compass, NodeJS, and Yeoman'''&lt;br /&gt;
&lt;br /&gt;
Links to these resources are included in the MCSDSS Technology Stack, where you can acquire the necessary installation files and instructions. NodeJS must be installed before Compass or Yeoman because the npm package-management utility used to install them is included with NodeJS.&lt;br /&gt;
&lt;br /&gt;
The recommended installation sequence is as follows: &lt;br /&gt;
&lt;br /&gt;
1. Ruby&lt;br /&gt;
&lt;br /&gt;
2. NodeJS + NPM&lt;br /&gt;
&lt;br /&gt;
3. Compass&lt;br /&gt;
&lt;br /&gt;
4. Yeoman&lt;br /&gt;
&lt;br /&gt;
Both Ruby and NodeJS + NPM ship with straightforward installers that handle the process for you. During the installation of Ruby, it is recommended you select the 'add Ruby to your System Path' option. Once both Ruby and NodeJS are installed, open a command prompt and execute the following commands to prepare the system and install Compass:&lt;br /&gt;
&lt;br /&gt;
&amp;gt; gem update --system&lt;br /&gt;
&lt;br /&gt;
&amp;gt; gem install compass&lt;br /&gt;
&lt;br /&gt;
&amp;gt; npm install compass&lt;br /&gt;
&lt;br /&gt;
Note: There is a known SSL issue with updating ruby gems in order to install the compass gem properly. See this guide for the solution to that error should you encounter it: &lt;br /&gt;
&lt;br /&gt;
https://gist.github.com/luislavena/f064211759ee0f806c88&lt;br /&gt;
&lt;br /&gt;
'''Step 2: Install Code Dependencies and Packages'''&lt;br /&gt;
&lt;br /&gt;
Once you have completed the installation of the utilities required for the development workflow, you need to clone a copy of the online repository to your local system. The clone should result in a directory named eaa-aquiferium wherever you chose to clone the repo. You should then navigate to the root directory of the code base (./eaa-aquiferium) and open a command prompt in this location.&lt;br /&gt;
&lt;br /&gt;
On the command line, type the following command to install all Node packages required by the Aquiferium:&lt;br /&gt;
&lt;br /&gt;
&amp;gt; npm install&lt;br /&gt;
&lt;br /&gt;
Note: This process may take several minutes, as Aquiferium has a large number of package dependencies to download and install. It is also a good idea to run an update on all the packages to ensure they are current with the latest or specified versions required by the project. To do this, run the following command:&lt;br /&gt;
&lt;br /&gt;
&amp;gt; npm update&lt;br /&gt;
&lt;br /&gt;
Now we need to install all the library dependencies required by MCSDSS. To do this, type the following command:&lt;br /&gt;
&lt;br /&gt;
&amp;gt; bower install&lt;br /&gt;
&lt;br /&gt;
If the bower install command exits without creating and installing the dependencies in the folder located at ./eaa-aquiferium/app/bower_components, check the console output to see if you have received an ENOENT ERROR for any of the packages you were trying to install. If there is an error, it will most likely be the last one listed in the console. If so, open the file located at ./eaa-aquiferium/package.json and look for the entry matching the package that gave the ENOENT ERROR. Delete it from the file (ensuring that the empty line is also removed, and that any trailing commas are removed if this changes the last entry in the &amp;quot;dependencies&amp;quot;: {} object). Make sure you have taken note of which dependency had to be removed in case it needs to be reinstalled with npm or bower at a later time. Once you have completed installing the bower dependencies, run the following command from inside ./eaa-aquiferium to execute the gruntfile.js taskrunner and ensure the project is fully set up and ready for active local development:&lt;br /&gt;
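The package.json cleanup described above can be sketched in a few lines: removing the entry by reparsing and reserializing the JSON sidesteps the trailing-comma problem entirely. The removeDependency name and the 'broken-package' entry below are illustrative only:&lt;br /&gt;
&lt;br /&gt;
```javascript
// Sketch of the manual package.json fix: remove the dependency that
// triggered the ENOENT error. Parsing and reserializing guarantees no
// dangling commas or empty lines are left behind.
function removeDependency(packageJsonText, name) {
  var pkg = JSON.parse(packageJsonText);
  if (pkg.dependencies) {
    delete pkg.dependencies[name];
  }
  return JSON.stringify(pkg, null, 2);
}

// Example: drop a hypothetical 'broken-package' entry.
var before = JSON.stringify({
  name: 'eaa-aquiferium',
  dependencies: { grunt: '^0.4.5', 'broken-package': '1.0.0' }
});
var after = removeDependency(before, 'broken-package');
```
&lt;br /&gt;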
&lt;br /&gt;
&amp;gt; grunt serve&lt;br /&gt;
&lt;br /&gt;
The gruntfile.js taskrunner completes a myriad of operations on the code base that prepare it for deployment, then launches a local web server with the application running inside it for local development and testing. At this point, you should be able to use the Aquiferium exactly like the deployed online version. You can now begin updating or extending the application as described in the following sections.&lt;br /&gt;
&lt;br /&gt;
[[Adding New Map Tile Services]]&lt;br /&gt;
&lt;br /&gt;
Aquiferium has the most popular map tiling services already included, with the exception of Google Maps. Due to the complexity of integrating Google Maps into the Aquiferium application, and the fact that its functionality as a tile service is duplicated by Open Street Maps, it was determined that the task of code integration was not worth the effort.&lt;br /&gt;
&lt;br /&gt;
Should the inclusion of an additional map tiling service be necessary, the following code examples demonstrate how to extend the map tiling services available within Aquiferium.&lt;br /&gt;
&lt;br /&gt;
On line 85, insert the following code:&lt;br /&gt;
&lt;br /&gt;
var newMapService_Link = '&amp;lt;a href=&amp;quot;http://www.newmapservice.com/&amp;quot;&amp;gt;New Map Service&amp;lt;/a&amp;gt;';&lt;br /&gt;
&lt;br /&gt;
var newMapService_Url = 'http://newMapTileServer/tile/{z}/{y}/{x}';&lt;br /&gt;
&lt;br /&gt;
var newMapService_Attrib = '&amp;amp;copy; ' + newMapService_Link;&lt;br /&gt;
&lt;br /&gt;
var newMapService_Map = L.tileLayer(newMapService_Url, {&lt;br /&gt;
&lt;br /&gt;
 attribution: newMapService_Attrib&lt;br /&gt;
&lt;br /&gt;
});&lt;br /&gt;
&lt;br /&gt;
On line 520 the baseLayers object for the map is defined. Insert the following code inside this object:&lt;br /&gt;
&lt;br /&gt;
'newMapService_Map': newMapService_Map,&lt;br /&gt;
&lt;br /&gt;
Note: Take care to include a trailing comma after this snippet if it is not the last map service listed in the baseLayers object, and conversely to omit the trailing comma if it is the last service listed.&lt;br /&gt;
&lt;br /&gt;
Note: The order in which the services are listed here is the order they will appear on the map for selection.&lt;br /&gt;
&lt;br /&gt;
Note: You should replace all instances of 'newMapService' in the code above with a unique name that describes the map service in some identifiable way (e.g. customMapboxTiles).&lt;br /&gt;
&lt;br /&gt;
That concludes the code changes required to extend the map tiling services available for display in Aquiferium. The application must be recompiled to reflect the changes and complete the process. The procedure for recompiling the application is detailed later in this document.&lt;br /&gt;
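For context on the newMapService_Url template used above: Leaflet substitutes the current zoom level and tile coordinates for the {z}, {y}, and {x} placeholders each time it requests a tile. The helper below mimics that substitution so the template format is concrete; it is an illustration, not Leaflet's actual implementation:&lt;br /&gt;
&lt;br /&gt;
```javascript
// Illustrative sketch of how a tile URL template is expanded: the {z}/{y}/{x}
// placeholders are replaced with the zoom level and tile coordinates.
// This mimics what Leaflet does internally when fetching tiles.
function expandTileUrl(template, coords) {
  return template
    .split('{z}').join(String(coords.z))
    .split('{y}').join(String(coords.y))
    .split('{x}').join(String(coords.x));
}

// Example: one tile at zoom 8 from the hypothetical service above.
var tileUrl = expandTileUrl('http://newMapTileServer/tile/{z}/{y}/{x}', { z: 8, y: 105, x: 59 });
```
&lt;br /&gt;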
&lt;br /&gt;
[[Adding New Geodata]]&lt;br /&gt;
&lt;br /&gt;
In order to add additional geodata layers (in this case, GeoJSON data), you will need to duplicate and modify code within the directive in several places. Additionally, you will need to already have the GeoJSON data you want to include on the map, preferably optimized for web use and converted to the WGS84 projection.&lt;br /&gt;
&lt;br /&gt;
[[Instructions for the conversion process are as follows.]]&lt;br /&gt;
&lt;br /&gt;
Note: If additional shapefile optimization is required, that process can be found described at this URL: &lt;br /&gt;
&lt;br /&gt;
http://blog.thematicmapping.org/2012/10/mapping-regions-of-new-zealand-with.html&lt;br /&gt;
&lt;br /&gt;
'''Step 1: GDAL Installation'''&lt;br /&gt;
&lt;br /&gt;
In order to convert most geodata files into geoJSON you will need to use the GDAL utility. GDAL stands for Geospatial Data Abstraction Library and is a suite of tools for manipulating, editing, and transforming geodata into and between almost any formats used by GIS systems today. More information about GDAL’s capabilities can be found at this URL: http://www.gdal.org/index.html. GDAL can be downloaded by itself at the following URL: http://trac.osgeo.org/gdal/wiki/DownloadingGdalBinaries. If you are planning on doing regular manipulation of geodata files and are working in a Windows environment, it is recommended instead that you download and install the OSGeo4W suite of tools, which includes GDAL, from this URL: &lt;br /&gt;
&lt;br /&gt;
http://trac.osgeo.org/osgeo4w/ . Regardless of which utility you select, complete the installation instructions that accompany the application on the vendor’s website and when you are done, resume this process.&lt;br /&gt;
&lt;br /&gt;
'''Step 2: Convert Shapefiles to GeoJSON WGS84'''&lt;br /&gt;
&lt;br /&gt;
While there are a great many GIS data formats available, the most common one is the ESRI Shapefile (known by the *.shp extension). In reality, this is not a single file but an archived collection of related files that together describe a specific geographic space and encapsulate all the associated data the shapefile represents. To convert the shapefile, you will need to launch the GDAL command prompt, navigate to the directory your shapefile is located in, and execute the following command:&lt;br /&gt;
&lt;br /&gt;
&amp;gt; ogr2ogr -f &amp;quot;GeoJSON&amp;quot; -t_srs &amp;quot;WGS84&amp;quot; OUTPUT_FILENAME.json INPUT_FILENAME.shp&lt;br /&gt;
&lt;br /&gt;
This single command converts the data into GeoJSON format and alters the map projection to WGS84 in one pass. If your shapefile has more detailed resolution data than is necessary for the map zoom level you are viewing it at, you can alter the GDAL command to limit the coordinate precision included in the GeoJSON file as follows:&lt;br /&gt;
&lt;br /&gt;
&amp;gt; ogr2ogr -f &amp;quot;GeoJSON&amp;quot; -lco COORDINATE_PRECISION=3 -t_srs &amp;quot;WGS84&amp;quot; OUTPUT_FILENAME.json INPUT_FILENAME.shp&lt;br /&gt;
&lt;br /&gt;
Note: This example limits coordinate output to 3 decimal places, but any whole number can be used here.&lt;br /&gt;
&lt;br /&gt;
This can have a dramatic effect on both the output file size and the visual appearance of the geodata on the map. Therefore, some testing may be required to find the desired balance between optimization and display quality.&lt;br /&gt;
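As an illustration of what COORDINATE_PRECISION does, the sketch below rounds a ring of coordinates to a fixed number of decimal places, the same kind of truncation ogr2ogr applies during conversion. The roundCoords name and the sample coordinates are illustrative only:&lt;br /&gt;
&lt;br /&gt;
```javascript
// Illustrative sketch of the effect of -lco COORDINATE_PRECISION=3:
// every coordinate is limited to a fixed number of decimal places, which
// shrinks the output file at the cost of positional detail. The real
// truncation happens inside ogr2ogr; this just demonstrates the idea.
function roundCoords(ring, places) {
  var factor = Math.pow(10, places);
  return ring.map(function (pair) {
    return pair.map(function (value) {
      return Math.round(value * factor) / factor;
    });
  });
}

// Example: a two-point ring rounded to 3 decimal places.
var ring = [[-98.491142, 29.424349], [-98.489731, 29.426112]];
var rounded = roundCoords(ring, 3);
```
&lt;br /&gt;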
&lt;br /&gt;
'''Step 3: Add New GeoJSON to the Application'''&lt;br /&gt;
&lt;br /&gt;
To ensure that the new geoJSON is available to the MCSDSS for display, a copy of the geoJSON file needs to be placed in the following directory: &lt;br /&gt;
&lt;br /&gt;
//eaa-aquiferium/app/data/geojson/&amp;lt;optional_subfolder&amp;gt;/NEW_GEOJSON_FILE.json&lt;br /&gt;
&lt;br /&gt;
'''Step 4: Add New GeoJSON to the Map Directive'''&lt;br /&gt;
&lt;br /&gt;
To make the new GeoJSON file accessible to the users of the MCSDSS, it must now be added into the mapping Directive’s codebase. This will require the insertion of several new lines of code in specific locations as detailed in the following section.&lt;br /&gt;
&lt;br /&gt;
Note: Any hexadecimal colors specified can be changed to the developer’s preference.&lt;br /&gt;
&lt;br /&gt;
On line 150 insert the following code:&lt;br /&gt;
&lt;br /&gt;
var newGeojson_Style = { 'fillColor': '#F3F' };&lt;br /&gt;
&lt;br /&gt;
var newGeojson_StyleHover = { 'fillColor': '#3F3' };&lt;br /&gt;
&lt;br /&gt;
On line 185 insert the following code:&lt;br /&gt;
&lt;br /&gt;
var newGeojson_Geojson = './data/geojson/&amp;lt;optional_subfolder&amp;gt;/NEW_GEOJSON_FILE.json';&lt;br /&gt;
&lt;br /&gt;
On line 211 insert the following code: &lt;br /&gt;
&lt;br /&gt;
var newGeojson_Layer = new L.LayerGroup();&lt;br /&gt;
&lt;br /&gt;
On line 328 insert the following code:&lt;br /&gt;
&lt;br /&gt;
$.getJSON(newGeojson_Geojson, function(data) {&lt;br /&gt;
&lt;br /&gt;
 processGeojson(data, newGeojson_Layer, newGeojson_Style, newGeojson_StyleHover);&lt;br /&gt;
&lt;br /&gt;
 });&lt;br /&gt;
&lt;br /&gt;
Note: Additional interactivity can be added to the GeoJSON layer in the above function, but that process is beyond the scope of this document.&lt;br /&gt;
&lt;br /&gt;
On line 541 insert the following code:&lt;br /&gt;
&lt;br /&gt;
'New GeoJSON': newGeojson_Layer,&lt;br /&gt;
&lt;br /&gt;
Note: You should replace all instances of 'newGeojson' in the code above with a unique name that describes your geodata in some identifiable way (e.g. supportRegions_Geojson). That concludes the code changes required to extend the geodata available for display in Aquiferium. To complete this process, the application will now need to be recompiled. The procedure for recompiling the application will be detailed later in this document.&lt;br /&gt;
&lt;br /&gt;
[[Adding New Map Markers]]&lt;br /&gt;
&lt;br /&gt;
If additional markers are desired on the map interface, they can easily be added to the Directive by inserting the following code at line 456:&lt;br /&gt;
&lt;br /&gt;
var newMarkerLocation = L.latLng(LAT_VAL, LON_VAL);&lt;br /&gt;
&lt;br /&gt;
 var newMarkerOptions = { title: 'New Marker Title' };&lt;br /&gt;
&lt;br /&gt;
 var newMarker = L.marker(newMarkerLocation, newMarkerOptions);&lt;br /&gt;
&lt;br /&gt;
 newMarker.addTo(allMarkersLayer); &lt;br /&gt;
&lt;br /&gt;
 var newMarkerPopupContent = '&amp;lt;h2&amp;gt;New Marker Title&amp;lt;/h2&amp;gt;';&lt;br /&gt;
&lt;br /&gt;
 var newMarkerContentContainer = $('&amp;lt;div /&amp;gt;');&lt;br /&gt;
&lt;br /&gt;
 newMarkerContentContainer.html(newMarkerPopupContent);&lt;br /&gt;
&lt;br /&gt;
 newMarker.bindPopup(newMarkerContentContainer[0]);&lt;br /&gt;
&lt;br /&gt;
Note: You should replace all instances of 'newMarker' in the code above with a unique name that describes your marker in some identifiable way (e.g. hqOfficesMarker).&lt;br /&gt;
&lt;br /&gt;
That concludes the code changes required to extend the map markers available for display in MCSDSS. To complete this process, the application will now need to be recompiled. The procedure for recompiling the application will be detailed later in this document.&lt;br /&gt;
&lt;br /&gt;
[[Recompiling the Codebase]]&lt;br /&gt;
&lt;br /&gt;
The process of compiling the Aquiferium is very similar to the process for running it during development. The complexities are again handled by the gruntfile.js taskrunner, which prepares the codebase for live deployment rather than local development. Execute the following command from the ./eaa-aquiferium/ directory:&lt;br /&gt;
&lt;br /&gt;
&amp;gt; grunt build --force&lt;br /&gt;
&lt;br /&gt;
This will initiate the build process. The output will be logged to the console window for observation, or for debugging should there be any errors during the process. Once the build process is complete, you will find the final version of the project (the set of files that needs to be uploaded to your web server or copied into your local web server for offline use) in the folder located at ./eaa-aquiferium/dist. The build process has several known bugs that must be corrected manually, as described in the following steps.&lt;br /&gt;
&lt;br /&gt;
'''Step 1: Fix Missing Dependencies''' &lt;br /&gt;
&lt;br /&gt;
Open the index file for the distribution build (located at ./eaa-aquiferium/dist/index.html) and insert the following code into the HTML right after the comment reading '&amp;lt;!-- Place favicon.ico and apple-touch-icon.png in the root directory. --&amp;gt;' but before the subsequent opening &amp;lt;link&amp;gt; tag:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;link href='http://fonts.googleapis.com/css?family=Open+Sans:400,300,700' rel='stylesheet' type='text/css'&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;link href='http://fonts.googleapis.com/css?family=Open+Sans+Condensed:300,700' rel='stylesheet' type='text/css'&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;link href='http://fonts.googleapis.com/css?family=Ubuntu:400,500,700' rel='stylesheet' type='text/css'&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;link href='http://fonts.googleapis.com/css?family=Ubuntu+Condensed' rel='stylesheet' type='text/css'&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;!-- Latest compiled and minified CSS --&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;link rel=&amp;quot;stylesheet&amp;quot; href=&amp;quot;https://maxcdn.bootstrapcdn.com/bootstrap/3.3.1/css/bootstrap.min.css&amp;quot;&amp;gt;&lt;br /&gt;
&lt;br /&gt;
'''Step 2: Fix Missing CSS'''&lt;br /&gt;
&lt;br /&gt;
Ensure that the following style is appended to the very end of the file located at ./eaa-aquiferium/dist/styles/{hash}.main.css:&lt;br /&gt;
&lt;br /&gt;
 .glyph-spacing-right{margin-right:0.5rem;}&lt;br /&gt;
&lt;br /&gt;
'''Step 3: Replace Missing Images'''&lt;br /&gt;
&lt;br /&gt;
In the ./eaa-aquiferium/dist/styles folder, create a new folder named images. Navigate to the folder ./eaa-aquiferium/app/bower_components/leaflet/dist/images. Copy all image files from this location into the images folder you previously created (./eaa-aquiferium/dist/styles/images).&lt;br /&gt;
&lt;br /&gt;
'''Step 4: Replace Missing Glyphs'''&lt;br /&gt;
&lt;br /&gt;
In the folder ./eaa-aquiferium/dist create the following directory structure: &lt;br /&gt;
&lt;br /&gt;
./eaa-aquiferium/dist/bower_components/bootstrap-sass-official/vendor/assets/fonts/bootstrap/&lt;br /&gt;
&lt;br /&gt;
Now navigate to this directory: ./eaa-aquiferium/app/bower_components/bootstrap-sass-official/vendor/assets/fonts/bootstrap/ and copy all the files inside (all named glyphicons-halflings-regular.*) into the last folder you just created in the above directory structure (bootstrap).&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
'''Step 5: Update Broken Paths'''&lt;br /&gt;
&lt;br /&gt;
The image, video, and data asset paths will be incorrect after the initial build. A bug in the way the task runner updates paths results in incorrect paths to these resources. Correct them with a manual find &amp;amp; replace on the following files using the specified values.&lt;br /&gt;
&lt;br /&gt;
[[Video &amp;amp; Image Assets]]&lt;br /&gt;
&lt;br /&gt;
Find and replace all instances of these string values:&lt;br /&gt;
&lt;br /&gt;
'../videos/' with './videos/'&lt;br /&gt;
&lt;br /&gt;
'/images/' with '../images/'&lt;br /&gt;
&lt;br /&gt;
In these HTML files:&lt;br /&gt;
&lt;br /&gt;
./eaa-aquiferium/dist/geography.html&lt;br /&gt;
&lt;br /&gt;
./eaa-aquiferium/dist/geology.html&lt;br /&gt;
&lt;br /&gt;
./eaa-aquiferium/dist/springs.html&lt;br /&gt;
&lt;br /&gt;
./eaa-aquiferium/dist/conservation.html&lt;br /&gt;
&lt;br /&gt;
And in this CSS file: ./eaa-aquiferium/dist/styles/{hash}.main.css. Additionally, you will need to copy the image and video folders (and all their contents) into the ./eaa-aquiferium/dist folder to address a bug in the renaming of files during the build process.&lt;br /&gt;
&lt;br /&gt;
[[Data Assets]]&lt;br /&gt;
&lt;br /&gt;
Find and replace all instances of the string value ‘../../data/’ with ‘./data/’ in the file ./eaa-aquiferium/dist/scripts/{hash}.scripts.js&lt;br /&gt;
&lt;br /&gt;
'''Step 6: Validate Build Locally'''&lt;br /&gt;
&lt;br /&gt;
It is recommended you copy the fully updated distribution code to a local web server (Uniform Server or another solution) and test the application outside the development environment. Any missing assets or other errors will be output to the browser console for further debugging if necessary. Once you are certain that your new build is ready for deployment, upload the distribution files to your web server and test the application again online. You should now have your recompiled application running and be able to see the changes you have made or other extensions you have added.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== Additional Development ==&lt;br /&gt;
&lt;br /&gt;
This document has covered all the basic steps involved in setting up the code base, making changes to the code base, and recompiling the code for live deployment. Additional capabilities are beyond the scope of this document and should be considered as another phase in the project’s development lifecycle.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&amp;lt;!-- Add any wiki Text above this Line --&amp;gt;&lt;br /&gt;
&amp;lt;!-- Do NOT Edit below this Line --&amp;gt;&lt;br /&gt;
{{#set:&lt;br /&gt;
	Expertise=Open_science|&lt;br /&gt;
	Expertise=Geosciences|&lt;br /&gt;
	Owner=Suzanne_Pierce|&lt;br /&gt;
	Progress=0|&lt;br /&gt;
	StartDate=2015-03-21|&lt;br /&gt;
	TargetDate=2015-04-03|&lt;br /&gt;
	Type=Low}}&lt;/div&gt;</summary>
		<author><name>Suzanne</name></author>	</entry>

	<entry>
		<id>https://www.organicdatascience.org/gpf/index.php?title=Suzanne_Pierce_should_make_software_executable_by_others&amp;diff=11712</id>
		<title>Suzanne Pierce should make software executable by others</title>
		<link rel="alternate" type="text/html" href="https://www.organicdatascience.org/gpf/index.php?title=Suzanne_Pierce_should_make_software_executable_by_others&amp;diff=11712"/>
				<updated>2015-04-03T18:39:05Z</updated>
		
		<summary type="html">&lt;p&gt;Suzanne: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;[[Category:Task]]&lt;br /&gt;
&amp;lt;br/&amp;gt;&amp;lt;b&amp;gt;Details on how to do this task:&amp;lt;/b&amp;gt; [[Make software executable by others]]&amp;lt;br/&amp;gt;&amp;lt;br/&amp;gt;&lt;br /&gt;
Appendix D: Extending Aquiferium&lt;br /&gt;
&lt;br /&gt;
The Aquiferium has been developed so that it is relatively easy to update the data used by the current data-interactive components and the visual content. Extending the application by adding new data to the map or wiring up additional map tile services is, however, a somewhat more involved procedure that requires some knowledge of JavaScript and current web development workflows. Familiarity with the AngularJS framework is also beneficial but not strictly required. The Aquiferium application must be recompiled after any changes are made to the code base. This document gives step-by-step instructions for extending the mapping services and adding additional geodata, and points to third-party resources that describe further modifications to the map capabilities in detail. To get started, launch your IDE or text editor of choice and open the file located at //eaa-aquiferium/app/scripts/directives/leaflet-directive.js. &lt;br /&gt;
&lt;br /&gt;
This file contains the code for the entirety of the mapping capability within Aquiferium. It is an Angular directive, which is essentially a way for JavaScript developers to create new or customized components that can be added to a web application using simple HTML tag syntax. In this case, if you open the file located at //eaa-aquiferium/app/views/geography.html and look at line 3, you will see the following HTML code:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;leaflet-map id=&amp;quot;map&amp;quot; class=&amp;quot;div-fixed z-100&amp;quot; display-recharge-panel=&amp;quot;displayRechargePanel()&amp;quot; display-wells-panel=&amp;quot;displayWellsPanel()&amp;quot; display-springs-panel=&amp;quot;displaySpringsPanel()&amp;quot;&amp;gt;&amp;lt;/leaflet-map&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Through the use of the Angular directive, any web app developer with access to your directive code file can use it in their project: simply include it in the framework as a dependency and use the above HTML code snippet alongside the appropriate CSS code. In fact, most of the snippet is telling the Angular framework how to respond to links specific to the Aquiferium, so a developer who only wants to use the mapping capability could use the following simplified code snippet:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;leaflet-map id=&amp;quot;map&amp;quot;&amp;gt;&amp;lt;/leaflet-map&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Note: All the above code snippets require the CSS code below to also be present within the web application (the web developer can customize the CSS to their needs):&lt;br /&gt;
&lt;br /&gt;
#map {&lt;br /&gt;
 width: 100%;&lt;br /&gt;
 height: 100%;&lt;br /&gt;
 top: 5%; /* offsets navigation bar */&lt;br /&gt;
 right: 0;&lt;br /&gt;
 bottom: 0;&lt;br /&gt;
 left: 0;&lt;br /&gt;
}&lt;br /&gt;
&lt;br /&gt;
The inclusion of these three pieces (the directive file, the HTML code, and the corresponding CSS code) is all that is required to repurpose the mapping capabilities of Aquiferium in any other AngularJS application.&lt;br /&gt;
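To illustrate the registration pattern described above, here is a minimal, runnable sketch of how an AngularJS 1.x directive like the one in leaflet-directive.js is declared. The 'angular' object here is a tiny stub so the sketch runs outside a browser, and the module name 'aquiferiumApp', the template path, and the directive body are illustrative assumptions, not the actual contents of the file.&lt;br /&gt;

```javascript
// Minimal stand-in for the directive registration pattern described above.
// NOTE: 'angular' here is a tiny stub so the sketch runs outside a browser;
// the module name 'aquiferiumApp' and the directive body are illustrative
// assumptions, not the actual contents of leaflet-directive.js.
var angular = {
  _directives: {},
  module: function (name) {
    return {
      directive: function (dirName, factory) {
        angular._directives[dirName] = factory();
        return this;
      }
    };
  }
};

// Registering 'leafletMap' makes the leaflet-map HTML tag available app-wide:
angular.module('aquiferiumApp').directive('leafletMap', function () {
  return {
    restrict: 'E',                    // matched as an element (a custom tag)
    templateUrl: 'views/map.html',    // hypothetical template path
    link: function (scope, element, attrs) {
      // map initialization (L.map, tile layers, overlays) would go here
    }
  };
});

console.log(angular._directives.leafletMap.restrict);
```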
&lt;br /&gt;
'''Preparing the Codebase for Development'''&lt;br /&gt;
&lt;br /&gt;
The process for recompiling the Aquiferium codebase is relatively straightforward, but requires that the entire codebase be downloaded, including all assets used by the system (these are currently bundled into the code base) plus any new assets (in the case of new geodata) required by the extensions being made. You will also need to set up the appropriate development environment to compile the application. The Aquiferium codebase requires the installation of Ruby, NodeJS, Compass and Yeoman, as well as the project dependencies that we will install during the compilation process. The following steps detail how to configure the development environment for compilation of the code base.&lt;br /&gt;
&lt;br /&gt;
'''Step 1: Install Ruby, Compass, NodeJS, and Yeoman'''&lt;br /&gt;
&lt;br /&gt;
Links to these resources are included in Appendix C: Aquiferium Technology, where you can acquire the necessary installation files and instructions. You must install NodeJS before Compass or Yeoman, because the npm package management utility used for their installation is included with NodeJS.&lt;br /&gt;
&lt;br /&gt;
The recommended installation sequence is as follows: &lt;br /&gt;
&lt;br /&gt;
1. Ruby&lt;br /&gt;
&lt;br /&gt;
2. NodeJS + NPM&lt;br /&gt;
&lt;br /&gt;
3. Compass&lt;br /&gt;
&lt;br /&gt;
4. Yeoman&lt;br /&gt;
&lt;br /&gt;
Both Ruby and NodeJS + NPM ship with straightforward installers that handle the process for you. During the installation of Ruby, it is recommended that you select the ‘add Ruby to your System Path’ option. Once both Ruby and NodeJS are installed, open a command prompt and execute the following commands to prepare the system and install Compass:&lt;br /&gt;
&lt;br /&gt;
&amp;gt; gem update --system&lt;br /&gt;
&lt;br /&gt;
&amp;gt; gem install compass&lt;br /&gt;
&lt;br /&gt;
&amp;gt; npm install compass&lt;br /&gt;
&lt;br /&gt;
Note: There is a known SSL issue with updating ruby gems that can prevent the compass gem from installing properly. Should you encounter it, see this guide for the solution: &lt;br /&gt;
&lt;br /&gt;
https://gist.github.com/luislavena/f064211759ee0f806c88&lt;br /&gt;
&lt;br /&gt;
'''Step 2: Install Code Dependencies and Packages'''&lt;br /&gt;
&lt;br /&gt;
Once you have completed the installation of the utilities required for the development workflow, you need to clone a copy of the online repository to your local system. The clone should result in a directory named eaa-aquiferium wherever you chose to clone the repo. You should then navigate to the root directory of the code base (./eaa-aquiferium) and open a command prompt in this location.&lt;br /&gt;
&lt;br /&gt;
On the command line, type the following command to install all Node packages required by the Aquiferium:&lt;br /&gt;
&lt;br /&gt;
&amp;gt; npm install&lt;br /&gt;
&lt;br /&gt;
Note: This process may take several minutes, as Aquiferium has a large number of package dependencies to download and install. It is also a good idea to run an update on all the packages required by the project to ensure they are at the latest or specified versions. To do this, run the following command:&lt;br /&gt;
&lt;br /&gt;
&amp;gt; npm update&lt;br /&gt;
&lt;br /&gt;
Now we need to install all the library dependencies required by Aquiferium. To do this, type the following command:&lt;br /&gt;
&lt;br /&gt;
&amp;gt; bower install&lt;br /&gt;
&lt;br /&gt;
If the bower install command exits without creating and installing the dependencies in the folder located at ./eaa-aquiferium/app/bower_components, check the console output to see if you have received an ENOENT error for any of the packages you were trying to install. If there is an error, it will most likely be the last one listed in the console. If so, open the file located at ./eaa-aquiferium/bower.json and look for the entry matching the package that gave the ENOENT error. Delete it from the file, ensuring that the empty line is also removed and that any trailing commas are removed if this changes the last entry in the &amp;quot;dependencies&amp;quot;: {} object. Make a note of which dependency had to be removed in case it needs to be reinstalled with npm or bower at a later time. Once you have completed installing the bower dependencies, run the following command from inside ./eaa-aquiferium to execute the gruntfile.js taskrunner and ensure the project is fully set up and ready for active local development:&lt;br /&gt;
&lt;br /&gt;
&amp;gt; grunt serve&lt;br /&gt;
&lt;br /&gt;
The gruntfile.js taskrunner completes a myriad of operations on the code base, prepares it for deployment, and launches a local web server with the application running inside of it for local development and testing. At this point, you should be able to use the Aquiferium exactly like the deployed online version. You can now begin updating or extending the application as described in the following sections.&lt;br /&gt;
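When deleting a failing dependency entry as described above, the surviving JSON must stay valid. The sketch below shows a hypothetical dependencies object before and after removal; package names are made up for illustration, and the entries are written as JavaScript objects only so the annotations are legal.&lt;br /&gt;

```javascript
// Hypothetical "dependencies" manifest entries, shown as JavaScript objects so
// the annotations are legal. Package names are made up for illustration.
var before = {
  dependencies: {
    'good-package': '~1.2.0',
    'broken-package': '~0.9.0',   // gave ENOENT during install: delete this line
    'last-package': '~0.4.1'
  }
};

// After deleting the failing entry. Had the deleted entry been the LAST one,
// the comma after the new last entry would also have to go, or the JSON file
// would become invalid.
var after = {
  dependencies: {
    'good-package': '~1.2.0',
    'last-package': '~0.4.1'
  }
};

console.log(Object.keys(after.dependencies));
```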
&lt;br /&gt;
[[Adding New Map Tile Services]]&lt;br /&gt;
&lt;br /&gt;
Aquiferium has the most popular map tiling services already included, with the exception of Google Maps. Due to the complexity of integrating Google Maps into the Aquiferium application, and the fact that its functionality as a tile service is duplicated by Open Street Maps, it was determined that the task of code integration was not worth the effort.&lt;br /&gt;
&lt;br /&gt;
Should an additional map tiling service be necessary, the following code examples demonstrate how to extend the map tiling services available within Aquiferium.&lt;br /&gt;
&lt;br /&gt;
On line 85, insert the following code:&lt;br /&gt;
&lt;br /&gt;
var newMapService_Link = '&amp;lt;a href=&amp;quot;http://www.newmapservice.com/&amp;quot;&amp;gt;New Map Service&amp;lt;/a&amp;gt;';&lt;br /&gt;
&lt;br /&gt;
var newMapService_Url = 'http://newMapTileServer/tile/{z}/{y}/{x}';&lt;br /&gt;
&lt;br /&gt;
var newMapService_Attrib = '&amp;amp;copy; ' + newMapService_Link;&lt;br /&gt;
&lt;br /&gt;
var newMapService_Map = L.tileLayer(newMapService_Url, {&lt;br /&gt;
&lt;br /&gt;
 attribution: newMapService_Attrib&lt;br /&gt;
&lt;br /&gt;
});&lt;br /&gt;
&lt;br /&gt;
On line 520 the baseLayers object for the map is defined. Insert the following code inside this object:&lt;br /&gt;
&lt;br /&gt;
'newMapService_Map': newMapService_Map,&lt;br /&gt;
&lt;br /&gt;
Note: Take care to include a trailing comma after this snippet if it is not the last map service listed in the baseLayers object, and conversely to omit the trailing comma if it is the last service listed.&lt;br /&gt;
&lt;br /&gt;
Note: The order in which the services are listed here is the order they will appear on the map for selection.&lt;br /&gt;
&lt;br /&gt;
Note: You should replace all instances of ‘newMapService’ in the code above with a unique name that describes the map service in some identifiable way (e.g. customMapboxTiles).&lt;br /&gt;
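The two notes above can be seen in a runnable sketch. Real baseLayers entries hold L.tileLayer instances; plain strings stand in here so the snippet runs without Leaflet. The key order is what determines the order in the map's layer selector, and only non-final entries carry a trailing comma.&lt;br /&gt;

```javascript
// Illustrative baseLayers object: real entries hold L.tileLayer instances, but
// strings stand in here so the snippet runs without Leaflet. The key order is
// the order the services appear in the map's layer selector.
var baseLayers = {
  'Open Street Maps': 'osmLayer (stub)',
  'newMapService_Map': 'newMapServiceLayer (stub)'   // last entry: no trailing comma
};

console.log(Object.keys(baseLayers));
```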
&lt;br /&gt;
That concludes the code changes required to extend the map tiling services available for display in Aquiferium. The application must be recompiled to reflect the changes and complete the process. The procedure for recompiling the application is detailed later in this document.&lt;br /&gt;
&lt;br /&gt;
[[Adding New Geodata]]&lt;br /&gt;
&lt;br /&gt;
In order to add additional geodata layers (in this case GeoJSON data), you will need to duplicate and modify code within the directive in several places. Additionally, you will need to already have the GeoJSON data you want to include on the map, preferably optimized for web use and converted to the WGS84 projection. &lt;br /&gt;
&lt;br /&gt;
[[Instructions for the conversion process are as follows.]]&lt;br /&gt;
&lt;br /&gt;
Note: If additional shapefile optimization is required, that process is described at this URL: http://blog.thematicmapping.org/2012/10/mapping-regions-of-new-zealand-with.html&lt;br /&gt;
&lt;br /&gt;
'''Step 1: GDAL Installation'''&lt;br /&gt;
&lt;br /&gt;
In order to convert most geodata files into GeoJSON you will need the GDAL utility. GDAL stands for Geospatial Data Abstraction Library; it is a suite of tools for manipulating, editing, and transforming geodata into and between almost any format used by GIS systems today. More information about GDAL’s capabilities can be found at http://www.gdal.org/index.html, and GDAL itself can be downloaded from http://trac.osgeo.org/gdal/wiki/DownloadingGdalBinaries. If you plan on doing regular manipulation of geodata files and are working in a Windows environment, it is recommended instead that you download and install the OSGeo4W suite of tools, which includes GDAL, from http://trac.osgeo.org/osgeo4w/ . Whichever utility you select, complete the installation instructions that accompany the application on the vendor’s website and, when you are done, resume this process.&lt;br /&gt;
&lt;br /&gt;
'''Step 2: Convert Shapefiles to GeoJSON WGS84'''&lt;br /&gt;
&lt;br /&gt;
While there are a great many GIS data formats available, the most common is the ESRI Shapefile (known by the *.shp extension). In reality, this is not a single file but an archived collection of related files that together describe a specific geographic space and encapsulate all the associated data the shapefile represents. To convert the shapefile, launch the GDAL command prompt, navigate to the directory your shapefile is located in, and execute the following command:&lt;br /&gt;
&lt;br /&gt;
&amp;gt; ogr2ogr -f &amp;quot;GeoJSON&amp;quot; -t_srs &amp;quot;WGS84&amp;quot; OUTPUT_FILENAME.json INPUT_FILENAME.shp&lt;br /&gt;
&lt;br /&gt;
This single command converts the data into GeoJSON format and alters the map projection to WGS84 in one pass. If your shapefile has more detailed resolution than is necessary for the mapping zoom level you are viewing it at, you can alter the GDAL command to limit the coordinate precision included in the GeoJSON file as follows:&lt;br /&gt;
&lt;br /&gt;
&amp;gt; ogr2ogr -f &amp;quot;GeoJSON&amp;quot; -lco COORDINATE_PRECISION=3 -t_srs &amp;quot;WGS84&amp;quot; OUTPUT_FILENAME.json INPUT_FILENAME.shp&lt;br /&gt;
&lt;br /&gt;
Note: This example limits coordinate output to 3 decimal places, but any whole number can be used here.&lt;br /&gt;
&lt;br /&gt;
This can have dramatic effects on both the output file size and the visual appearance of the geodata on the map. Therefore, some testing may be required to find the correct balance between optimization and display quality desired.&lt;br /&gt;
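The effect of COORDINATE_PRECISION can be illustrated in a few lines of JavaScript: every coordinate is reduced to the requested number of decimal places, which is what shrinks the output file. The sample coordinates below are made up for demonstration.&lt;br /&gt;

```javascript
// Rough illustration of what -lco COORDINATE_PRECISION=3 does to GeoJSON
// output: every coordinate is rounded to 3 decimal places. The sample
// coordinates below are made up for demonstration.
function roundCoords(coords, places) {
  var factor = Math.pow(10, places);
  return coords.map(function (pair) {
    return pair.map(function (value) {
      return Math.round(value * factor) / factor;
    });
  });
}

var raw = [[-98.4936282, 29.4241219], [-98.4951405, 29.4260291]];
console.log(roundCoords(raw, 3)); // [[-98.494, 29.424], [-98.495, 29.426]]
```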
&lt;br /&gt;
'''Step 3: Add New GeoJSON to the Application'''&lt;br /&gt;
&lt;br /&gt;
To ensure that the new geoJSON is available to the Aquiferium for display, a copy of the geoJSON file needs to be placed in the following directory: &lt;br /&gt;
&lt;br /&gt;
//eaa-aquiferium/app/data/geojson/&amp;lt;optional_subfolder&amp;gt;/NEW_GEOJSON_FILE.json&lt;br /&gt;
&lt;br /&gt;
'''Step 4: Add New GeoJSON to the Map Directive'''&lt;br /&gt;
&lt;br /&gt;
To make the new GeoJSON file accessible to the users of the Aquiferium, it must now be added into the mapping Directive’s codebase. This will require the insertion of several new lines of code in specific locations as detailed in the following section.&lt;br /&gt;
&lt;br /&gt;
Note: Any hexadecimal colors specified can be changed to the developer’s preference.&lt;br /&gt;
&lt;br /&gt;
On line 150 insert the following code:&lt;br /&gt;
&lt;br /&gt;
var newGeojson_Style = { 'fillColor': '#F3F' };&lt;br /&gt;
&lt;br /&gt;
var newGeojson_StyleHover = { 'fillColor': '#3F3' };&lt;br /&gt;
&lt;br /&gt;
On line 185 insert the following code:&lt;br /&gt;
&lt;br /&gt;
var newGeojson_Geojson = './data/geojson/&amp;lt;optional_subfolder&amp;gt;/NEW_GEOJSON_FILE.json';&lt;br /&gt;
&lt;br /&gt;
On line 211 insert the following code: &lt;br /&gt;
&lt;br /&gt;
var newGeojson_Layer = new L.LayerGroup();&lt;br /&gt;
&lt;br /&gt;
On line 328 insert the following code:&lt;br /&gt;
&lt;br /&gt;
$.getJSON(newGeojson_Geojson, function(data) {&lt;br /&gt;
&lt;br /&gt;
 processGeojson(data, newGeojson_Layer, newGeojson_Style, newGeojson_StyleHover);&lt;br /&gt;
&lt;br /&gt;
});&lt;br /&gt;
&lt;br /&gt;
Note: Additional interactivity can be added to the GeoJSON layer in the above function, but that process is beyond the scope of this document.&lt;br /&gt;
&lt;br /&gt;
On line 541 insert the following code:&lt;br /&gt;
&lt;br /&gt;
'New GeoJSON': newGeojson_Layer,&lt;br /&gt;
&lt;br /&gt;
Note: You should replace all instances of ‘newGeojson’ in the code above with a unique name that describes your geodata in some identifiable way (e.g. supportRegions_Geojson). That concludes the code changes required to extend the geodata available for display in Aquiferium. To complete this process, the application must be recompiled; the procedure is detailed later in this document.&lt;br /&gt;
&lt;br /&gt;
[[Adding New Map Markers]]&lt;br /&gt;
&lt;br /&gt;
If additional markers are desired on the map interface, they can easily be added to the Directive by inserting the following code at line 456:&lt;br /&gt;
&lt;br /&gt;
var newMarkerLocation = L.latLng(LAT_VAL, LON_VAL);&lt;br /&gt;
&lt;br /&gt;
 var newMarkerOptions = { title: 'New Marker Title' };&lt;br /&gt;
&lt;br /&gt;
 var newMarker = L.marker(newMarkerLocation, newMarkerOptions);&lt;br /&gt;
&lt;br /&gt;
 newMarker.addTo(allMarkersLayer); &lt;br /&gt;
&lt;br /&gt;
 var newMarkerPopupContent = '&amp;lt;h2&amp;gt;New Marker Title&amp;lt;/h2&amp;gt;';&lt;br /&gt;
&lt;br /&gt;
 var newMarkerContentContainer = $('&amp;lt;div /&amp;gt;');&lt;br /&gt;
&lt;br /&gt;
 newMarkerContentContainer.html(newMarkerPopupContent);&lt;br /&gt;
&lt;br /&gt;
 newMarker.bindPopup(newMarkerContentContainer[0]);&lt;br /&gt;
&lt;br /&gt;
Note: You should replace all instances of ‘newMarker’ in the code above with a unique name that describes your marker in some identifiable way (e.g. hqOfficesMarker).&lt;br /&gt;
&lt;br /&gt;
That concludes the code changes required to extend the map markers available for display in Aquiferium. To complete this process, the application will now need to be recompiled. The procedure for recompiling the application will be detailed later in this document.&lt;br /&gt;
&lt;br /&gt;
[[Recompiling the Codebase]]&lt;br /&gt;
&lt;br /&gt;
The primary process of compiling the Aquiferium is very similar to the process for running it during development. The complexities are again handled by the gruntfile.js taskrunner, which prepares the codebase for live deployment rather than local development when you execute the following command from the ./eaa-aquiferium/ directory:&lt;br /&gt;
&lt;br /&gt;
&amp;gt; grunt build --force&lt;br /&gt;
&lt;br /&gt;
This will initiate the build process. The output will be logged to the console window for observation, or for debugging should there be any errors during the process. Once the build process is complete, you will find the final version of the project (the set of files that needs to be uploaded to your web server, or copied into your local web server for offline use) in the folder located at ./eaa-aquiferium/dist. The build process has several known bugs that must be corrected manually, as described in the following steps.&lt;br /&gt;
&lt;br /&gt;
'''Step 1: Fix Missing Dependencies''' &lt;br /&gt;
&lt;br /&gt;
Open the index file for the distribution build (located at ./eaa-aquiferium/dist/index.html) and insert the following code into the HTML right after the line reading ‘&amp;lt;!-- Place favicon.ico and apple-touch-icon.png in the root directory. --&amp;gt;’ but before the subsequent opening &amp;lt;link&amp;gt; tag:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;link href='http://fonts.googleapis.com/css?family=Open+Sans:400,300,700' rel='stylesheet' type='text/css'&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;link href='http://fonts.googleapis.com/css?family=Open+Sans+Condensed:300,700' rel='stylesheet' type='text/css'&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;link href='http://fonts.googleapis.com/css?family=Ubuntu:400,500,700' rel='stylesheet' type='text/css'&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;link href='http://fonts.googleapis.com/css?family=Ubuntu+Condensed' rel='stylesheet' type='text/css'&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;!-- Latest compiled and minified CSS --&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;link rel=&amp;quot;stylesheet&amp;quot; href=&amp;quot;https://maxcdn.bootstrapcdn.com/bootstrap/3.3.1/css/bootstrap.min.css&amp;quot;&amp;gt;&lt;br /&gt;
&lt;br /&gt;
'''Step 2: Fix Missing CSS'''&lt;br /&gt;
&lt;br /&gt;
Ensure the following style rule is appended to the very end of the file located at ./eaa-aquiferium/dist/styles/{hash}.main.css:&lt;br /&gt;
&lt;br /&gt;
 .glyph-spacing-right{margin-right:0.5rem;}&lt;br /&gt;
&lt;br /&gt;
'''Step 3: Replace Missing Images'''&lt;br /&gt;
&lt;br /&gt;
In the ./eaa-aquiferium/dist/styles folder, create a new folder named images. Navigate to the folder ./eaa-aquiferium/app/bower_components/leaflet/dist/images and copy all image files from this location into the folder you previously created (./eaa-aquiferium/dist/styles/images).&lt;br /&gt;
&lt;br /&gt;
'''Step 4: Replace Missing Glyphs'''&lt;br /&gt;
&lt;br /&gt;
In the folder ./eaa-aquiferium/dist create the following directory structure: &lt;br /&gt;
&lt;br /&gt;
./eaa-aquiferium/dist/bower_components/bootstrap-sass-official/vendor/assets/fonts/bootstrap/&lt;br /&gt;
&lt;br /&gt;
Now navigate to this directory: ./eaa-aquiferium/app/bower_components/bootstrap-sass-official/vendor/assets/fonts/bootstrap/ and copy all the files inside (all named glyphicons-halflings-regular.*) into the last folder you just created in the above directory structure (bootstrap).&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
'''Step 5: Update Broken Paths'''&lt;br /&gt;
&lt;br /&gt;
The image, video and data asset paths will be incorrect after the initial build: a bug in the way the task runner updates paths results in incorrect paths to these resources. This requires a manual find &amp;amp; replace on the files listed below, using the specified values.&lt;br /&gt;
&lt;br /&gt;
[[Video &amp;amp; Image Assets]]&lt;br /&gt;
&lt;br /&gt;
Find and replace all instances of these string values:&lt;br /&gt;
&lt;br /&gt;
‘../videos/’ with ‘./videos/’&lt;br /&gt;
&lt;br /&gt;
‘/images/’ with ‘../images/’&lt;br /&gt;
&lt;br /&gt;
In these HTML files:&lt;br /&gt;
&lt;br /&gt;
./eaa-aquiferium/dist/geography.html&lt;br /&gt;
&lt;br /&gt;
./eaa-aquiferium/dist/geology.html&lt;br /&gt;
&lt;br /&gt;
./eaa-aquiferium/dist/springs.html&lt;br /&gt;
&lt;br /&gt;
./eaa-aquiferium/dist/conservation.html&lt;br /&gt;
&lt;br /&gt;
And in this CSS file: ./eaa-aquiferium/dist/styles/{hash}.main.css. Additionally, you will need to copy the image and video folders (and all their contents) into the ./eaa-aquiferium/dist folder to address a bug in the renaming of files during the build process.&lt;br /&gt;
&lt;br /&gt;
[[Data Assets]]&lt;br /&gt;
&lt;br /&gt;
Find and replace all instances of the string value ‘../../data/’ with ‘./data/’ in the file ./eaa-aquiferium/dist/scripts/{hash}.scripts.js&lt;br /&gt;
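The find and replace in Step 5 can also be done with a small Node string helper rather than by hand. The search and replacement strings below are exactly the ones listed above; the function names are ours. Read each listed HTML file and the main CSS file, run the contents through fixAssetPaths, and run the scripts bundle through fixDataPaths, then write the results back.&lt;br /&gt;

```javascript
// The same find and replace as small Node string helpers. The search and
// replacement strings are exactly those listed above; the function names are
// ours. Apply fixAssetPaths to the HTML/CSS files and fixDataPaths to the
// {hash}.scripts.js bundle.
function fixAssetPaths(source) {
  return source
    .split('../videos/').join('./videos/')    // video paths in the HTML/CSS files
    .split('/images/').join('../images/');    // image paths in the HTML/CSS files
}

function fixDataPaths(source) {
  return source.split('../../data/').join('./data/');  // scripts bundle only
}

console.log(fixAssetPaths('url(/images/logo.png)')); // url(../images/logo.png)
```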
&lt;br /&gt;
'''Step 6: Validate Build Locally'''&lt;br /&gt;
&lt;br /&gt;
It is recommended that you copy the fully updated distribution code to a local web server (Uniform Server or another solution) and test the application outside the development environment. Any missing assets or other errors will be output to the browser console for further debugging. Once you are certain that your new build is ready for deployment, upload the distribution files to your web server and test the application again online. You should now have your recompiled application running and be able to see the changes or extensions you have made.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== Additional Development ==&lt;br /&gt;
&lt;br /&gt;
This document has covered all the basic steps involved in setting up the code base, making changes to the code base, and recompiling the code for live deployment. Additional capabilities are beyond the scope of this document and should be considered as another phase in the project’s development lifecycle.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&amp;lt;!-- Add any wiki Text above this Line --&amp;gt;&lt;br /&gt;
&amp;lt;!-- Do NOT Edit below this Line --&amp;gt;&lt;br /&gt;
{{#set:&lt;br /&gt;
	Expertise=Open_science|&lt;br /&gt;
	Expertise=Geosciences|&lt;br /&gt;
	Owner=Suzanne_Pierce|&lt;br /&gt;
	Progress=0|&lt;br /&gt;
	StartDate=2015-03-21|&lt;br /&gt;
	TargetDate=2015-04-03|&lt;br /&gt;
	Type=Low}}&lt;/div&gt;</summary>
		<author><name>Suzanne</name></author>	</entry>

	<entry>
		<id>https://www.organicdatascience.org/gpf/index.php?title=Develop_proposal_for_special_issue&amp;diff=11711</id>
		<title>Develop proposal for special issue</title>
		<link rel="alternate" type="text/html" href="https://www.organicdatascience.org/gpf/index.php?title=Develop_proposal_for_special_issue&amp;diff=11711"/>
				<updated>2015-04-03T18:23:37Z</updated>
		
		<summary type="html">&lt;p&gt;Suzanne: /* [Pierce 2015] */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;[[Category:Task]]&lt;br /&gt;
&lt;br /&gt;
== Background: Why a Special Issue on Geoscience Papers of the Future? ==&lt;br /&gt;
&lt;br /&gt;
[[Discuss_what_we_will_consider_a_GPF#The_Vision | Include here our discussion for the vision]]&lt;br /&gt;
&lt;br /&gt;
Background should be 1-2 pages.&lt;br /&gt;
&lt;br /&gt;
Motivated by need to fully document and make research accessible and reproducible. &lt;br /&gt;
&lt;br /&gt;
=== Motivation: The EarthCube Initiative and the GeoSoft Project ===&lt;br /&gt;
&lt;br /&gt;
[http://www.geosoft-earthcube.org/about Include here background about GeoSoft from the web site]&lt;br /&gt;
&lt;br /&gt;
OSTP memo.  EarthCube reports.&lt;br /&gt;
Other reports that talk about the need for new approaches to editing.&lt;br /&gt;
&lt;br /&gt;
It's possible that small or very large contributions are not well captured in the current publishing paradigms.  Nanopublications.&lt;br /&gt;
&lt;br /&gt;
For example, nano-publications are a possible way to reflect advances in a research process that may not merit a full publication but are nonetheless useful to share with the community. A challenge here is that there is a stigma attached to publishing units that are too small or very small.  &lt;br /&gt;
&lt;br /&gt;
Alternatively, a very large piece of research or work with many parts may be better suited to a GPF style publication.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Perhaps, the concept of a 'paper' can be better reflected in the concept of a 'wrapper' or a collection of materials and resources. The purpose is to assure that publications are representative of the work, effort, and results achieved in the research process.&lt;br /&gt;
&lt;br /&gt;
=== What is a GPF ===&lt;br /&gt;
&lt;br /&gt;
[[Discuss_what_we_will_consider_a_GPF#What_is_a_Geoscience_Paper_of_the_Future.3F | Include here our discussion of what is a GPF]]&lt;br /&gt;
&lt;br /&gt;
=== The challenges of creating GPFs ===&lt;br /&gt;
&lt;br /&gt;
The articles in this issue reflect the current best practice for generating a Geoscience Paper of the Future.&lt;br /&gt;
&lt;br /&gt;
'''Figure discussions''': Do we want to reproduce exactly the same figure automatically?  Figures in the paper may be cleaned-up versions of images generated by software.  To the extent possible, authors have included clear delineations of provenance. The goal is to assure that readers can regenerate the figures using documented workflows, data, and codes.  An important note (Allen, Sandra) is that figures are frequently generated by code, scripts, etc., yet the actual figure is finalized by the user.....  Mimi is trying to say: is it really worth belaboring the point about how the prettified version of the figure is made? If it is: both of the visualization programs I've used (Matlab and SigmaPlot) have actual code in the background that specifies how to set up the prettification, and this code can be found, copied out, and rerun to generate the exact same figure with all of the prettification in the same place. SigmaPlot uses Visual Basic (I think) in its macros. If it is an important point about explicit code, this should be doable. But I'm not sure it's strictly necessary to specify exactly where all the prettifications are to get the gist across.&lt;br /&gt;
&lt;br /&gt;
How much of your experimental history does one include?  (Ibrahim).  The experimental process often ends up nowhere.  Should we document all the failed experiments?  Get one DOI for the results of the successful experiment?  Another for failed trials?&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
'''''Documenting: Timing and Intermediate Processes'''''&lt;br /&gt;
When should we document and what are the bounds on what we document?&lt;br /&gt;
For example, should we document and include data and workflows for 'failed' experiments? Or should we assign datasets DOIs before we know the results from using them?  &lt;br /&gt;
The group thinks that good ideas/practices may include documenting and sharing data when you have a clear understanding of the outcomes worth reporting. For example, successful experiments should have clear, clean data documented and shared, whereas one strategy for 'failed' experiments could be to bundle the intermediate datasets under one DOI with a more general discussion of the process/methods.&lt;br /&gt;
&lt;br /&gt;
=== Related work ===&lt;br /&gt;
&lt;br /&gt;
[[Discuss_what_we_will_consider_a_GPF#New_Frameworks_to_Create_a_New_Generation_of_Scientific_Articles | Include here the related work we have discussed]]&lt;br /&gt;
&lt;br /&gt;
== Papers to be included ==&lt;br /&gt;
&lt;br /&gt;
Would it be worthwhile to group the papers into broader categories rather than giving specifics about every single paper?&lt;br /&gt;
&lt;br /&gt;
For each submission, we describe:&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations'''&lt;br /&gt;
* '''Keywords of research area'''&lt;br /&gt;
* '''Tentative title'''&lt;br /&gt;
* '''Short abstract'''&lt;br /&gt;
* '''Challenge''' (including &amp;quot;Reproducibility,&amp;quot; &amp;quot;Dark Code,&amp;quot; &amp;quot;Sharing Big Data,&amp;quot; and &amp;quot;Transferability&amp;quot;)&lt;br /&gt;
* '''Relationship to other publications''' (is the article based on a previously published article? is it new content? If previously published, please provide a pointer to the published article and specify what percentage of the work presented will be new)&lt;br /&gt;
* '''Pointer to the wiki page that documents the article'''&lt;br /&gt;
* '''Expected submission date'''&lt;br /&gt;
&lt;br /&gt;
=== [David 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:''' [[Cedric David]]&lt;br /&gt;
* '''Keywords of research area:''' Hydrology, Rivers, Modeling, Testing, Reproducibility. &lt;br /&gt;
* '''Tentative title:''' Going beyond triple-checking, allowing for peace of mind in community model development.&lt;br /&gt;
* '''Short abstract:''' The development of computer models in the general field of geoscience is often done incrementally over many years.  Endeavors that generally start on a single researcher's own machine evolve over time into software that is often much larger than initially anticipated.  Looking back at years of building on their computer code, sometimes without much training in computer science, geoscience software developers can easily experience an overwhelming sense of incompetence when contemplating ways to further community usage of their software.  How does one allow others to use one's code?  How can one foster the survival of one's tool?  How could one possibly ensure the scientific integrity of ongoing developments, including those made by others?  Common issues faced by geoscience developers include selecting a license, learning how to track and document past and ongoing changes, choosing a software repository, and allowing for community development.  This paper provides a brief summary of experience with the first three of these steps of software growth, focusing on the almost decade-long code development of a river routing model.  The core of this study, however, focuses on reproducing previously published experiments.  This step is highly repetitive and can therefore benefit greatly from automation.  Additionally, enabling automated software testing can arguably be considered the final step for sustainable software sharing, by allowing the main software developer to let go of a mental block concerning scientific integrity.  Creating tools to automatically compare the results of an updated version of a software with those of previous studies can not only save the main developer's own time, it can also empower other researchers to check and demonstrate that their potential additions have retained scientific integrity.&lt;br /&gt;
* '''Challenge:''' Reproducibility; Ensure that updates to an existing model are able to reproduce a series of simulations published previously.&lt;br /&gt;
* '''Relationship to other publications:''' This research is related to past and ongoing development of the Routing Application for Parallel computatIon of Discharge (RAPID).  The primary focus of this paper is to allow automated reproducibility of at least the [http://dx.doi.org/10.1175/2011JHM1345.1 first RAPID publication].  The scientific subject of this GPF differs from the article(s) to be reproduced as its focus is on development of automatic testing methods.  In that regard, the paper is expected to be 95% new. &lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Cedric_David | Page]]&lt;br /&gt;
* '''Expected submission date:'''&lt;br /&gt;
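The automated comparison described in this abstract can be sketched as follows (a stdlib-only illustration with hypothetical values and tolerances, not the actual RAPID test suite): an updated model run is accepted only if it reproduces archived, previously published output within floating-point tolerance.

```python
import math

def outputs_match(new, published, rel_tol=1e-5, abs_tol=1e-8):
    """True if every value in the updated run reproduces the published
    run within floating-point tolerance."""
    return len(new) == len(published) and all(
        math.isclose(a, b, rel_tol=rel_tol, abs_tol=abs_tol)
        for a, b in zip(new, published)
    )

# Placeholder values standing in for simulated river discharge:
published = [10.0, 12.5, 11.2, 9.8]   # archived results from the paper
new_run = [10.0, 12.5, 11.2, 9.8]     # rerun with the updated model
assert outputs_match(new_run, published)
```

Running such a check after every change lets both the main developer and outside contributors demonstrate that an update still reproduces the published simulations.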
&lt;br /&gt;
=== [Demir 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:''' [[Ibrahim Demir]]&lt;br /&gt;
* '''Keywords of research area:''' hydrological network, optimization, network representation, database query&lt;br /&gt;
* '''Tentative title:''' Analysis and Optimization of Hydrological Network Database Representation Methods for Fast Access and Query in Web-based System&lt;br /&gt;
* '''Short abstract:''' Web-based systems allow users to delineate watersheds in interactive map environments using server-side processing. With the increasing resolution of hydrological networks, optimized methods for storing the network representation in databases, and efficient queries and modifications of the river network structure, become critical. This paper presents a detailed analysis of widely used methods for representing hydrological networks in relational databases, and benchmarks common queries and modifications on the network structure using these methods. The analysis has been applied to the hydrological network of Iowa, utilizing a 90 m DEM and 600,000 network nodes. The results indicate that the representation methods provide major improvements in query times and in the storage of the network structure in the database. The suggested method allows watershed delineation tools to run on the client side with desktop-like performance.&lt;br /&gt;
* '''Challenge:''' Reproducibility, Transferability; Some of the internal steps to prepare data might require long computation time and different software environments.&lt;br /&gt;
* '''Relationship to other publications:''' The article is based on a new study&lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Ibrahim_Demir | Page]]&lt;br /&gt;
* '''Expected submission date:'''&lt;br /&gt;
&lt;br /&gt;
=== [Fulweiler 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:''' [[Wally Fulweiler]]&lt;br /&gt;
* '''Keywords of research area:'''&lt;br /&gt;
* '''Tentative title:'''&lt;br /&gt;
* '''Short abstract:'''&lt;br /&gt;
* '''Challenge:''' &lt;br /&gt;
* '''Relationship to other publications:''' (is the article based on a previously published article? is it new content?)&lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Wally_Fulweiler | Page]]&lt;br /&gt;
* '''Expected submission date:'''&lt;br /&gt;
&lt;br /&gt;
=== [Loh and Karlstrom 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:'''  [[Lay Kuan Loh]] and [[Leif Karlstrom]]&lt;br /&gt;
* '''Keywords of research area:''' Spatial clustering, Eigenvector selection, Entropy Ranking, Cascades Volcanic Region, [http://geosphere.gsapubs.org/content/3/3/152.abstract Afar Depression], [http://astrogeology.usgs.gov/search/details/Mars/Research/Volcanic/TharsisVents/zip Tharsis province]&lt;br /&gt;
* '''Tentative title:''' Characterization of volcanic vent distributions using spectral clustering with eigenvector selection and entropy ranking&lt;br /&gt;
* '''Short abstract:''' Volcanic vents on the surface of Earth and other planets often appear in groups that exhibit spatial patterning. Such vent distributions reflect a complex interplay between time-evolving mechanical controls on the pathways of magma ascent, background tectonic stresses, and the unsteady supply of rising magma. With the ultimate aim of connecting surface vent distributions with the dynamics of magma ascent, we have developed a clustering method to quantify spatial patterns in vents. Clustering is typically used in exploratory data analysis to identify groups with similar behavior by partitioning a dataset into clusters that share similar attributes. Traditional clustering algorithms that work well on simple point-cloud synthetic datasets generally do not scale well to the real-world data we are interested in, where there are poor boundaries between clusters and much ambiguity in cluster assignments. We instead use a spectral clustering algorithm with eigenvector selection based on entropy ranking, building on work by [http://www.sciencedirect.com/science/article/pii/S0925231210001311 Zhao et al. 2010], that outperforms traditional spectral clustering algorithms in choosing the right number of clusters for point data. We benchmark this algorithm on synthetic vent data with increasingly complex spatial distributions to test its ability to accurately cluster vent data with variable spatial density, skewness, number of clusters, and proximity of clusters. We then apply our algorithm to several real-world datasets from the Cascades, the Afar Depression, and Mars.&lt;br /&gt;
* '''Challenge:''' Reproducibility (i.e., quantifying clustering); We plan to study how varying the statistical distribution, density, skewness, background noise, number of clusters, proximity of clusters, and combinations of any of these factors affects the performance of our algorithm. We test it against synthetic and real-world datasets.&lt;br /&gt;
* '''Relationship to other publications:''' New content, but one of the databases we are studying in the paper (Cascades Volcanic Range) would be based off a different paper we are preparing and planning to submit earlier. &lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Leif_Karlstrom | Page]]&lt;br /&gt;
* '''Expected submission date:''' June 2015&lt;br /&gt;
&lt;br /&gt;
=== [Lee 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:''' [[Kyo Lee]], Maziyar Boustani and Chris Mattmann, Jet Propulsion Laboratory&lt;br /&gt;
* '''Keywords of research area:''' North American regional climate, regional climate model evaluation system, Open Climate Workbench&lt;br /&gt;
* '''Tentative title:''' Evaluation of simulated temperature, precipitation, cloud fraction and insolation over the conterminous United States using Regional Climate Model Evaluation System&lt;br /&gt;
* '''Short abstract:''' This study describes the detailed process of evaluating model fidelity in simulating four key climate variables (surface air temperature, precipitation, cloud fraction, and insolation) and their covariability over the conterminous United States. The Regional Climate Model Evaluation System (RCMES), a suite of public databases and open-source software, provides both observational datasets and data processors useful for evaluating climate models. In this paper, we provide a clear and easy-to-follow RCMES workflow that replicates published papers evaluating North American Regional Climate Change Assessment Program (NARCCAP) regional climate model (RCM) hindcast simulations against observations from a variety of sources.&lt;br /&gt;
* '''Challenge:''' Big Data Sharing, Dark Code; Sharing big data, better documenting source code, and encouraging the climate science community to use RCMES&lt;br /&gt;
* '''Relationship to other publications:''' [http://journals.ametsoc.org/doi/abs/10.1175/JCLI-D-12-00452.1 Kim et al. 2013], [http://link.springer.com/article/10.1007/s00382-014-2253-y Lee et al. 2014]&lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Kyo_Lee | Page]]&lt;br /&gt;
* '''Expected submission date:''' End of June 2015&lt;br /&gt;
&lt;br /&gt;
=== [Miller 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:''' [[Kim Miller]]&lt;br /&gt;
* '''Keywords of research area:'''&lt;br /&gt;
* '''Tentative title:'''&lt;br /&gt;
* '''Short abstract:'''&lt;br /&gt;
* '''Challenge:''' &lt;br /&gt;
* '''Relationship to other publications:''' (is the article based on a previously published article? is it new content?)&lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Kim_Miller | Page]]&lt;br /&gt;
* '''Expected submission date:'''&lt;br /&gt;
&lt;br /&gt;
=== [Mills 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:''' [[Heath Mills]], University of Houston Clear Lake; Brandi Kiel Reese, Texas A&amp;amp;M Corpus Christi&lt;br /&gt;
* '''Keywords of research area:'''&lt;br /&gt;
* '''Tentative title:''' Iron and Sulfur Cycling Biogeography Using Advanced Geochemical and Molecular Analyses&lt;br /&gt;
* '''Short abstract:''' My paper will develop and document a new pipeline for analyzing a combined, robust genetic and geochemical data set. New, reproducible methods will be highlighted in this manuscript to help others better analyze similar data sets; there is a general lack of guidance within my field for such challenges. This manuscript will be unique and helpful from an analysis standpoint as well as for the science being presented.&lt;br /&gt;
* '''Challenge:''' Reproducibility; Dark Code&lt;br /&gt;
* '''Relationship to other publications:''' Original Manuscript&lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Heith_Mills | Page]]&lt;br /&gt;
* '''Expected submission date:'''&lt;br /&gt;
&lt;br /&gt;
=== [Oh 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:''' [[Ji-Hyun Oh]], Jet Propulsion Laboratory/University of Southern California&lt;br /&gt;
* '''Keywords of research area:''' Tropical Meteorology, Madden-Julian Oscillation, Momentum budget analysis &lt;br /&gt;
* '''Tentative title:''' Tools for computing momentum budget for the westerly wind event associated with the Madden-Julian Oscillation&lt;br /&gt;
* '''Short abstract:''' As one of the most pronounced modes of tropical intraseasonal variability, the Madden-Julian Oscillation (MJO) prominently connects global weather and climate, and serves as one of the critical sources of predictability for extended-range forecasting. The zonal circulation of the MJO is characterized by low-level westerlies (easterlies) in and to the west (east) of the convective center, respectively. The direction of the zonal winds in the upper troposphere is opposite to that in the lower troposphere. In addition to the convective signal as an identifier of MJO initiation, certain characteristics of the zonal circulation have been used as a standard metric for monitoring the state of the MJO and for investigating features of the MJO and its impact on other atmospheric phenomena. This paper documents a tool for investigating the generation of low-level westerly winds during the MJO life cycle. The tool is used for momentum budget analysis to understand the respective contributions of the various processes involved in the wind evolution associated with the MJO, using European Centre for Medium-Range Weather Forecasts operational analyses from the Dynamics of the Madden–Julian Oscillation field campaign.&lt;br /&gt;
&lt;br /&gt;
* '''Challenge:''' Reproducibility, Dark Code; This paper will cover how to reproduce two key figures from the paper that I recently submitted to the Journal of the Atmospheric Sciences. This will include detailed procedures for generating the figures, such as how and where to download the data, how to transform the data into the input format for my codes, and so on.&lt;br /&gt;
* '''Relationship to other publications:''' This article is related to part of the paper submitted to the Journal of the Atmospheric Sciences.&lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Ji_Hyun | Page]]&lt;br /&gt;
* '''Expected submission date:'''&lt;br /&gt;
&lt;br /&gt;
=== [Pierce 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:'''  [[Suzanne Pierce]], John Gentle, and Daniel Noll (Texas Advanced Computing Center and Jackson School of Geosciences, The University of Texas at Austin; US Department of Energy)&lt;br /&gt;
&lt;br /&gt;
* '''Keywords of research area:''' Decision Support Systems, Hydrogeology, Participatory Modeling, Data Fusion &lt;br /&gt;
* '''Tentative title:''' MCSDSS: An accessible platform and application to enable data fusion and interactive visualization for the Geosciences&lt;br /&gt;
* '''Short abstract:''' The MCSDSS application is an advanced example of interactive design that enables data fusion for science visualization, decision support applications, and education. What sets the tool apart are its firm underpinning in data, its innovative forms of interface design, and its reusable platform. A key advance is the creation of a framework that can be fed new data, videos, maps, images, or other formats of information with relative ease.&lt;br /&gt;
&lt;br /&gt;
* '''Challenge:''' Reproducibility, Dark Code; Fully document a new software application and framework using example case study data and tutorials; Creation of an interface that enables non-programmers to build out interactive visualizations for their data&lt;br /&gt;
* '''Relationship to other publications:''' This article is new content, the proof of concept idea was developed with DOE funding for a student competition and resulted in an initial implementation that was reported in the DOE competition report and a masters thesis for co-author Daniel Noll&lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Suzanne_Pierce | Page]]&lt;br /&gt;
* '''Expected submission date:''' mid- to late June 2015&lt;br /&gt;
&lt;br /&gt;
=== [Pope 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:''' [[Allen Pope]], National Snow and Ice Data Center, University of Colorado, Boulder&lt;br /&gt;
* '''Keywords of research area:''' Glaciology, Remote Sensing, Landsat 8, Polar Science&lt;br /&gt;
* '''Tentative title:''' Data and Code for Estimating and Evaluating Supraglacial Lake Depth With Landsat 8 and other Multispectral Sensors&lt;br /&gt;
* '''Short abstract:''' Supraglacial lakes play a significant role in glacial hydrological systems – for example, transporting water to the glacier bed in Greenland or leading to ice shelf fracture and disintegration in Antarctica. To investigate these important processes, multispectral remote sensing provides multiple methods for estimating supraglacial lake depth – either through single-band or band-ratio methods, both empirical and physically-based. Landsat 8 is the newest satellite in the Landsat series. With new bands, higher dynamic range, and higher radiometric resolution, the Operational Land Imager (OLI) aboard Landsat 8 has a lot of potential. &lt;br /&gt;
&lt;br /&gt;
This paper will document the data and code used in processing in situ reflectance spectra and depth measurements to investigate the ability of Landsat 8 to estimate lake depths using multiple methods, as well as to quantify improvements over Landsat 7’s ETM+. A workflow, data, and code are provided to detail promising methods as applied to Landsat 8 OLI imagery of case study areas in Greenland, allowing calculation of regional volume estimates using 2013 and 2014 summer-season imagery. Altimetry from WorldView DEMs is used to validate the lake depth estimates. The optimal method for supraglacial lake depth estimation with Landsat 8 is shown to be an average of the single-band depths from the red and panchromatic bands. Using this best method, a preliminary investigation of the seasonal behavior and elevation distribution of lakes is also discussed and documented.&lt;br /&gt;
* '''Challenge:''' Reproducibility, Dark Code&lt;br /&gt;
* '''Relationship to other publications:''' Documenting and explaining the data and code behind the analysis and results presented in another paper.&lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Allen_Pope | Page]]&lt;br /&gt;
* '''Expected submission date:''' Late June 2015&lt;br /&gt;
&lt;br /&gt;
=== [Read and Winslow 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:''' [[Jordan Read]] and [[Luke Winslow]]&lt;br /&gt;
* '''Keywords of research area:'''&lt;br /&gt;
* '''Tentative title:'''&lt;br /&gt;
* '''Short abstract:'''&lt;br /&gt;
* '''Challenge:''' &lt;br /&gt;
* '''Relationship to other publications:''' (is the article based on a previously published article? is it new content?)&lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Jordan_Read | Page]]&lt;br /&gt;
* '''Expected submission date:'''&lt;br /&gt;
&lt;br /&gt;
=== [Tzeng 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:''' [[Mimi Tzeng]], Brian Dzwonkowski (DISL); Kyeong Park (TAMU Galveston)&lt;br /&gt;
* '''Keywords of research area:''' physical oceanography, remote sensing&lt;br /&gt;
* '''Tentative title:''' Fisheries Oceanography of Coastal Alabama (FOCAL): A Subset of a Time-Series of Hydrographic and Current Data from a Permanent Moored Station Outside Mobile Bay (27 Jan to 18 May 2011)&lt;br /&gt;
* '''Short abstract:''' The Fisheries Oceanography in Coastal Alabama (FOCAL) program began in 2006 as a way for scientists at the Dauphin Island Sea Lab (DISL) to study the natural variability of Alabama's nearshore environment as it relates to fisheries production. FOCAL provided a long-term baseline data set that included time-series hydrographic data from a permanent offshore mooring (an ADCP, a vertical thermistor array, and CTDs at surface and bottom) and shipboard surveys (vertical CTD profiles and water sampling), as well as monthly depth-discrete ichthyoplankton and zooplankton sample collections at FOCAL sites. The subset of data presented here is from the mooring and includes the vertical thermistor array, CTDs at surface and bottom, an ADCP at the bottom, and vertical CTD profiles collected at the mooring during maintenance surveys. The mooring is located at 30 05.410'N 88 12.694'W, 25 km southwest of the entrance to Mobile Bay. Temperature, salinity, density, depth, and current velocity data were collected at 20-minute intervals from 2006 to 2012. Other parameters, such as dissolved oxygen, are available for portions of the time series depending on which instruments were deployed at the time.&lt;br /&gt;
* '''Challenge:''' Dark Code, Reproducibility; My paper will be about the processing of data in a larger dataset from which peer-reviewed papers have been written. The processing I did was not specific to any particular paper. I can point to an example paper that used some of the data I processed; however, all of the figures in that paper are composites that also include other data from elsewhere that I had nothing to do with (and it would not be feasible to obtain that other data within our timeframe).&lt;br /&gt;
* '''Relationship to other publications:''' A recent paper that used the part of the FOCAL data I'm documenting as the sample from the larger dataset: Dzwonkowski, Brian, Kyeong Park, Jungwoo Lee, Bret M. Webb, and Arnoldo Valle-Levinson. 2014. &amp;quot;Spatial variability of flow over a river-influenced inner shelf in coastal Alabama during spring.&amp;quot; Continental Shelf Research 74:25-34.&lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Mimi_Tzeng | Page]]&lt;br /&gt;
* '''Expected submission date:'''&lt;br /&gt;
&lt;br /&gt;
=== [Villamizar 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:''' [[Sandra Villamizar]], University of California, Merced&lt;br /&gt;
* '''Keywords of research area:''' river ecohydrology&lt;br /&gt;
* '''Tentative title:''' Producing long-term series of whole-stream metabolism using readily available data.  &lt;br /&gt;
* '''Short abstract:''' Continuous water quality and river discharge data that are readily available through government websites may be used to produce valuable information about key processes within a river ecosystem. In this paper I describe in detail the steps for acquiring and processing river flow, dissolved oxygen, temperature, and specific conductance data that, combined with atmospheric data and the physical properties of the river reach of interest, allow for the production of a long-term series of whole-stream metabolism. This information is key to understanding the structure and function of an ecosystem such as the San Joaquin River in the Central Valley of California, which has been increasingly degraded over the last 60 years by intensive human intervention but has, since 2010, been undergoing a restoration effort. The key advantage of this tool is that it uses readily available information to produce knowledge about a river ecosystem. The set of scripts, written in R, can be used immediately for any other river for which the key parameters (river flow, dissolved oxygen, temperature, and specific conductivity) are available. The scripts can also be modified by users to fit their particular site conditions.&lt;br /&gt;
 &lt;br /&gt;
* '''Challenge:''' Reproducibility; Dark Code; Document new software/applications. This set of scripts grew out of the need to generate daily estimates of metabolic rates over long periods of time and at various sites within the San Joaquin River.&lt;br /&gt;
* '''Relationship to other publications:''' This will be a new publication&lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Sandra_Villamizar | Page]]&lt;br /&gt;
* '''Expected submission date:''' To be defined&lt;br /&gt;
&lt;br /&gt;
=== [Yu and Bhatt 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:''' [[Xuan Yu]], Department of Geological Sciences, University of Delaware. Gopal Bhatt, Department of Civil &amp;amp; Environmental Engineering, Pennsylvania State University. &lt;br /&gt;
* '''Keywords of research area:''' coupled processes, integrated hydrologic modeling, PIHM, surface flow, subsurface flow, open science&lt;br /&gt;
* '''Tentative title:''' Learning integrated modeling of surface and subsurface flow from scratch&lt;br /&gt;
* '''Short abstract:''' Integrated modeling of surface and subsurface flow is of great interest for understanding not only the intimate interconnectedness of hydrological processes, but also the land-surface energy balance, biogeochemical and ecological processes, and landscape evolution. Although a growing number of complex hydrologic models have been used for resolving environmental processes, testing hypotheses, and making hydrologic predictions for the effective management of watersheds, very few model implementation resources have been made accessible to the broad community of model users. Users have to invest a significant amount of time and effort to reproduce, and to understand, the workflow of a hydrologic simulation in a modeling paper. To provide a challenging and stimulating introduction to integrated modeling of surface and subsurface flow, in this paper we revisit the development of the Penn State Integrated Hydrologic Model (PIHM) by reproducing a numerical benchmarking example and a real-world catchment-scale application. Specifically, we document PIHM and its modeling workflow to enable a basic understanding of simulating coupled surface and subsurface flow processes. We provide the model and data to highlight the reciprocal roles between the two. In addition, we incorporate user experience as a third dimension in the modeling workflow to enable deeper communication between model developers and users. The workflow has important implications for smoothing and accelerating open scientific collaboration in geosciences research.&lt;br /&gt;
* '''Challenge:''' Reproducibility; Reproduce previously published simulations from an existing model using its latest version. Benchmark the model application against a numerical experiment and field data.&lt;br /&gt;
* '''Relationship to other publications:''' The article is based on a previously published article. &lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Xuan_Yu | Page]]&lt;br /&gt;
* '''Expected submission date:''' End of June 2015&lt;br /&gt;
&lt;br /&gt;
== Special Issue Editors ==&lt;br /&gt;
&lt;br /&gt;
* Co-editor: Chris Duffy and/or Scott Peckham&lt;br /&gt;
* Co-editor: Cedric David&lt;br /&gt;
* Co-editor: possibly Karan Venayagamoorthy&lt;br /&gt;
&lt;br /&gt;
The editors will only accept submissions that follow the [[Develop_proposal_for_special_issue#Special_Issue_Review_Criteria | special issue review criteria]].&lt;br /&gt;
&lt;br /&gt;
The editors will select a set of reviewers to handle the submissions.  Reviewers will include computer scientists, library scientists, and geoscientists.&lt;br /&gt;
&lt;br /&gt;
== Special Issue Review Criteria ==&lt;br /&gt;
&lt;br /&gt;
The reviewers will be asked to provide feedback on the papers according to the following criteria. Note that some papers will have good reasons for limiting the information (e.g. the data is from third parties and not openly available, etc), and in that case they would document those reasons.&lt;br /&gt;
&lt;br /&gt;
* Documentation of the datasets: descriptions of datasets, unique identifiers, repositories.&lt;br /&gt;
* Documentation of software: description of all software used (including pre-processing of data, visualization steps, etc), unique identifiers, repositories.&lt;br /&gt;
* Documentation of the provenance of results: provenance for each figure or result, such as the workflow or the provenance record.&lt;br /&gt;
&lt;br /&gt;
== Tentative Timeline ==&lt;br /&gt;
&lt;br /&gt;
* Journal committed to special issue: April 15, 2015&lt;br /&gt;
* Submissions due to editors: June 30, 2015&lt;br /&gt;
* Reviews due: Sept 15, 2015&lt;br /&gt;
* Decisions out to authors: Sept 30, 2015&lt;br /&gt;
* Revisions due: October 31, 2015&lt;br /&gt;
* Final versions due November 15, 2015&lt;br /&gt;
* Issue published December 31, 2015&lt;br /&gt;
&lt;br /&gt;
&amp;lt;!-- Add any wiki Text above this Line --&amp;gt;&lt;br /&gt;
&amp;lt;!-- Do NOT Edit below this Line --&amp;gt;&lt;br /&gt;
{{#set:&lt;br /&gt;
	Owner=Chris_Duffy|&lt;br /&gt;
	Participants=Yolanda_Gil|&lt;br /&gt;
	Participants=Scott_Peckham|&lt;br /&gt;
	Participants=Cedric_David|&lt;br /&gt;
	Participants=Ibrahim_Demir|&lt;br /&gt;
	Participants=Wally_Fulweiler|&lt;br /&gt;
	Participants=Leif_Karlstrom|&lt;br /&gt;
	Participants=Kyo_Lee|&lt;br /&gt;
	Participants=Kim_Miller|&lt;br /&gt;
	Participants=Heath_Mills|&lt;br /&gt;
	Participants=Ji-Hyun_Oh|&lt;br /&gt;
	Participants=Suzanne_Pierce|&lt;br /&gt;
	Participants=Allen_Pope|&lt;br /&gt;
	Participants=Jordan_Read|&lt;br /&gt;
	Participants=Mimi_Tzeng|&lt;br /&gt;
	Participants=Sandra_Villamizar|&lt;br /&gt;
	Participants=Xuan_Yu|&lt;br /&gt;
	Progress=20|&lt;br /&gt;
	StartDate=2015-03-10|&lt;br /&gt;
	TargetDate=2015-03-16|&lt;br /&gt;
	Type=Low}}&lt;/div&gt;</summary>
		<author><name>Suzanne</name></author>	</entry>

	<entry>
		<id>https://www.organicdatascience.org/gpf/index.php?title=Develop_proposal_for_special_issue&amp;diff=11710</id>
		<title>Develop proposal for special issue</title>
		<link rel="alternate" type="text/html" href="https://www.organicdatascience.org/gpf/index.php?title=Develop_proposal_for_special_issue&amp;diff=11710"/>
				<updated>2015-04-03T18:03:01Z</updated>
		
		<summary type="html">&lt;p&gt;Suzanne: /* [Pierce 2015] */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;[[Category:Task]]&lt;br /&gt;
&lt;br /&gt;
== Background: Why a Special Issue on Geoscience Papers of the Future? ==&lt;br /&gt;
&lt;br /&gt;
[[Discuss_what_we_will_consider_a_GPF#The_Vision | Include here our discussion for the vision]]&lt;br /&gt;
&lt;br /&gt;
Background should be 1-2 pages.&lt;br /&gt;
&lt;br /&gt;
Motivated by need to fully document and make research accessible and reproducible. &lt;br /&gt;
&lt;br /&gt;
=== Motivation: The EarthCube Initiative and the GeoSoft Project ===&lt;br /&gt;
&lt;br /&gt;
[http://www.geosoft-earthcube.org/about Include here background about GeoSoft from the web site]&lt;br /&gt;
&lt;br /&gt;
OSTP memo.  EarthCube reports.&lt;br /&gt;
Other reports that talk about the need for new approaches to editing.&lt;br /&gt;
&lt;br /&gt;
It's possible that small or very large contributions are not well captured in the current publishing paradigms.  Nanopublications.&lt;br /&gt;
&lt;br /&gt;
For example, nano-publications are a possible way to reflect advances in a research process that may not merit a full publication but are nonetheless useful to share with the community. A challenge here is the stigma against publishing units that are very small.&lt;br /&gt;
&lt;br /&gt;
Alternatively, a very large piece of research or work with many parts may be better suited to a GPF style publication.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Perhaps the concept of a 'paper' can be better reflected in the concept of a 'wrapper', or a collection of materials and resources. The purpose is to ensure that publications are representative of the work, effort, and results achieved in the research process.&lt;br /&gt;
&lt;br /&gt;
=== What is a GPF ===&lt;br /&gt;
&lt;br /&gt;
[[Discuss_what_we_will_consider_a_GPF#What_is_a_Geoscience_Paper_of_the_Future.3F | Include here our discussion of what is a GPF]]&lt;br /&gt;
&lt;br /&gt;
=== The challenges of creating GPFs ===&lt;br /&gt;
&lt;br /&gt;
The articles in this issue reflect the current best practice for generating a Geoscience Paper of the Future.&lt;br /&gt;
&lt;br /&gt;
'''Figure discussions''': Do we want to regenerate exactly the same figure automatically?  Figures in the paper may be clean versions of an image generated by software.  To the extent possible, authors have included clear delineations of provenance. The goal is to ensure that readers may regenerate the figures using documented workflows, data, and code.  An important note (Allen, Sandra) is that figures are frequently generated by code, scripts, etc., yet the actual figure is finalized by the user.  Mimi: is it really worth belaboring the point about how the prettified version of the figure is made? If it is, both visualization packages I've used (Matlab and SigmaPlot) have actual code in the background that specifies how to set up the prettification, and this code can be found, copied out, and rerun to generate the exact same figure with all of the prettification in the same place. SigmaPlot uses Visual Basic (I think) in its macros. If explicit code is the important point, this should be doable, but I'm not sure it's strictly necessary to specify exactly where all the prettifications are to get the gist across.&lt;br /&gt;
&lt;br /&gt;
How much of one's experimental history should be included? (Ibrahim)  The experimental process often ends up nowhere.  Should we document all the failed experiments?  Get one DOI for the results of the successful experiment?  Another for failed trials?&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
'''''Documenting: Timing and Intermediate Processes'''''&lt;br /&gt;
When should we document and what are the bounds on what we document?&lt;br /&gt;
For example, should we document and include data and workflows for 'failed' experiments? Or should we assign datasets DOIs before we know the results from using them?  &lt;br /&gt;
The group thinks good practices may include documenting and sharing data once there is a clear understanding of which outcomes are worth reporting. For example, successful experiments should have clear, clean data documented and shared, whereas one strategy for 'failed' experiments could be to bundle the intermediate datasets under one DOI with a more general discussion of the process and methods.&lt;br /&gt;
&lt;br /&gt;
=== Related work ===&lt;br /&gt;
&lt;br /&gt;
[[Discuss_what_we_will_consider_a_GPF#New_Frameworks_to_Create_a_New_Generation_of_Scientific_Articles | Include here the related work we have discussed]]&lt;br /&gt;
&lt;br /&gt;
== Papers to be included ==&lt;br /&gt;
&lt;br /&gt;
Would it be worthwhile to group the papers into broader categories rather than giving specifics about every single paper?&lt;br /&gt;
&lt;br /&gt;
For each submission, we describe:&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations'''&lt;br /&gt;
* '''Keywords of research area'''&lt;br /&gt;
* '''Tentative title'''&lt;br /&gt;
* '''Short abstract'''&lt;br /&gt;
* '''Challenge''' (including &amp;quot;Reproducibility,&amp;quot; &amp;quot;Dark Code,&amp;quot; &amp;quot;Sharing Big Data,&amp;quot; and &amp;quot;Transferability&amp;quot;)&lt;br /&gt;
* '''Relationship to other publications''' (is the article based on a previously published article? is it new content? If previously published, please provide a pointer to the published article and specify what percentage of the work presented will be new)&lt;br /&gt;
* '''Pointer to the wiki page that documents the article'''&lt;br /&gt;
* '''Expected submission date'''&lt;br /&gt;
&lt;br /&gt;
=== [David 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:''' [[Cedric David]]&lt;br /&gt;
* '''Keywords of research area:''' Hydrology, Rivers, Modeling, Testing, Reproducibility. &lt;br /&gt;
* '''Tentative title:''' Going beyond triple-checking, allowing for peace of mind in community model development.&lt;br /&gt;
* '''Short abstract:''' The development of computer models in the general field of geoscience is often made incrementally over many years.  Endeavors that generally start on a single researcher's own machine evolve over time into software that is often much larger than initially anticipated.  Looking back at years of building on their computer code, sometimes without much training in computer science, geoscience software developers can easily experience an overwhelming sense of incompetence when contemplating ways to further community usage of their software.  How does one allow others to use their code?  How can one foster survival of their tool?  How could one possibly ensure the scientific integrity of ongoing developments, including those made by others?  Common issues faced by geoscience developers include selecting a license, learning how to track and document past and ongoing changes, choosing a software repository, and allowing for community development.  This paper provides a brief summary of experience with the first three of these steps of software growth by focusing on the almost decade-long code development of a river routing model.  The core of this study, however, focuses on reproducing previously published experiments.  This step is highly repetitive and can therefore benefit greatly from automation.  Additionally, enabling automated software testing can arguably be considered the final step for sustainable software sharing, by allowing the main software developer to let go of a mental block concerning scientific integrity.  Creating tools to automatically compare the results of an updated version of a software with those of previous studies can not only save the main developer's own time, but also empower other researchers to check and justify that their potential additions have retained scientific integrity.&lt;br /&gt;
* '''Challenge:''' Reproducibility; Ensure that updates to an existing model are able to reproduce a series of simulations published previously.&lt;br /&gt;
* '''Relationship to other publications:''' This research is related to past and ongoing development of the Routing Application for Parallel computatIon of Discharge (RAPID).  The primary focus of this paper is to allow automated reproducibility of at least the [http://dx.doi.org/10.1175/2011JHM1345.1 first RAPID publication].  The scientific subject of this GPF differs from the article(s) to be reproduced as its focus is on development of automatic testing methods.  In that regard, the paper is expected to be 95% new. &lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Cedric_David | Page]]&lt;br /&gt;
* '''Expected submission date:'''&lt;br /&gt;
&lt;br /&gt;
=== [Demir 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:''' [[Ibrahim Demir]]&lt;br /&gt;
* '''Keywords of research area:''' hydrological network, optimization, network representation, database query&lt;br /&gt;
* '''Tentative title:''' Analysis and Optimization of Hydrological Network Database Representation Methods for Fast Access and Query in Web-based System&lt;br /&gt;
* '''Short abstract:''' Web-based systems allow users to delineate watersheds on interactive map environments using server-side processing. With the increasing resolution of hydrological networks, optimized methods for storing network representations in databases and efficient queries and actions on the river network structure become critical. This paper presents a detailed analysis of widely used methods for representing hydrological networks in relational databases, benchmarking common queries and modifications on the network structure using these methods. The analysis has been applied to the hydrological network of Iowa, utilizing a 90m DEM and 600,000 network nodes. The results indicate that the representation methods provide massive improvements in query times and in storage of the network structure in the database. The suggested method allows watershed delineation tools to run client-side with desktop-like performance.&lt;br /&gt;
* '''Challenge:''' Reproducibility, Transferability; Some of the internal steps to prepare data might require long computation time and different software environments.&lt;br /&gt;
* '''Relationship to other publications:''' The article is based on a new study&lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Ibrahim_Demir | Page]]&lt;br /&gt;
* '''Expected submission date:'''&lt;br /&gt;
&lt;br /&gt;
=== [Fulweiler 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:''' [[Wally Fulweiler]]&lt;br /&gt;
* '''Keywords of research area:'''&lt;br /&gt;
* '''Tentative title:'''&lt;br /&gt;
* '''Short abstract:'''&lt;br /&gt;
* '''Challenge:''' &lt;br /&gt;
* '''Relationship to other publications:''' (is the article based on a previously published article? is it new content?)&lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Wally_Fulweiler | Page]]&lt;br /&gt;
* '''Expected submission date:'''&lt;br /&gt;
&lt;br /&gt;
=== [Loh and Karlstrom 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:'''  [[Lay Kuan Loh]] and [[Leif Karlstrom]]&lt;br /&gt;
* '''Keywords of research area:''' Spatial clustering, Eigenvector selection, Entropy Ranking, Cascades Volcanic Region, [http://geosphere.gsapubs.org/content/3/3/152.abstract Afar Depression], [http://astrogeology.usgs.gov/search/details/Mars/Research/Volcanic/TharsisVents/zip Tharsis province]&lt;br /&gt;
* '''Tentative title:''' Characterization of volcanic vent distributions using spectral clustering with eigenvector selection and entropy ranking&lt;br /&gt;
* '''Short abstract:''' Volcanic vents on the surface of Earth and other planets often appear in groups that exhibit spatial patterning. Such vent distributions reflect complex interplay between time-evolving mechanical controls on the pathways of magma ascent, background tectonic stresses, and unsteady supply of rising magma. With the ultimate aim of connecting surface vent distributions with the dynamics of magma ascent, we have developed a clustering method to quantify spatial patterns in vents. Clustering is typically used in exploratory data analysis to identify groups with similar behavior by partitioning a dataset into clusters that share similar attributes. Traditional clustering algorithms that work well on simple point-cloud type synthetic datasets generally do not scale well to the real-world data we are interested in, where there are poor boundaries between clusters and much ambiguity in cluster assignments. We instead use a spectral clustering algorithm with eigenvector selection based on entropy ranking, adapted from [http://www.sciencedirect.com/science/article/pii/S0925231210001311 Zhao et al 2010], that outperforms traditional spectral clustering algorithms in choosing the right number of clusters for point data. We benchmark this algorithm on synthetic vent data with increasingly complex spatial distributions, to test its ability to accurately cluster vent data with variable spatial density, skewness, number of clusters, and proximity of clusters. We then apply our algorithm to several real-world datasets from the Cascades, Afar Depression, and Mars.&lt;br /&gt;
* '''Challenge:''' Reproducibility (i.e., quantifying clustering); We plan to study how varying the statistical distribution, density, skewness, background noise, number of clusters, proximity of clusters, and combinations of these factors affects the performance of our algorithm. We test it against synthetic and real-world datasets.&lt;br /&gt;
* '''Relationship to other publications:''' New content, but one of the databases we are studying in the paper (Cascades Volcanic Range) would be based off a different paper we are preparing and planning to submit earlier. &lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Leif_Karlstrom | Page]]&lt;br /&gt;
* '''Expected submission date:''' June 2015&lt;br /&gt;
&lt;br /&gt;
=== [Lee 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:''' [[Kyo Lee]], Maziyar Boustani and Chris Mattmann, Jet Propulsion Laboratory&lt;br /&gt;
* '''Keywords of research area:''' North American regional climate, regional climate model evaluation system, Open Climate Workbench&lt;br /&gt;
* '''Tentative title:''' Evaluation of simulated temperature, precipitation, cloud fraction and insolation over the conterminous United States using Regional Climate Model Evaluation System&lt;br /&gt;
* '''Short abstract:''' This study describes the detailed process of evaluating model fidelity in simulating four key climate variables, surface air temperature, precipitation, cloud fraction, and insolation, and their covariability over the conterminous United States. The Regional Climate Model Evaluation System (RCMES), a suite of public databases and open-source software packages, provides both observational datasets and data processors useful for evaluating any climate model. In this paper, we provide a clear and easy-to-follow RCMES workflow to replicate published papers evaluating North American Regional Climate Change Assessment Program (NARCCAP) regional climate model (RCM) hindcast simulations using observations from a variety of sources.&lt;br /&gt;
* '''Challenge:''' Big Data Sharing, Dark Code; Sharing big data, better documenting source code, encouraging the climate science community to use RCMES&lt;br /&gt;
* '''Relationship to other publications:''' [http://journals.ametsoc.org/doi/abs/10.1175/JCLI-D-12-00452.1 Kim et al. 2013], [http://link.springer.com/article/10.1007/s00382-014-2253-y Lee et al. 2014]&lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Kyo_Lee | Page]]&lt;br /&gt;
* '''Expected submission date:''' End of June 2015&lt;br /&gt;
&lt;br /&gt;
=== [Miller 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:''' [[Kim Miller]]&lt;br /&gt;
* '''Keywords of research area:'''&lt;br /&gt;
* '''Tentative title:'''&lt;br /&gt;
* '''Short abstract:'''&lt;br /&gt;
* '''Challenge:''' &lt;br /&gt;
* '''Relationship to other publications:''' (is the article based on a previously published article? is it new content?)&lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Kim_Miller | Page]]&lt;br /&gt;
* '''Expected submission date:'''&lt;br /&gt;
&lt;br /&gt;
=== [Mills 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:''' [[Heath Mills]], University of Houston Clear Lake; Brandi Kiel Reese, Texas A&amp;amp;M Corpus Christi&lt;br /&gt;
* '''Keywords of research area:'''&lt;br /&gt;
* '''Tentative title:'''Iron and Sulfur Cycling Biogeography Using Advanced Geochemical and Molecular Analyses&lt;br /&gt;
* '''Short abstract:''' My paper will develop and document a new pipeline to analyze a combined and robust genetic and geochemical data set. New, reproducible methods will be highlighted in this manuscript to help others better analyze similar data sets. There is a general lack of guidance within my field for such challenges. This manuscript will be unique and helpful from an analysis standpoint as well as for the science being presented.&lt;br /&gt;
* '''Challenge:''' Reproducibility; Dark Code&lt;br /&gt;
* '''Relationship to other publications:''' Original Manuscript&lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Heith_Mills | Page]]&lt;br /&gt;
* '''Expected submission date:'''&lt;br /&gt;
&lt;br /&gt;
=== [Oh 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:''' [[Ji-Hyun Oh]], Jet Propulsion Laboratory/University of Southern California&lt;br /&gt;
* '''Keywords of research area:''' Tropical Meteorology, Madden-Julian Oscillation, Momentum budget analysis &lt;br /&gt;
* '''Tentative title:''' Tools for computing momentum budget for the westerly wind event associated with the Madden-Julian Oscillation&lt;br /&gt;
* '''Short abstract:''' As one of the most pronounced modes of tropical intraseasonal variability, the Madden-Julian Oscillation (MJO) prominently connects global weather and climate, and serves as one of the critical predictability sources for extended-range forecasting. The zonal circulation of the MJO is characterized by low-level westerlies (easterlies) in and to the west (east) of the convective center, respectively. The direction of zonal winds in the upper troposphere is opposite to that in the lower troposphere. In addition to the convective signal as an identifier of MJO initiation, certain characteristics of the zonal circulation have been used as a standard metric for monitoring the state of the MJO and investigating features of the MJO and its impact on other atmospheric phenomena. This paper documents a tool for investigating the generation of low-level westerly winds during the MJO life cycle. The tool is used for momentum budget analysis to understand the respective contributions of the various processes involved in the wind evolution associated with the MJO, using European Centre for Medium-Range Weather Forecasts operational analyses during the Dynamics of the Madden–Julian Oscillation field campaign.&lt;br /&gt;
&lt;br /&gt;
* '''Challenge:''' Reproducibility, Dark Code; This paper will cover how to reproduce two key figures from the paper that I recently submitted to the Journal of Atmospheric Science. This will include detailed procedures for generating the figures, such as how/where to download the data, how to transform the data into the format used as input for my codes, and so on.&lt;br /&gt;
* '''Relationship to other publications:''' This article is related to part of the paper submitted to the Journal of Atmospheric Science.&lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Ji_Hyun | Page]]&lt;br /&gt;
* '''Expected submission date:'''&lt;br /&gt;
&lt;br /&gt;
=== [Pierce 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:'''  [[Suzanne Pierce]], John Gentle, and Daniel Noll (Texas Advanced Computing Center and Jackson School of Geosciences, The University of Texas at Austin; US Department of Energy)&lt;br /&gt;
&lt;br /&gt;
* '''Keywords of research area:''' Decision Support Systems, Hydrogeology, Participatory Modeling, Data Fusion &lt;br /&gt;
* '''Tentative title:''' MCSDSS: An accessible platform and application to enable data fusion and interactive visualization for the Geosciences&lt;br /&gt;
* '''Short abstract:'''&lt;br /&gt;
&lt;br /&gt;
* '''Challenge:''' Reproducibility, Dark Code; Fully document a new software application and framework using example case study data and tutorials; Creation of an interface that enables non-programmers to build out interactive visualizations for their data&lt;br /&gt;
* '''Relationship to other publications:''' This article is new content; the proof-of-concept idea was developed with DOE funding for a student competition and resulted in an initial implementation that was reported in the DOE competition report and a master's thesis by co-author Daniel Noll&lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Suzanne_Pierce | Page]]&lt;br /&gt;
* '''Expected submission date:''' mid- to late June 2015&lt;br /&gt;
&lt;br /&gt;
=== [Pope 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:''' [[Allen Pope]], National Snow and Ice Data Center, University of Colorado, Boulder&lt;br /&gt;
* '''Keywords of research area:''' Glaciology, Remote Sensing, Landsat 8, Polar Science&lt;br /&gt;
* '''Tentative title:''' Data and Code for Estimating and Evaluating Supraglacial Lake Depth With Landsat 8 and other Multispectral Sensors&lt;br /&gt;
* '''Short abstract:''' Supraglacial lakes play a significant role in glacial hydrological systems – for example, transporting water to the glacier bed in Greenland or leading to ice shelf fracture and disintegration in Antarctica. To investigate these important processes, multispectral remote sensing provides multiple methods for estimating supraglacial lake depth – either through single-band or band-ratio methods, both empirical and physically-based. Landsat 8 is the newest satellite in the Landsat series. With new bands, higher dynamic range, and higher radiometric resolution, the Operational Land Imager (OLI) aboard Landsat 8 has a lot of potential. &lt;br /&gt;
&lt;br /&gt;
This paper will document the data and code used in processing in situ reflectance spectra and depth measurements to investigate the ability of Landsat 8 to estimate lake depths using multiple methods, as well as quantify improvements over Landsat 7’s ETM+. A workflow, data, and code are provided to detail promising methods as applied to Landsat 8 OLI imagery of case study areas in Greenland, allowing calculation of regional volume estimates using 2013 and 2014 summer-season imagery. Altimetry from WorldView DEMs is used to validate lake depth estimates. The optimal method for supraglacial lake depth estimation with Landsat 8 is shown to be an average of the single-band depths from the red and panchromatic bands. With this best method, a preliminary investigation of the seasonal behavior and elevation distribution of lakes is also discussed and documented.&lt;br /&gt;
* '''Challenge:''' Reproducibility, Dark Code&lt;br /&gt;
* '''Relationship to other publications:''' Documenting and explaining the data and code behind the analysis and results presented in another paper.&lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Allen_Pope | Page]]&lt;br /&gt;
* '''Expected submission date:''' Late June 2015&lt;br /&gt;
&lt;br /&gt;
=== [Read and Winslow 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:''' [[Jordan Read]] and [[Luke Winslow]]&lt;br /&gt;
* '''Keywords of research area:'''&lt;br /&gt;
* '''Tentative title:'''&lt;br /&gt;
* '''Short abstract:'''&lt;br /&gt;
* '''Challenge:''' &lt;br /&gt;
* '''Relationship to other publications:''' (is the article based on a previously published article? is it new content?)&lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Jordan_Read | Page]]&lt;br /&gt;
* '''Expected submission date:'''&lt;br /&gt;
&lt;br /&gt;
=== [Tzeng 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:''' [[Mimi Tzeng]], Brian Dzwonkowski (DISL); Kyeong Park (TAMU Galveston)&lt;br /&gt;
* '''Keywords of research area:'''physical oceanography, remote sensing&lt;br /&gt;
* '''Tentative title:''' Fisheries Oceanography of Coastal Alabama (FOCAL): A Subset of a Time-Series of Hydrographic and Current Data from a Permanent Moored Station Outside Mobile Bay (27 Jan to 18 May 2011)&lt;br /&gt;
* '''Short abstract:''' The Fisheries Oceanography in Coastal Alabama (FOCAL) program began in 2006 as a way for scientists at Dauphin Island Sea Lab (DISL) to study the natural variability of Alabama's nearshore environment as it relates to fisheries production. FOCAL provided a long-term baseline data set that included time-series hydrographic data from a permanent offshore mooring (ADCP, vertical thermistor array, and CTDs at surface and bottom) and shipboard surveys (vertical CTD profiles and water sampling), as well as monthly ichthyoplankton and zooplankton (depth-discrete) sample collections at FOCAL sites. The subset of data presented here is from the mooring and includes a vertical array of thermistors, CTDs at surface and bottom, an ADCP at the bottom, and vertical CTD profiles collected at the mooring during maintenance surveys. The mooring is located at 30 05.410'N 88 12.694'W, 25 km southwest of the entrance to Mobile Bay. Temperature, salinity, density, depth, and current velocity data were collected at 20-minute intervals from 2006 to 2012. Other parameters, such as dissolved oxygen, are available for portions of the time series depending on which instruments were deployed at the time.&lt;br /&gt;
* '''Challenge:''' Dark Code, Reproducibility; My paper will be about the processing of data in a larger dataset from which peer-reviewed papers have been written. The processing I did was not specific to any particular paper. I can point to an example paper that used some of the data I processed from this dataset; however, all of the figures in that paper are composites that also include other data from elsewhere that I had nothing to do with (and it would not be feasible to obtain the other data within our timeframe).&lt;br /&gt;
* '''Relationship to other publications:''' A recent paper that used the part of the FOCAL data I'm documenting as the sample from the larger dataset: Dzwonkowski, Brian, Kyeong Park, Jungwoo Lee, Bret M. Webb, and Arnoldo Valle-Levinson. 2014. &amp;quot;Spatial variability of flow over a river-influenced inner shelf in coastal Alabama during spring.&amp;quot; Continental Shelf Research 74:25-34.&lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Mimi_Tzeng | Page]]&lt;br /&gt;
* '''Expected submission date:'''&lt;br /&gt;
&lt;br /&gt;
=== [Villamizar 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:''' [[Sandra Villamizar]], University of California, Merced&lt;br /&gt;
* '''Keywords of research area:''' river ecohydrology&lt;br /&gt;
* '''Tentative title:''' Producing long-term series of whole-stream metabolism using readily available data.  &lt;br /&gt;
* '''Short abstract:''' Continuous water quality and river discharge data that are readily available through government websites may be used to produce valuable information about key processes within a river ecosystem. In this paper I describe in detail the steps for acquisition and processing of river flow, dissolved oxygen, temperature, and specific conductance data that, combined with atmospheric data and physical properties of the river reach of interest, allow for the production of a long-term series of whole-stream metabolism. This information is key to understanding the structure and function of an ecosystem such as the San Joaquin River in the Central Valley of California, which has been increasingly degraded over the last 60 years by intensive human intervention but, since 2010, has been undergoing a restoration effort. The key advantage of this tool is that it uses readily available information to produce knowledge about a river ecosystem. The set of scripts, written in R, can be used immediately for any other river for which the key parameters (river flow, dissolved oxygen, temperature, and specific conductivity) are available. The scripts can also be modified by users to fit their particular site conditions.&lt;br /&gt;
 &lt;br /&gt;
* '''Challenge:''' Reproducibility; Dark Code; Document new software/applications. This set of scripts was written in response to the need to generate daily estimates of metabolic rates over long periods of time and at various sites within the San Joaquin River.&lt;br /&gt;
* '''Relationship to other publications:''' This will be a new publication&lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Sandra_Villamizar | Page]]&lt;br /&gt;
* '''Expected submission date:''' To be defined&lt;br /&gt;
&lt;br /&gt;
=== [Yu and Bhatt 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:''' [[Xuan Yu]], Department of Geological Sciences, University of Delaware. Gopal Bhatt, Department of Civil &amp;amp; Environmental Engineering, Pennsylvania State University. &lt;br /&gt;
* '''Keywords of research area:''' coupled processes, integrated hydrologic modeling, PIHM, surface flow, subsurface flow, open science&lt;br /&gt;
* '''Tentative title:''' Learning integrated modeling of surface and subsurface flow from scratch&lt;br /&gt;
* '''Short abstract:''' Integrated modeling of surface and subsurface flow has been of great interest for understanding not only the intimate interconnectedness of hydrological processes, but also land-surface energy balance, biogeochemical and ecological processes, and landscape evolution. Although a growing number of complex hydrologic models have been used for resolving environmental processes, hypothesis testing, and hydrologic prediction for effective watershed management, very few model implementation resources have been made accessible to the large community of model users. Users have to invest a significant amount of time and effort to reproduce and understand the workflow of a hydrologic simulation in a modeling paper. To provide a challenging and stimulating introduction to integrated modeling of surface and subsurface flow, in this paper we revisit the development of the Penn State Integrated Hydrologic Model (PIHM) by reproducing a numerical benchmarking example and a real-world catchment-scale application. Specifically, we document PIHM and its modeling workflow to enable a basic understanding of simulating coupled surface and subsurface flow processes. We provide the model and data to highlight the reciprocal roles between the two. In addition, we incorporate user experience as a third dimension of the modeling workflow to enable deeper communication between model developers and users. The workflow has important implications for smoothing and accelerating open scientific collaboration in geosciences research.&lt;br /&gt;
* '''Challenge:''' Reproducibility; Reproduce previously published simulations with the latest version of an existing model. Benchmark the modeling application against a numerical experiment and field data.&lt;br /&gt;
* '''Relationship to other publications:''' The article is based on a previously published article. &lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Xuan_Yu | Page]]&lt;br /&gt;
* '''Expected submission date:''' End of June 2015&lt;br /&gt;
&lt;br /&gt;
== Special Issue Editors ==&lt;br /&gt;
&lt;br /&gt;
* Co-editor: Chris Duffy and/or Scott Peckham&lt;br /&gt;
* Co-editor: Cedric David&lt;br /&gt;
* Co-editor: possibly Karan Venayagamoorthy&lt;br /&gt;
&lt;br /&gt;
The editors will only accept submissions that follow the [[Develop_proposal_for_special_issue#Special_Issue_Review_Criteria | special issue review criteria]].&lt;br /&gt;
&lt;br /&gt;
The editors will select a set of reviewers to handle the submissions.  Reviewers will include computer scientists, library scientists, and geoscientists.&lt;br /&gt;
&lt;br /&gt;
== Special Issue Review Criteria ==&lt;br /&gt;
&lt;br /&gt;
The reviewers will be asked to provide feedback on the papers according to the following criteria. Note that some papers will have good reasons for limiting the information (e.g., the data are from third parties and not openly available); in that case, the authors would document those reasons.&lt;br /&gt;
&lt;br /&gt;
* Documentation of the datasets: descriptions of datasets, unique identifiers, repositories.&lt;br /&gt;
* Documentation of software: description of all software used (including pre-processing of data, visualization steps, etc), unique identifiers, repositories.&lt;br /&gt;
* Documentation of the provenance of results: provenance for each figure or result, such as the workflow or the provenance record.&lt;br /&gt;
&lt;br /&gt;
== Tentative Timeline ==&lt;br /&gt;
&lt;br /&gt;
* Journal committed to special issue: April 15, 2015&lt;br /&gt;
* Submissions due to editors: June 30, 2015&lt;br /&gt;
* Reviews due: Sept 15, 2015&lt;br /&gt;
* Decisions out to authors: Sept 30, 2015&lt;br /&gt;
* Revisions due: October 31, 2015&lt;br /&gt;
* Final versions due November 15, 2015&lt;br /&gt;
* Issue published December 31, 2015&lt;br /&gt;
&lt;br /&gt;
&amp;lt;!-- Add any wiki Text above this Line --&amp;gt;&lt;br /&gt;
&amp;lt;!-- Do NOT Edit below this Line --&amp;gt;&lt;br /&gt;
{{#set:&lt;br /&gt;
	Owner=Chris_Duffy|&lt;br /&gt;
	Participants=Yolanda_Gil|&lt;br /&gt;
	Participants=Scott_Peckham|&lt;br /&gt;
	Participants=Cedric_David|&lt;br /&gt;
	Participants=Ibrahim_Demir|&lt;br /&gt;
	Participants=Wally_Fulweiler|&lt;br /&gt;
	Participants=Leif_Karlstrom|&lt;br /&gt;
	Participants=Kyo_Lee|&lt;br /&gt;
	Participants=Kim_Miller|&lt;br /&gt;
	Participants=Heath_Mills|&lt;br /&gt;
	Participants=Ji-Hyun_Oh|&lt;br /&gt;
	Participants=Suzanne_Pierce|&lt;br /&gt;
	Participants=Allen_Pope|&lt;br /&gt;
	Participants=Jordan_Read|&lt;br /&gt;
	Participants=Mimi_Tzeng|&lt;br /&gt;
	Participants=Sandra_Villamizar|&lt;br /&gt;
	Participants=Xuan_Yu|&lt;br /&gt;
	Progress=20|&lt;br /&gt;
	StartDate=2015-03-10|&lt;br /&gt;
	TargetDate=2015-03-16|&lt;br /&gt;
	Type=Low}}&lt;/div&gt;</summary>
		<author><name>Suzanne</name></author>	</entry>

	<entry>
		<id>https://www.organicdatascience.org/gpf/index.php?title=Develop_proposal_for_special_issue&amp;diff=11709</id>
		<title>Develop proposal for special issue</title>
		<link rel="alternate" type="text/html" href="https://www.organicdatascience.org/gpf/index.php?title=Develop_proposal_for_special_issue&amp;diff=11709"/>
				<updated>2015-04-03T18:00:04Z</updated>
		
		<summary type="html">&lt;p&gt;Suzanne: /* [Pierce 2015] */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;[[Category:Task]]&lt;br /&gt;
&lt;br /&gt;
== Background: Why a Special Issue on Geoscience Papers of the Future? ==&lt;br /&gt;
&lt;br /&gt;
[[Discuss_what_we_will_consider_a_GPF#The_Vision | Include here our discussion for the vision]]&lt;br /&gt;
&lt;br /&gt;
Background should be 1-2 pages.&lt;br /&gt;
&lt;br /&gt;
Motivated by the need to fully document research and make it accessible and reproducible. &lt;br /&gt;
&lt;br /&gt;
=== Motivation: The EarthCube Initiative and the GeoSoft Project ===&lt;br /&gt;
&lt;br /&gt;
[http://www.geosoft-earthcube.org/about Include here background about GeoSoft from the web site]&lt;br /&gt;
&lt;br /&gt;
OSTP memo.  EarthCube reports.&lt;br /&gt;
Other reports that talk about the need for new approaches to editing.&lt;br /&gt;
&lt;br /&gt;
It's possible that small or very large contributions are not well captured in the current publishing paradigms.  Nanopublications.&lt;br /&gt;
&lt;br /&gt;
For example, nano-publications are a possible way to reflect advances in a research process that may not merit a full publication but are nonetheless useful to share with the community. A challenge here is that there is a stigma attached to publishing units that are too small.&lt;br /&gt;
&lt;br /&gt;
Alternatively, a very large piece of research or work with many parts may be better suited to a GPF style publication.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Perhaps the concept of a 'paper' can be better reflected in the concept of a 'wrapper', or a collection of materials and resources. The purpose is to ensure that publications are representative of the work, effort, and results achieved in the research process.&lt;br /&gt;
&lt;br /&gt;
=== What is a GPF ===&lt;br /&gt;
&lt;br /&gt;
[[Discuss_what_we_will_consider_a_GPF#What_is_a_Geoscience_Paper_of_the_Future.3F | Include here our discussion of what is a GPF]]&lt;br /&gt;
&lt;br /&gt;
=== The challenges of creating GPFs ===&lt;br /&gt;
&lt;br /&gt;
The articles in this issue reflect the current best practice for generating a Geoscience Paper of the Future.&lt;br /&gt;
&lt;br /&gt;
'''Figure discussions''': Do we want to regenerate exactly the same figure automatically?  Figures in the paper may be cleaned-up versions of an image generated by software.  To the extent possible, authors have included clear delineations of provenance. The goal is to ensure that readers can regenerate the figures using documented workflows, data, and code.  An important note (Allen, Sandra) is that figures are frequently generated by code, scripts, etc., yet the actual figure is finalized by hand.  Mimi is trying to say: is it really worth belaboring the point about how the prettified version of the figure is made? If it is: both of the visualization packages I've used (Matlab and SigmaPlot) have actual code in the background that specifies how to set up the prettification, and this code can be found, copied out, and rerun to generate the exact same figure with all of the prettification in the same place. SigmaPlot uses Visual Basic (I think) in its macros. If explicit code is an important point, this should be doable. But I'm not sure it's strictly necessary to specify exactly where all the prettifications are to get the gist across.&lt;br /&gt;
&lt;br /&gt;
How much of one's experimental history should be included? (Ibrahim)  The experimental process often leads nowhere.  Should we document all the failed experiments?  Get one DOI for the results of the successful experiment, and another for failed trials?&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
'''''Documenting: Timing and Intermediate Processes'''''&lt;br /&gt;
When should we document and what are the bounds on what we document?&lt;br /&gt;
For example, should we document and include data and workflows for 'failed' experiments? Or should we assign DOIs to datasets before we know the results of using them?  &lt;br /&gt;
The group thinks that good practices may include documenting and sharing data once you have a clear understanding of which outcomes are worth reporting. For example, successful experiments should have clear, clean data documented and shared, whereas one strategy for 'failed' experiments could be to bundle the intermediate datasets under one DOI with a more general discussion of the process/methods.&lt;br /&gt;
&lt;br /&gt;
=== Related work ===&lt;br /&gt;
&lt;br /&gt;
[[Discuss_what_we_will_consider_a_GPF#New_Frameworks_to_Create_a_New_Generation_of_Scientific_Articles | Include here the related work we have discussed]]&lt;br /&gt;
&lt;br /&gt;
== Papers to be included ==&lt;br /&gt;
&lt;br /&gt;
Would it be worthwhile to group the papers into broader categories rather than giving specifics about every single paper?&lt;br /&gt;
&lt;br /&gt;
For each submission, we describe:&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations'''&lt;br /&gt;
* '''Keywords of research area'''&lt;br /&gt;
* '''Tentative title'''&lt;br /&gt;
* '''Short abstract'''&lt;br /&gt;
* '''Challenge''' (including &amp;quot;Reproducibility,&amp;quot; &amp;quot;Dark Code,&amp;quot; &amp;quot;Sharing Big Data,&amp;quot; and &amp;quot;Transferability&amp;quot;)&lt;br /&gt;
* '''Relationship to other publications''' (is the article based on a previously published article? is it new content? IF PREVIOUSLY PUBLISHED, PLS PROVIDE A POINTER TO THE PUBLISHED ARTICLE AND SPECIFY WHAT PERCENTAGE OF THE WORK PRESENTED WILL BE NEW)&lt;br /&gt;
* '''Pointer to the wiki page that documents the article'''&lt;br /&gt;
* '''Expected submission date'''&lt;br /&gt;
&lt;br /&gt;
=== [David 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:''' [[Cedric David]]&lt;br /&gt;
* '''Keywords of research area:''' Hydrology, Rivers, Modeling, Testing, Reproducibility. &lt;br /&gt;
* '''Tentative title:''' Going beyond triple-checking, allowing for peace of mind in community model development.&lt;br /&gt;
* '''Short abstract:''' The development of computer models in the general field of geoscience is often made incrementally over many years.  Endeavors that generally start on a single researcher's own machine evolve over time into software that is often much larger than initially anticipated.  Looking back at years of building on their computer code, sometimes without much training in computer science, geoscience software developers can easily experience an overwhelming sense of incompetence when contemplating ways to further community usage of their software.  How does one allow others to use their code?  How can one foster the survival of their tool?  How could one possibly ensure the scientific integrity of ongoing developments, including those made by others?  Common issues faced by geoscience developers include selecting a license, learning how to track and document past and ongoing changes, choosing a software repository, and allowing for community development.  This paper provides a brief summary of experience with the first three of these steps of software growth by focusing on the almost decade-long code development of a river routing model.  The core of this study, however, focuses on reproducing previously published experiments.  This step is highly repetitive and can therefore benefit greatly from automation.  Additionally, enabling automated software testing can arguably be considered the final step for sustainable software sharing, by allowing the main software developer to let go of a mental block concerning scientific integrity.  Creating tools to automatically compare the results of an updated version of a software with those of previous studies can not only save the main developer's own time, but also empower other researchers in their ability to check and verify that their potential additions have retained scientific integrity.&lt;br /&gt;
* '''Challenge:''' Reproducibility; Ensure that updates to an existing model are able to reproduce a series of simulations published previously.&lt;br /&gt;
* '''Relationship to other publications:''' This research is related to past and ongoing development of the Routing Application for Parallel computatIon of Discharge (RAPID).  The primary focus of this paper is to allow automated reproducibility of at least the [http://dx.doi.org/10.1175/2011JHM1345.1 first RAPID publication].  The scientific subject of this GPF differs from the article(s) to be reproduced as its focus is on development of automatic testing methods.  In that regard, the paper is expected to be 95% new. &lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Cedric_David | Page]]&lt;br /&gt;
* '''Expected submission date:'''&lt;br /&gt;
&lt;br /&gt;
=== [Demir 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:''' [[Ibrahim Demir]]&lt;br /&gt;
* '''Keywords of research area:''' hydrological network, optimization, network representation, database query&lt;br /&gt;
* '''Tentative title:''' Analysis and Optimization of Hydrological Network Database Representation Methods for Fast Access and Query in Web-based System&lt;br /&gt;
* '''Short abstract:''' Web-based systems allow users to delineate watersheds in interactive map environments using server-side processing. With the increasing resolution of hydrological networks, optimized methods for storing network representations in databases, and efficient queries and actions on the river network structure, become critical. This paper presents a detailed analysis of widely used methods for representing hydrological networks in relational databases, benchmarking common queries and modifications on the network structure using these methods. The analysis has been applied to the hydrological network of Iowa, utilizing a 90m DEM and 600,000 network nodes. The results indicate that the representation methods provide massive improvements in query times and in storage of the network structure in the database. The suggested method allows watershed delineation tools to run on the client side with desktop-like performance. &lt;br /&gt;
* '''Challenge:''' Reproducibility, Transferability; Some of the internal steps to prepare data might require long computation time and different software environments.&lt;br /&gt;
* '''Relationship to other publications:''' The article is based on a new study.&lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Ibrahim_Demir | Page]]&lt;br /&gt;
* '''Expected submission date:'''&lt;br /&gt;
&lt;br /&gt;
=== [Fulweiler 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:''' [[Wally Fulweiler]]&lt;br /&gt;
* '''Keywords of research area:'''&lt;br /&gt;
* '''Tentative title:'''&lt;br /&gt;
* '''Short abstract:'''&lt;br /&gt;
* '''Challenge:''' &lt;br /&gt;
* '''Relationship to other publications:''' (is the article based on a previously published article? is it new content?)&lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Wally_Fulweiler | Page]]&lt;br /&gt;
* '''Expected submission date:'''&lt;br /&gt;
&lt;br /&gt;
=== [Loh and Karlstrom 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:'''  [[Lay Kuan Loh]] and [[Leif Karlstrom]]&lt;br /&gt;
* '''Keywords of research area:''' Spatial clustering, Eigenvector selection, Entropy Ranking, Cascades Volcanic Region, [http://geosphere.gsapubs.org/content/3/3/152.abstract Afar Depression], [http://astrogeology.usgs.gov/search/details/Mars/Research/Volcanic/TharsisVents/zip Tharsis province]&lt;br /&gt;
* '''Tentative title:''' Characterization of volcanic vent distributions using spectral clustering with eigenvector selection and entropy ranking&lt;br /&gt;
* '''Short abstract:''' Volcanic vents on the surface of Earth and other planets often appear in groups that exhibit spatial patterning. Such vent distributions reflect complex interplay between time-evolving mechanical controls on the pathways of magma ascent, background tectonic stresses, and unsteady supply of rising magma. With the ultimate aim of connecting surface vent distributions with the dynamics of magma ascent, we have developed a clustering method to quantify spatial patterns in vents. Clustering is typically used in exploratory data analysis to identify groups with similar behavior by partitioning a dataset into clusters that share similar attributes. Traditional clustering algorithms that work well on simple point-cloud-type synthetic datasets generally do not scale well to the real-world data we are interested in, where there are poor boundaries between clusters and much ambiguity in cluster assignments. We instead use a spectral clustering algorithm with eigenvector selection based on entropy ranking, building on [http://www.sciencedirect.com/science/article/pii/S0925231210001311 Zhao et al 2010], that outperforms traditional spectral clustering algorithms in choosing the right number of clusters for point data. We benchmark this algorithm on synthetic vent data with increasingly complex spatial distributions to test its ability to accurately cluster vent data with variable spatial density, skewness, number of clusters, and proximity of clusters. We then apply our algorithm to several real-world datasets from the Cascades, the Afar Depression, and Mars. &lt;br /&gt;
* '''Challenge:''' Reproducibility (i.e., quantifying clustering); We plan to study how varying the statistical distribution, density, skewness, background noise, number of clusters, proximity of clusters, and combinations of these factors affects the performance of our algorithm. We test it against synthetic and real-world datasets. &lt;br /&gt;
* '''Relationship to other publications:''' New content, but one of the datasets we are studying in the paper (Cascades Volcanic Range) is based on a different paper we are preparing and planning to submit earlier. &lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Leif_Karlstrom | Page]]&lt;br /&gt;
* '''Expected submission date:''' June 2015&lt;br /&gt;
&lt;br /&gt;
=== [Lee 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:''' [[Kyo Lee]], Maziyar Boustani and Chris Mattmann, Jet Propulsion Laboratory&lt;br /&gt;
* '''Keywords of research area:''' North American regional climate, regional climate model evaluation system, Open Climate Workbench&lt;br /&gt;
* '''Tentative title:''' Evaluation of simulated temperature, precipitation, cloud fraction and insolation over the conterminous United States using Regional Climate Model Evaluation System&lt;br /&gt;
* '''Short abstract:''' This study describes the detailed process of evaluating model fidelity in simulating four key climate variables, surface air temperature, precipitation, cloud fraction, and insolation, and their covariability over the conterminous United States. The Regional Climate Model Evaluation System (RCMES), a suite of public databases and open-source software packages, provides both observational datasets and data processors useful for evaluating climate models. In this paper, we provide a clear and easy-to-follow RCMES workflow to replicate published papers evaluating North American Regional Climate Change Assessment Program (NARCCAP) regional climate model (RCM) hindcast simulations using observations from a variety of sources. &lt;br /&gt;
* '''Challenge:''' Big Data Sharing, Dark Code; Sharing big data, better documenting source code, encouraging the climate science community to use RCMES&lt;br /&gt;
* '''Relationship to other publications:''' [http://journals.ametsoc.org/doi/abs/10.1175/JCLI-D-12-00452.1 Kim et al. 2013], [http://link.springer.com/article/10.1007/s00382-014-2253-y Lee et al. 2014]&lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Kyo_Lee | Page]]&lt;br /&gt;
* '''Expected submission date:''' End of June 2015&lt;br /&gt;
&lt;br /&gt;
=== [Miller 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:''' [[Kim Miller]]&lt;br /&gt;
* '''Keywords of research area:'''&lt;br /&gt;
* '''Tentative title:'''&lt;br /&gt;
* '''Short abstract:'''&lt;br /&gt;
* '''Challenge:''' &lt;br /&gt;
* '''Relationship to other publications:''' (is the article based on a previously published article? is it new content?)&lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Kim_Miller | Page]]&lt;br /&gt;
* '''Expected submission date:'''&lt;br /&gt;
&lt;br /&gt;
=== [Mills 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:''' [[Heath Mills]], University of Houston Clear Lake; Brandi Kiel Reese, Texas A&amp;amp;M Corpus Christi&lt;br /&gt;
* '''Keywords of research area:'''&lt;br /&gt;
* '''Tentative title:''' Iron and Sulfur Cycling Biogeography Using Advanced Geochemical and Molecular Analyses&lt;br /&gt;
* '''Short abstract:''' My paper will develop and document a new pipeline for analyzing a combined, robust genetic and geochemical data set. New, reproducible methods will be highlighted in this manuscript to help others better analyze similar data sets. There is a general lack of guidance within my field for such challenges. This manuscript will be unique and helpful from an analysis standpoint as well as for the science being presented.&lt;br /&gt;
* '''Challenge:''' Reproducibility; Dark Code&lt;br /&gt;
* '''Relationship to other publications:''' Original Manuscript&lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Heith_Mills | Page]]&lt;br /&gt;
* '''Expected submission date:'''&lt;br /&gt;
&lt;br /&gt;
=== [Oh 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:''' [[Ji-Hyun Oh]] Jet Propulsion Laboratory/University of Southern California&lt;br /&gt;
* '''Keywords of research area:''' Tropical Meteorology, Madden-Julian Oscillation, Momentum budget analysis &lt;br /&gt;
* '''Tentative title:''' Tools for computing momentum budget for the westerly wind event associated with the Madden-Julian Oscillation&lt;br /&gt;
* '''Short abstract:''' As one of the most pronounced modes of tropical intraseasonal variability, the Madden-Julian Oscillation (MJO) prominently connects global weather and climate, and serves as one of the critical sources of predictability for extended-range forecasting. The zonal circulation of the MJO is characterized by low-level westerlies (easterlies) in and to the west (east) of the convective center, respectively. The direction of zonal winds in the upper troposphere is opposite to that in the lower troposphere. In addition to the convective signal as an identifier of MJO initiation, certain characteristics of the zonal circulation have been used as a standard metric for monitoring the state of the MJO and for investigating features of the MJO and its impact on other atmospheric phenomena. This paper documents a tool for investigating the generation of low-level westerly winds during the MJO life cycle. The tool is used for momentum budget analysis to understand the respective contributions of the various processes involved in the wind evolution associated with the MJO, using European Centre for Medium-Range Weather Forecasts operational analyses during the Dynamics of the Madden–Julian Oscillation field campaign.&lt;br /&gt;
&lt;br /&gt;
* '''Challenge:''' Reproducibility, Dark Code; This paper will cover how to reproduce two key figures from the paper that I recently submitted to the Journal of the Atmospheric Sciences. This will include detailed procedures for generating the figures, such as how and where to download the data, how to transform the data into the input format for my codes, and so on. &lt;br /&gt;
* '''Relationship to other publications:''' This article is related to part of a paper submitted to the Journal of the Atmospheric Sciences. &lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Ji_Hyun | Page]]&lt;br /&gt;
* '''Expected submission date:'''&lt;br /&gt;
&lt;br /&gt;
=== [Pierce 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:'''  [[Suzanne Pierce]] and John Gentle (Texas Advanced Computing Center and Jackson School of Geosciences, The University of Texas at Austin)&lt;br /&gt;
&lt;br /&gt;
* '''Keywords of research area:''' Decision Support Systems, Hydrogeology, Participatory Modeling, Data Fusion &lt;br /&gt;
* '''Tentative title:''' MCSDSS: An accessible platform and application to enable data fusion and interactive visualization for the Geosciences&lt;br /&gt;
* '''Short abstract:'''&lt;br /&gt;
&lt;br /&gt;
* '''Challenge:''' Reproducibility, Dark Code; Fully document a new software application and framework using example case study data and tutorials.&lt;br /&gt;
* '''Relationship to other publications:''' This article is new content&lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Suzanne_Pierce | Page]]&lt;br /&gt;
* '''Expected submission date:''' mid- to late June 2015&lt;br /&gt;
&lt;br /&gt;
=== [Pope 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:''' [[Allen Pope]], National Snow and Ice Data Center, University of Colorado, Boulder&lt;br /&gt;
* '''Keywords of research area:''' Glaciology, Remote Sensing, Landsat 8, Polar Science&lt;br /&gt;
* '''Tentative title:''' Data and Code for Estimating and Evaluating Supraglacial Lake Depth With Landsat 8 and other Multispectral Sensors&lt;br /&gt;
* '''Short abstract:''' Supraglacial lakes play a significant role in glacial hydrological systems – for example, transporting water to the glacier bed in Greenland or leading to ice shelf fracture and disintegration in Antarctica. To investigate these important processes, multispectral remote sensing provides multiple methods for estimating supraglacial lake depth – either through single-band or band-ratio methods, both empirical and physically-based. Landsat 8 is the newest satellite in the Landsat series. With new bands, higher dynamic range, and higher radiometric resolution, the Operational Land Imager (OLI) aboard Landsat 8 has a lot of potential. &lt;br /&gt;
&lt;br /&gt;
This paper will document the data and code used in processing in situ reflectance spectra and depth measurements to investigate the ability of Landsat 8 to estimate lake depths using multiple methods, as well as to quantify improvements over Landsat 7’s ETM+. A workflow, data, and code are provided to detail promising methods as applied to Landsat 8 OLI imagery of case study areas in Greenland, allowing calculation of regional volume estimates using 2013 and 2014 summer-season imagery. Altimetry from WorldView DEMs is used to validate lake depth estimates. The optimal method for supraglacial lake depth estimation with Landsat 8 is shown to be an average of single-band depths from the red and panchromatic bands. With this best method, a preliminary investigation of the seasonal behavior and elevation distribution of lakes is also discussed and documented.&lt;br /&gt;
* '''Challenge:''' Reproducibility, Dark Code&lt;br /&gt;
* '''Relationship to other publications:''' Documenting and explaining the data and code behind the analysis and results presented in another paper.&lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Allen_Pope | Page]]&lt;br /&gt;
* '''Expected submission date:''' Late June 2015&lt;br /&gt;
&lt;br /&gt;
=== [Read and Winslow 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:''' [[Jordan Read]] and [[Luke Winslow]]&lt;br /&gt;
* '''Keywords of research area:'''&lt;br /&gt;
* '''Tentative title:'''&lt;br /&gt;
* '''Short abstract:'''&lt;br /&gt;
* '''Challenge:''' &lt;br /&gt;
* '''Relationship to other publications:''' (is the article based on a previously published article? is it new content?)&lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Jordan_Read | Page]]&lt;br /&gt;
* '''Expected submission date:'''&lt;br /&gt;
&lt;br /&gt;
=== [Tzeng 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:''' [[Mimi Tzeng]], Brian Dzwonkowski (DISL); Kyeong Park (TAMU Galveston)&lt;br /&gt;
* '''Keywords of research area:''' physical oceanography, remote sensing&lt;br /&gt;
* '''Tentative title:''' Fisheries Oceanography of Coastal Alabama (FOCAL): A Subset of a Time-Series of Hydrographic and Current Data from a Permanent Moored Station Outside Mobile Bay (27 Jan to 18 May 2011)&lt;br /&gt;
* '''Short abstract:''' The Fisheries Oceanography in Coastal Alabama (FOCAL) program began in 2006 as a way for scientists at the Dauphin Island Sea Lab (DISL) to study the natural variability of Alabama's nearshore environment as it relates to fisheries production. FOCAL provided a long-term baseline data set that included time-series hydrographic data from a permanent offshore mooring (ADCP, vertical thermistor array, and CTDs at surface and bottom) and shipboard surveys (vertical CTD profiles and water sampling), as well as monthly depth-discrete ichthyoplankton and zooplankton sample collections at FOCAL sites. The subset of data presented here is from the mooring and includes a vertical array of thermistors, CTDs at surface and bottom, an ADCP at the bottom, and vertical CTD profiles collected at the mooring during maintenance surveys. The mooring is located at 30 05.410'N 88 12.694'W, 25 km southwest of the entrance to Mobile Bay. Temperature, salinity, density, depth, and current velocity data were collected at 20-minute intervals from 2006 to 2012. Other parameters, such as dissolved oxygen, are available for portions of the time series, depending on which instruments were deployed at the time.&lt;br /&gt;
* '''Challenge:''' Dark Code, Reproducibility; My paper will be about the processing of data in a larger dataset, from which peer-reviewed papers have been written. The processing I did was not specific to any particular paper. I can point to an example paper that used some of the data I processed from this dataset; however, all of the figures in that paper are composites that also include other data from elsewhere that I had nothing to do with (and it would not be feasible to obtain the other data within our timeframe).&lt;br /&gt;
* '''Relationship to other publications:''' A recent paper that used the part of the FOCAL data I'm documenting as the sample from the larger dataset: Dzwonkowski, Brian, Kyeong Park, Jungwoo Lee, Bret M. Webb, and Arnoldo Valle-Levinson. 2014. &amp;quot;Spatial variability of flow over a river-influenced inner shelf in coastal Alabama during spring.&amp;quot; Continental Shelf Research 74:25-34.&lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Mimi_Tzeng | Page]]&lt;br /&gt;
* '''Expected submission date:'''&lt;br /&gt;
&lt;br /&gt;
=== [Villamizar 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:''' [[Sandra Villamizar]], University of California, Merced&lt;br /&gt;
* '''Keywords of research area:''' river ecohydrology&lt;br /&gt;
* '''Tentative title:''' Producing long-term series of whole-stream metabolism using readily available data.  &lt;br /&gt;
* '''Short abstract:''' Continuous water quality and river discharge data that are readily available through government websites may be used to produce valuable information about key processes within a river ecosystem. In this paper I describe in detail the steps for acquiring and processing river flow, dissolved oxygen, temperature, and specific conductance data that, combined with atmospheric data and the physical properties of the river reach of interest, allow for the production of a long-term series of whole-stream metabolism. This information is key to understanding the structure and function of an ecosystem such as the San Joaquin River in the Central Valley of California, which has been increasingly degraded over the last 60 years by intensive human intervention but has been undergoing a restoration effort since 2010. The key advantage of this tool is that it uses readily available information to produce knowledge about a river ecosystem. The set of scripts, written in R, can be used immediately for any other river for which the key parameters (river flow, dissolved oxygen, temperature, and specific conductance) are available. The scripts can also be modified by users to fit their particular site conditions.&lt;br /&gt;
 &lt;br /&gt;
* '''Challenge:''' Reproducibility; Dark Code; Document new software/applications. This set of scripts was written out of the necessity of generating daily estimates of metabolic rates over long periods of time and at various sites within the San Joaquin River.  &lt;br /&gt;
* '''Relationship to other publications:''' This will be a new publication.&lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Sandra_Villamizar | Page]]&lt;br /&gt;
* '''Expected submission date:''' To be defined&lt;br /&gt;
&lt;br /&gt;
=== [Yu and Bhatt 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:''' [[Xuan Yu]], Department of Geological Sciences, University of Delaware. Gopal Bhatt, Department of Civil &amp;amp; Environmental Engineering, Pennsylvania State University. &lt;br /&gt;
* '''Keywords of research area:''' coupled processes, integrated hydrologic modeling, PIHM, surface flow, subsurface flow, open science&lt;br /&gt;
* '''Tentative title:''' Learning integrated modeling of surface and subsurface flow from scratch&lt;br /&gt;
* '''Short abstract:''' Integrated modeling of surface and subsurface flow has been of great interest for understanding not only the intimate interconnectedness of hydrological processes, but also land-surface energy balance, biogeochemical and ecological processes, and landscape evolution. Although a growing number of complex hydrologic models have been used for resolving environmental processes, hypothesis testing, and hydrologic predictions for effective watershed management, very few model implementation resources have been made accessible to the larger community of model users. Users must invest a significant amount of time and effort to reproduce and understand the workflow of the hydrologic simulation in a modeling paper. To provide a challenging and stimulating introduction to integrated modeling of surface and subsurface flow, in this paper we revisit the development of the Penn State Integrated Hydrologic Model (PIHM) by reproducing a numerical benchmarking example and a real-world catchment-scale application. Specifically, we document PIHM and its modeling workflow to enable a basic understanding of simulating coupled surface and subsurface flow processes. We provide the model and data to highlight the reciprocal roles between the two. In addition, we incorporate user experience as a third dimension in the modeling workflow to enable deeper communication between model developers and users. The workflow has important implications for smoothing and accelerating open scientific collaboration in geosciences research.&lt;br /&gt;
* '''Challenge:''' Reproducibility; Reproduce published simulations with the latest version of an existing model. Benchmark the modeling application against a numerical experiment and field data.&lt;br /&gt;
* '''Relationship to other publications:''' The article is based on a previously published article. &lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Xuan_Yu | Page]]&lt;br /&gt;
* '''Expected submission date:''' End of June 2015&lt;br /&gt;
&lt;br /&gt;
== Special Issue Editors ==&lt;br /&gt;
&lt;br /&gt;
* Co-editor: Chris Duffy and/or Scott Peckham&lt;br /&gt;
* Co-editor: Cedric David&lt;br /&gt;
* Co-editor: possibly Karan Venayagamoorthy&lt;br /&gt;
&lt;br /&gt;
The editors will only accept submissions that follow the [[Develop_proposal_for_special_issue#Special_Issue_Review_Criteria | special issue review criteria]].&lt;br /&gt;
&lt;br /&gt;
The editors will select a set of reviewers to handle the submissions.  Reviewers will include computer scientists, library scientists, and geoscientists.&lt;br /&gt;
&lt;br /&gt;
== Special Issue Review Criteria ==&lt;br /&gt;
&lt;br /&gt;
The reviewers will be asked to provide feedback on the papers according to the following criteria. Note that some papers will have good reasons for limiting the information provided (e.g., the data comes from third parties and is not openly available), and in that case they should document those reasons.&lt;br /&gt;
&lt;br /&gt;
* Documentation of the datasets: descriptions of datasets, unique identifiers, repositories.&lt;br /&gt;
* Documentation of software: description of all software used (including pre-processing of data, visualization steps, etc), unique identifiers, repositories.&lt;br /&gt;
* Documentation of the provenance of results: provenance for each figure or result, such as the workflow or the provenance record.&lt;br /&gt;
&lt;br /&gt;
== Tentative Timeline ==&lt;br /&gt;
&lt;br /&gt;
* Journal committed to special issue: April 15, 2015&lt;br /&gt;
* Submissions due to editors: June 30, 2015&lt;br /&gt;
* Reviews due: Sept 15, 2015&lt;br /&gt;
* Decisions out to authors: Sept 30, 2015&lt;br /&gt;
* Revisions due: October 31, 2015&lt;br /&gt;
* Final versions due November 15, 2015&lt;br /&gt;
* Issue published December 31, 2015&lt;br /&gt;
&lt;br /&gt;
&amp;lt;!-- Add any wiki Text above this Line --&amp;gt;&lt;br /&gt;
&amp;lt;!-- Do NOT Edit below this Line --&amp;gt;&lt;br /&gt;
{{#set:&lt;br /&gt;
	Owner=Chris_Duffy|&lt;br /&gt;
	Participants=Yolanda_Gil|&lt;br /&gt;
	Participants=Scott_Peckham|&lt;br /&gt;
	Participants=Cedric_David|&lt;br /&gt;
	Participants=Ibrahim_Demir|&lt;br /&gt;
	Participants=Wally_Fulweiler|&lt;br /&gt;
	Participants=Leif_Karlstrom|&lt;br /&gt;
	Participants=Kyo_Lee|&lt;br /&gt;
	Participants=Kim_Miller|&lt;br /&gt;
	Participants=Heath_Mills|&lt;br /&gt;
	Participants=Ji-Hyun_Oh|&lt;br /&gt;
	Participants=Suzanne_Pierce|&lt;br /&gt;
	Participants=Allen_Pope|&lt;br /&gt;
	Participants=Jordan_Read|&lt;br /&gt;
	Participants=Mimi_Tzeng|&lt;br /&gt;
	Participants=Sandra_Villamizar|&lt;br /&gt;
	Participants=Xuan_Yu|&lt;br /&gt;
	Progress=20|&lt;br /&gt;
	StartDate=2015-03-10|&lt;br /&gt;
	TargetDate=2015-03-16|&lt;br /&gt;
	Type=Low}}&lt;/div&gt;</summary>
		<author><name>Suzanne</name></author>	</entry>

	<entry>
		<id>https://www.organicdatascience.org/gpf/index.php?title=Develop_proposal_for_special_issue&amp;diff=11708</id>
		<title>Develop proposal for special issue</title>
		<link rel="alternate" type="text/html" href="https://www.organicdatascience.org/gpf/index.php?title=Develop_proposal_for_special_issue&amp;diff=11708"/>
				<updated>2015-04-03T17:59:15Z</updated>
		
		<summary type="html">&lt;p&gt;Suzanne: /* [Pierce 2015] */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;[[Category:Task]]&lt;br /&gt;
&lt;br /&gt;
== Background: Why a Special Issue on Geoscience Papers of the Future? ==&lt;br /&gt;
&lt;br /&gt;
[[Discuss_what_we_will_consider_a_GPF#The_Vision | Include here our discussion for the vision]]&lt;br /&gt;
&lt;br /&gt;
Background should be 1-2 pages.&lt;br /&gt;
&lt;br /&gt;
Motivated by need to fully document and make research accessible and reproducible. &lt;br /&gt;
&lt;br /&gt;
=== Motivation: The EarthCube Initiative and the GeoSoft Project ===&lt;br /&gt;
&lt;br /&gt;
[http://www.geosoft-earthcube.org/about Include here background about GeoSoft from the web site]&lt;br /&gt;
&lt;br /&gt;
OSTP memo.  EarthCube reports.&lt;br /&gt;
Other reports that talk about the need for new approaches to editing.&lt;br /&gt;
&lt;br /&gt;
It is possible that very small or very large contributions are not well captured by current publishing paradigms. Nanopublications.&lt;br /&gt;
&lt;br /&gt;
For example, nano-publications are a possible way to reflect advances in a research process that may not merit a full publication but are nonetheless useful to share with the community. A challenge here is that there is a stigma against publishing units that are too small.&lt;br /&gt;
&lt;br /&gt;
Alternatively, a very large piece of research or work with many parts may be better suited to a GPF style publication.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Perhaps the concept of a 'paper' can be better reflected in the concept of a 'wrapper', or a collection of materials and resources. The purpose is to assure that publications are representative of the work, effort, and results achieved in the research process.&lt;br /&gt;
&lt;br /&gt;
=== What is a GPF ===&lt;br /&gt;
&lt;br /&gt;
[[Discuss_what_we_will_consider_a_GPF#What_is_a_Geoscience_Paper_of_the_Future.3F | Include here our discussion of what is a GPF]]&lt;br /&gt;
&lt;br /&gt;
=== The challenges of creating GPFs ===&lt;br /&gt;
&lt;br /&gt;
The articles in this issue reflect the current best practice for generating a Geoscience Paper of the Future.&lt;br /&gt;
&lt;br /&gt;
'''Figure discussions''': Do we want to reproduce exactly the same figure automatically?  Figures in the paper may be cleaned-up versions of an image generated by software.  To the extent possible, authors have included clear delineations of provenance. The goal is to assure that readers can regenerate the figures using documented workflows, data, and code.  An important note (Allen, Sandra) is that figures are frequently generated by code or scripts, yet the final figure is polished by hand.  Mimi's point: is it really worth belaboring how the prettified version of the figure is made? If it is: both of the visualization packages I have used (Matlab and SigmaPlot) keep actual code in the background that specifies the prettification, and this code can be found, copied out, and rerun to generate the exact same figure with all of the prettification in the same place. SigmaPlot uses Visual Basic (I think) in its macros. If explicit code is the important point, this should be doable. But it may not be strictly necessary to specify exactly where all the prettifications are to get the gist across.&lt;br /&gt;
&lt;br /&gt;
How much of one's experimental history should be included? (Ibrahim)  The experimental process often leads nowhere.  Should we document all the failed experiments?  Get one DOI for the results of the successful experiment and another for the failed trials?&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
'''''Documenting: Timing and Intermediate Processes'''''&lt;br /&gt;
When should we document and what are the bounds on what we document?&lt;br /&gt;
For example, should we document and include data and workflows for 'failed' experiments? Or should we assign datasets DOIs before we know the results from using them?  &lt;br /&gt;
The group thinks good practice may include documenting and sharing data once there is a clear understanding of which outcomes are worth reporting. For example, successful experiments should have clear, clean data documented and shared, whereas one strategy for 'failed' experiments could be to bundle the intermediate datasets under one DOI with a more general discussion of the process and methods.&lt;br /&gt;
&lt;br /&gt;
=== Related work ===&lt;br /&gt;
&lt;br /&gt;
[[Discuss_what_we_will_consider_a_GPF#New_Frameworks_to_Create_a_New_Generation_of_Scientific_Articles | Include here the related work we have discussed]]&lt;br /&gt;
&lt;br /&gt;
== Papers to be included ==&lt;br /&gt;
&lt;br /&gt;
Would it be worthwhile to group the papers into broader categories rather than giving specifics about every single paper?&lt;br /&gt;
&lt;br /&gt;
For each submission, we describe:&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations'''&lt;br /&gt;
* '''Keywords of research area'''&lt;br /&gt;
* '''Tentative title'''&lt;br /&gt;
* '''Short abstract'''&lt;br /&gt;
* '''Challenge''' (including &amp;quot;Reproducibility,&amp;quot; &amp;quot;Dark Code,&amp;quot; &amp;quot;Sharing Big Data,&amp;quot; and &amp;quot;Transferability&amp;quot;)&lt;br /&gt;
* '''Relationship to other publications''' (is the article based on a previously published article? is it new content? If previously published, please provide a pointer to the published article and specify what percentage of the work presented will be new)&lt;br /&gt;
* '''Pointer to the wiki page that documents the article'''&lt;br /&gt;
* '''Expected submission date'''&lt;br /&gt;
&lt;br /&gt;
=== [David 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:''' [[Cedric David]]&lt;br /&gt;
* '''Keywords of research area:''' Hydrology, Rivers, Modeling, Testing, Reproducibility. &lt;br /&gt;
* '''Tentative title:''' Going beyond triple-checking, allowing for peace of mind in community model development.&lt;br /&gt;
* '''Short abstract:''' The development of computer models in the general field of geoscience is often made incrementally over many years.  Endeavors that generally start on a single researcher's own machine evolve over time into software that is often much larger than was initially anticipated.  Looking back at years of building on their computer code, sometimes without much training in computer science, geoscience software developers can easily experience an overwhelming sense of incompetence when contemplating ways to further community usage of their software.  How does one allow others to use one's code?  How can one foster the survival of one's tool?  How could one possibly ensure the scientific integrity of ongoing developments, including those made by others?  Common issues faced by geoscience developers include selecting a license, learning how to track and document past and ongoing changes, choosing a software repository, and allowing for community development.  This paper provides a brief summary of experience with the first three of these steps, focusing on the almost decade-long code development of a river routing model.  The core of this study, however, focuses on reproducing previously published experiments.  This step is highly repetitive and can therefore benefit greatly from automation.  Additionally, enabling automated software testing can arguably be considered the final step of sustainable software sharing, by allowing the main software developer to let go of a mental block concerning scientific integrity.  Creating tools that automatically compare the results of an updated version of a software package with those of previous studies can not only save the main developer's own time, it can also empower other researchers to check and demonstrate that their potential additions have retained scientific integrity.&lt;br /&gt;
* '''Challenge:''' Reproducibility; Ensure that updates to an existing model are able to reproduce a series of simulations published previously.&lt;br /&gt;
* '''Relationship to other publications:''' This research is related to past and ongoing development of the Routing Application for Parallel computatIon of Discharge (RAPID).  The primary focus of this paper is to allow automated reproducibility of at least the [http://dx.doi.org/10.1175/2011JHM1345.1 first RAPID publication].  The scientific subject of this GPF differs from the article(s) to be reproduced as its focus is on development of automatic testing methods.  In that regard, the paper is expected to be 95% new. &lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Cedric_David | Page]]&lt;br /&gt;
* '''Expected submission date:'''&lt;br /&gt;
&lt;br /&gt;
=== [Demir 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:''' [[Ibrahim Demir]]&lt;br /&gt;
* '''Keywords of research area:''' hydrological network, optimization, network representation, database query&lt;br /&gt;
* '''Tentative title:''' Analysis and Optimization of Hydrological Network Database Representation Methods for Fast Access and Query in Web-based System&lt;br /&gt;
* '''Short abstract:''' Web-based systems allow users to delineate watersheds in interactive map environments using server-side processing. With the increasing resolution of hydrological networks, optimized methods for storing network representations in databases, and efficient queries and actions on the river network structure, become critical. This paper presents a detailed analysis of widely used methods for representing hydrological networks in relational databases, benchmarking common queries and modifications on the network structure using these methods. The analysis has been applied to the hydrological network of Iowa, utilizing a 90 m DEM and 600,000 network nodes. The results indicate that the representation methods provide substantial improvements in query times and in storage of the network structure in the database. The suggested method allows watershed delineation tools to run client-side with desktop-like performance. &lt;br /&gt;
* '''Challenge:''' Reproducibility, Transferability; Some of the internal steps to prepare data might require long computation time and different software environments.&lt;br /&gt;
* '''Relationship to other publications:''' The article is based on a new study&lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Ibrahim_Demir | Page]]&lt;br /&gt;
* '''Expected submission date:'''&lt;br /&gt;
&lt;br /&gt;
=== [Fulweiler 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:''' [[Wally Fulweiler]]&lt;br /&gt;
* '''Keywords of research area:'''&lt;br /&gt;
* '''Tentative title:'''&lt;br /&gt;
* '''Short abstract:'''&lt;br /&gt;
* '''Challenge:''' &lt;br /&gt;
* '''Relationship to other publications:''' (is the article based on a previously published article? is it new content?)&lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Wally_Fulweiler | Page]]&lt;br /&gt;
* '''Expected submission date:'''&lt;br /&gt;
&lt;br /&gt;
=== [Loh and Karlstrom 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:'''  [[Lay Kuan Loh]] and [[Leif Karlstrom]]&lt;br /&gt;
* '''Keywords of research area:''' Spatial clustering, Eigenvector selection, Entropy Ranking, Cascades Volcanic Region, [http://geosphere.gsapubs.org/content/3/3/152.abstract Afar Depression], [http://astrogeology.usgs.gov/search/details/Mars/Research/Volcanic/TharsisVents/zip Tharsis province]&lt;br /&gt;
* '''Tentative title:''' Characterization of volcanic vent distributions using spectral clustering with eigenvector selection and entropy ranking&lt;br /&gt;
* '''Short abstract:''' Volcanic vents on the surface of Earth and other planets often appear in groups that exhibit spatial patterning. Such vent distributions reflect complex interplay between time-evolving mechanical controls on the pathways of magma ascent, background tectonic stresses, and unsteady supply of rising magma. With the ultimate aim of connecting surface vent distributions with the dynamics of magma ascent, we have developed a clustering method to quantify spatial patterns in vents. Clustering is typically used in exploratory data analysis to identify groups with similar behavior by partitioning a dataset into clusters that share similar attributes. Traditional clustering algorithms that work well on simple point-cloud-type synthetic datasets generally do not scale well to the real-world data we are interested in, where there are poor boundaries between clusters and much ambiguity in cluster assignments. We instead use a spectral clustering algorithm with eigenvector selection based on entropy ranking, following [http://www.sciencedirect.com/science/article/pii/S0925231210001311 Zhao et al 2010], which outperforms traditional spectral clustering algorithms in choosing the right number of clusters for point data. We benchmark this algorithm on synthetic vent data with increasingly complex spatial distributions, to test its ability to accurately cluster vent data with variable spatial density, skewness, number of clusters, and proximity of clusters. We then apply our algorithm to several real-world datasets from the Cascades, the Afar Depression, and Mars. &lt;br /&gt;
* '''Challenge:''' Reproducibility (i.e., quantifying clustering); We plan to study how varying the statistical distribution, density, skewness, background noise, number of clusters, proximity of clusters, and combinations of these factors affects the performance of our algorithm. We test it against synthetic and real-world datasets. &lt;br /&gt;
* '''Relationship to other publications:''' New content, but one of the datasets we are studying in the paper (Cascades Volcanic Range) is based on a different paper we are preparing and planning to submit earlier. &lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Leif_Karlstrom | Page]]&lt;br /&gt;
* '''Expected submission date:''' June 2015&lt;br /&gt;
&lt;br /&gt;
=== [Lee 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:''' [[Kyo Lee]], Maziyar Boustani and Chris Mattmann, Jet Propulsion Laboratory&lt;br /&gt;
* '''Keywords of research area:''' North American regional climate, regional climate model evaluation system, Open Climate Workbench&lt;br /&gt;
* '''Tentative title:''' Evaluation of simulated temperature, precipitation, cloud fraction and insolation over the conterminous United States using Regional Climate Model Evaluation System&lt;br /&gt;
* '''Short abstract:''' This study describes the detailed process of evaluating model fidelity in simulating four key climate variables (surface air temperature, precipitation, cloud fraction, and insolation) and their covariability over the conterminous United States. The Regional Climate Model Evaluation System (RCMES), a suite of public databases and open-source software packages, provides both observational datasets and data processors useful for evaluating any climate model. In this paper, we provide a clear and easy-to-follow RCMES workflow to replicate published papers evaluating North American Regional Climate Change Assessment Program (NARCCAP) regional climate model (RCM) hindcast simulations against observations from a variety of sources. &lt;br /&gt;
* '''Challenge:''' Big Data Sharing, Dark Code; Sharing big data, better documenting source code, and encouraging the climate science community to use RCMES&lt;br /&gt;
* '''Relationship to other publications:''' [http://journals.ametsoc.org/doi/abs/10.1175/JCLI-D-12-00452.1 Kim et al. 2013], [http://link.springer.com/article/10.1007/s00382-014-2253-y Lee et al. 2014]&lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Kyo_Lee | Page]]&lt;br /&gt;
* '''Expected submission date:''' End of June 2015&lt;br /&gt;
&lt;br /&gt;
=== [Miller 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:''' [[Kim Miller]]&lt;br /&gt;
* '''Keywords of research area:'''&lt;br /&gt;
* '''Tentative title:'''&lt;br /&gt;
* '''Short abstract:'''&lt;br /&gt;
* '''Challenge:''' &lt;br /&gt;
* '''Relationship to other publications:''' (is the article based on a previously published article? is it new content?)&lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Kim_Miller | Page]]&lt;br /&gt;
* '''Expected submission date:'''&lt;br /&gt;
&lt;br /&gt;
=== [Mills 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:''' [[Heath Mills]], University of Houston Clear Lake; Brandi Kiel Reese, Texas A&amp;amp;M Corpus Christi&lt;br /&gt;
* '''Keywords of research area:'''&lt;br /&gt;
* '''Tentative title:'''Iron and Sulfur Cycling Biogeography Using Advanced Geochemical and Molecular Analyses&lt;br /&gt;
* '''Short abstract:''' My paper will develop and document a new pipeline for analyzing a combined and robust genetic and geochemical dataset. New, reproducible methods will be highlighted in the manuscript to help others better analyze similar datasets. There is a general lack of guidance within my field for such challenges, so the manuscript will be useful from an analysis standpoint as well as for the science being presented.&lt;br /&gt;
* '''Challenge:''' Reproducibility; Dark Code&lt;br /&gt;
* '''Relationship to other publications:''' Original Manuscript&lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Heith_Mills | Page]]&lt;br /&gt;
* '''Expected submission date:'''&lt;br /&gt;
&lt;br /&gt;
=== [Oh 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:''' [[Ji-Hyun Oh]], Jet Propulsion Laboratory/University of Southern California&lt;br /&gt;
* '''Keywords of research area:''' Tropical Meteorology, Madden-Julian Oscillation, Momentum budget analysis &lt;br /&gt;
* '''Tentative title:''' Tools for computing momentum budget for the westerly wind event associated with the Madden-Julian Oscillation&lt;br /&gt;
* '''Short abstract:''' As one of the most pronounced modes of tropical intraseasonal variability, the Madden-Julian Oscillation (MJO) prominently connects global weather and climate and serves as one of the critical sources of predictability for extended-range forecasting. The zonal circulation of the MJO is characterized by low-level westerlies (easterlies) in and to the west (east) of the convective center, respectively. The direction of the zonal winds in the upper troposphere is opposite to that in the lower troposphere. In addition to the convective signal as an identifier of MJO initiation, certain characteristics of the zonal circulation have been used as a standard metric for monitoring the state of the MJO and for investigating features of the MJO and its impact on other atmospheric phenomena. This paper documents a tool for investigating the generation of low-level westerly winds during the MJO life cycle. The tool is used for momentum budget analysis to understand the respective contributions of the various processes involved in the wind evolution associated with the MJO, using European Centre for Medium-Range Weather Forecasts operational analyses during the Dynamics of the Madden–Julian Oscillation field campaign.&lt;br /&gt;
&lt;br /&gt;
* '''Challenge:''' Reproducibility, Dark Code; This paper will cover how to reproduce two key figures from the paper that I recently submitted to the Journal of the Atmospheric Sciences. This will include detailed procedures for generating the figures, such as how and where to download the data, and how to transform the data into the input format for my codes. &lt;br /&gt;
* '''Relationship to other publications:''' This article is related to part of the paper submitted to the Journal of the Atmospheric Sciences. &lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Ji_Hyun | Page]]&lt;br /&gt;
* '''Expected submission date:'''&lt;br /&gt;
&lt;br /&gt;
=== [Pierce 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:'''  [[Suzanne Pierce]] and John Gentle (Texas Advanced Computing Center and Jackson School of Geosciences, The University of Texas at Austin)&lt;br /&gt;
&lt;br /&gt;
* '''Keywords of research area:''' Decision Support Systems, Hydrogeology, Participatory Modeling, Data Fusion &lt;br /&gt;
* '''Tentative title:''' MCSDSS: An accessible platform and application to enable data fusion and interactive visualization for the Geosciences&lt;br /&gt;
* '''Short abstract:'''&lt;br /&gt;
&lt;br /&gt;
* '''Challenge:''' Reproducibility, Dark Code; Fully document a new software application and framework using example case study data and tutorials.&lt;br /&gt;
* '''Relationship to other publications:''' This article is new content&lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Suzanne_Pierce | Page]]&lt;br /&gt;
* '''Expected submission date:'''&lt;br /&gt;
&lt;br /&gt;
=== [Pope 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:''' [[Allen Pope]], National Snow and Ice Data Center, University of Colorado, Boulder&lt;br /&gt;
* '''Keywords of research area:''' Glaciology, Remote Sensing, Landsat 8, Polar Science&lt;br /&gt;
* '''Tentative title:''' Data and Code for Estimating and Evaluating Supraglacial Lake Depth With Landsat 8 and other Multispectral Sensors&lt;br /&gt;
* '''Short abstract:''' Supraglacial lakes play a significant role in glacial hydrological systems – for example, transporting water to the glacier bed in Greenland or leading to ice shelf fracture and disintegration in Antarctica. To investigate these important processes, multispectral remote sensing provides multiple methods for estimating supraglacial lake depth – either through single-band or band-ratio methods, both empirical and physically based. Landsat 8 is the newest satellite in the Landsat series. With new bands, higher dynamic range, and higher radiometric resolution, the Operational Land Imager (OLI) aboard Landsat 8 holds great potential. &lt;br /&gt;
&lt;br /&gt;
This paper will document the data and code used in processing in situ reflectance spectra and depth measurements to investigate the ability of Landsat 8 to estimate lake depths using multiple methods, as well as to quantify improvements over Landsat 7’s ETM+. A workflow, data, and code are provided to detail promising methods as applied to Landsat 8 OLI imagery of case study areas in Greenland, allowing calculation of regional volume estimates using 2013 and 2014 summer-season imagery. Altimetry from WorldView DEMs is used to validate lake depth estimates. The optimal method for supraglacial lake depth estimation with Landsat 8 is shown to be an average of the single-band depths from the red and panchromatic bands. Using this method, a preliminary investigation of the seasonal behavior and elevation distribution of lakes is also discussed and documented.&lt;br /&gt;
* '''Challenge:''' Reproducibility, Dark Code&lt;br /&gt;
* '''Relationship to other publications:''' Documenting and explaining the data and code behind the analysis and results presented in another paper.&lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Allen_Pope | Page]]&lt;br /&gt;
* '''Expected submission date:''' Late June 2015&lt;br /&gt;
&lt;br /&gt;
=== [Read and Winslow 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:''' [[Jordan Read]] and [[Luke Winslow]]&lt;br /&gt;
* '''Keywords of research area:'''&lt;br /&gt;
* '''Tentative title:'''&lt;br /&gt;
* '''Short abstract:'''&lt;br /&gt;
* '''Challenge:''' &lt;br /&gt;
* '''Relationship to other publications:''' (is the article based on a previously published article? is it new content?)&lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Jordan_Read | Page]]&lt;br /&gt;
* '''Expected submission date:'''&lt;br /&gt;
&lt;br /&gt;
=== [Tzeng 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:''' [[Mimi Tzeng]], Brian Dzwonkowski (DISL); Kyeong Park (TAMU Galveston)&lt;br /&gt;
* '''Keywords of research area:'''physical oceanography, remote sensing&lt;br /&gt;
* '''Tentative title:''' Fisheries Oceanography of Coastal Alabama (FOCAL): A Subset of a Time-Series of Hydrographic and Current Data from a Permanent Moored Station Outside Mobile Bay (27 Jan to 18 May 2011)&lt;br /&gt;
* '''Short abstract:''' The Fisheries Oceanography in Coastal Alabama (FOCAL) program began in 2006 as a way for scientists at Dauphin Island Sea Lab (DISL) to study the natural variability of Alabama's nearshore environment as it relates to fisheries production. FOCAL provided a long-term baseline dataset that included time-series hydrographic data from a permanent offshore mooring (ADCP, vertical thermistor array, and CTDs at surface and bottom) and shipboard surveys (vertical CTD profiles and water sampling), as well as monthly ichthyoplankton and zooplankton (depth-discrete) sample collections at FOCAL sites. The subset of data presented here is from the mooring, and includes a vertical array of thermistors, CTDs at surface and bottom, an ADCP at the bottom, and vertical CTD profiles collected at the mooring during maintenance surveys. The mooring is located at 30 05.410'N 88 12.694'W, 25 km southwest of the entrance to Mobile Bay. Temperature, salinity, density, depth, and current velocity data were collected at 20-minute intervals from 2006 to 2012. Other parameters, such as dissolved oxygen, are available for portions of the time series depending on which instruments were deployed at the time.&lt;br /&gt;
* '''Challenge:''' Dark Code, Reproducibility; My paper will be about the processing of data in a larger dataset from which peer-reviewed papers have been written. The processing I did was not specific to any particular paper. I can point to an example paper that used some of the data I processed from this dataset; however, all of the figures in that paper are composites that also include other data from elsewhere that I had nothing to do with (and it would not be feasible to obtain the other data within our timeframe).&lt;br /&gt;
* '''Relationship to other publications:''' A recent paper that used the portion of the FOCAL data I am documenting: Dzwonkowski, Brian, Kyeong Park, Jungwoo Lee, Bret M. Webb, and Arnoldo Valle-Levinson. 2014. &amp;quot;Spatial variability of flow over a river-influenced inner shelf in coastal Alabama during spring.&amp;quot; Continental Shelf Research 74:25-34.&lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Mimi_Tzeng | Page]]&lt;br /&gt;
* '''Expected submission date:'''&lt;br /&gt;
&lt;br /&gt;
=== [Villamizar 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:''' [[Sandra Villamizar]], University of California, Merced&lt;br /&gt;
* '''Keywords of research area:''' river ecohydrology&lt;br /&gt;
* '''Tentative title:''' Producing long-term series of whole-stream metabolism using readily available data.  &lt;br /&gt;
* '''Short abstract:''' Continuous water quality and river discharge data that are readily available through government websites may be used to produce valuable information about key processes within a river ecosystem. In this paper I describe in detail the steps for acquiring and processing river flow, dissolved oxygen, temperature, and specific conductance data that, combined with atmospheric data and the physical properties of the river reach of interest, allow for the production of a long-term series of whole-stream metabolism. This information is key to understanding the structure and function of an ecosystem such as the San Joaquin River in the Central Valley of California, which has been increasingly degraded over the last 60 years by intensive human intervention but has, since 2010, been undergoing a restoration effort. The key advantage of this tool is that it uses readily available information to produce knowledge about a river ecosystem. The set of scripts, written in R, can be used immediately for any other river for which the key parameters (river flow, dissolved oxygen, temperature, and specific conductance) are available. The scripts can also be modified by users to fit their particular site conditions.&lt;br /&gt;
 &lt;br /&gt;
* '''Challenge:''' Reproducibility; Dark Code; Document new software/applications. This set of scripts was written out of the need to generate daily estimates of metabolic rates over long periods of time and at various sites within the San Joaquin River.&lt;br /&gt;
* '''Relationship to other publications:''' This will be a new publication&lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Sandra_Villamizar | Page]]&lt;br /&gt;
* '''Expected submission date:''' To be defined&lt;br /&gt;
&lt;br /&gt;
=== [Yu and Bhatt 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:''' [[Xuan Yu]], Department of Geological Sciences, University of Delaware. Gopal Bhatt, Department of Civil &amp;amp; Environmental Engineering, Pennsylvania State University. &lt;br /&gt;
* '''Keywords of research area:''' coupled processes, integrated hydrologic modeling, PIHM, surface flow, subsurface flow, open science&lt;br /&gt;
* '''Tentative title:''' Learning integrated modeling of surface and subsurface flow from scratch&lt;br /&gt;
* '''Short abstract:''' Integrated modeling of surface and subsurface flow has been of great interest for understanding not only the intimate interconnectedness of hydrological processes, but also land-surface energy balance, biogeochemical and ecological processes, and landscape evolution. Although a growing number of complex hydrologic models have been used for resolving environmental processes, testing hypotheses, and making hydrologic predictions for effective watershed management, very few model-implementation resources have been made accessible to the larger group of model users. Users have to invest a significant amount of time and effort to reproduce, and to understand, the workflow of the hydrologic simulation in a modeling paper. To provide a challenging and stimulating introduction to integrated modeling of surface and subsurface flow, in this paper we revisit the development of the Penn State Integrated Hydrologic Model (PIHM) by reproducing a numerical benchmarking example and a real-world catchment-scale application. Specifically, we document PIHM and its modeling workflow to enable a basic understanding of simulating coupled surface and subsurface flow processes. We provide the model and data to highlight the reciprocal roles between the two. In addition, we incorporate user experience as a third dimension in the modeling workflow to enable deeper communication between model developers and users. The workflow has important implications for smoothing and accelerating open scientific collaboration in geosciences research.&lt;br /&gt;
* '''Challenge:''' Reproducibility; Reproduce published simulations by an existing model with the latest version. Benchmark the modeling application on a numerical experiment and field data.&lt;br /&gt;
* '''Relationship to other publications:''' The article is based on a previously published article. &lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Xuan_Yu | Page]]&lt;br /&gt;
* '''Expected submission date:''' End of June 2015&lt;br /&gt;
&lt;br /&gt;
== Special Issue Editors ==&lt;br /&gt;
&lt;br /&gt;
* Co-editor: Chris Duffy and/or Scott Peckham&lt;br /&gt;
* Co-editor: Cedric David&lt;br /&gt;
* Co-editor: possibly Karan Venayagamoorthy&lt;br /&gt;
&lt;br /&gt;
The editors will only accept submissions that follow the [[Develop_proposal_for_special_issue#Special_Issue_Review_Criteria | special issue review criteria]].&lt;br /&gt;
&lt;br /&gt;
The editors will select a set of reviewers to handle the submissions.  Reviewers will include computer scientists, library scientists, and geoscientists.&lt;br /&gt;
&lt;br /&gt;
== Special Issue Review Criteria ==&lt;br /&gt;
&lt;br /&gt;
The reviewers will be asked to provide feedback on the papers according to the following criteria. Note that some papers will have good reasons for limiting the information (e.g. the data is from third parties and not openly available, etc), and in that case they would document those reasons.&lt;br /&gt;
&lt;br /&gt;
* Documentation of the datasets: descriptions of datasets, unique identifiers, repositories.&lt;br /&gt;
* Documentation of software: description of all software used (including pre-processing of data, visualization steps, etc), unique identifiers, repositories.&lt;br /&gt;
* Documentation of the provenance of results: provenance for each figure or result, such as the workflow or the provenance record.&lt;br /&gt;
&lt;br /&gt;
== Tentative Timeline ==&lt;br /&gt;
&lt;br /&gt;
* Journal committed to special issue: April 15, 2015&lt;br /&gt;
* Submissions due to editors: June 30, 2015&lt;br /&gt;
* Reviews due: Sept 15, 2015&lt;br /&gt;
* Decisions out to authors: Sept 30, 2015&lt;br /&gt;
* Revisions due: October 31, 2015&lt;br /&gt;
* Final versions due November 15, 2015&lt;br /&gt;
* Issue published December 31, 2015&lt;br /&gt;
&lt;br /&gt;
&amp;lt;!-- Add any wiki Text above this Line --&amp;gt;&lt;br /&gt;
&amp;lt;!-- Do NOT Edit below this Line --&amp;gt;&lt;br /&gt;
{{#set:&lt;br /&gt;
	Owner=Chris_Duffy|&lt;br /&gt;
	Participants=Yolanda_Gil|&lt;br /&gt;
	Participants=Scott_Peckham|&lt;br /&gt;
	Participants=Cedric_David|&lt;br /&gt;
	Participants=Ibrahim_Demir|&lt;br /&gt;
	Participants=Wally_Fulweiler|&lt;br /&gt;
	Participants=Leif_Karlstrom|&lt;br /&gt;
	Participants=Kyo_Lee|&lt;br /&gt;
	Participants=Kim_Miller|&lt;br /&gt;
	Participants=Heath_Mills|&lt;br /&gt;
	Participants=Ji-Hyun_Oh|&lt;br /&gt;
	Participants=Suzanne_Pierce|&lt;br /&gt;
	Participants=Allen_Pope|&lt;br /&gt;
	Participants=Jordan_Read|&lt;br /&gt;
	Participants=Mimi_Tzeng|&lt;br /&gt;
	Participants=Sandra_Villamizar|&lt;br /&gt;
	Participants=Xuan_Yu|&lt;br /&gt;
	Progress=20|&lt;br /&gt;
	StartDate=2015-03-10|&lt;br /&gt;
	TargetDate=2015-03-16|&lt;br /&gt;
	Type=Low}}&lt;/div&gt;</summary>
		<author><name>Suzanne</name></author>	</entry>

	<entry>
		<id>https://www.organicdatascience.org/gpf/index.php?title=Develop_proposal_for_special_issue&amp;diff=11707</id>
		<title>Develop proposal for special issue</title>
		<link rel="alternate" type="text/html" href="https://www.organicdatascience.org/gpf/index.php?title=Develop_proposal_for_special_issue&amp;diff=11707"/>
				<updated>2015-04-03T17:56:03Z</updated>
		
		<summary type="html">&lt;p&gt;Suzanne: /* [Pierce 2015] */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;[[Category:Task]]&lt;br /&gt;
&lt;br /&gt;
== Background: Why a Special Issue on Geoscience Papers of the Future? ==&lt;br /&gt;
&lt;br /&gt;
[[Discuss_what_we_will_consider_a_GPF#The_Vision | Include here our discussion for the vision]]&lt;br /&gt;
&lt;br /&gt;
Background should be 1-2 pages.&lt;br /&gt;
&lt;br /&gt;
Motivated by need to fully document and make research accessible and reproducible. &lt;br /&gt;
&lt;br /&gt;
=== Motivation: The EarthCube Initiative and the GeoSoft Project ===&lt;br /&gt;
&lt;br /&gt;
[http://www.geosoft-earthcube.org/about Include here background about GeoSoft from the web site]&lt;br /&gt;
&lt;br /&gt;
OSTP memo.  EarthCube reports.&lt;br /&gt;
Other reports that talk about the need for new approaches to editing.&lt;br /&gt;
&lt;br /&gt;
It's possible that small or very large contributions are not well captured in the current publishing paradigms.  Nanopublications.&lt;br /&gt;
&lt;br /&gt;
For example, nano-publications are a possible way to reflect advances in a research process that may not merit a full publication but are nevertheless useful to share with the community. A challenge here is the stigma in publishing against units of work that are too small.&lt;br /&gt;
&lt;br /&gt;
Alternatively, a very large piece of research or work with many parts may be better suited to a GPF style publication.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Perhaps, the concept of a 'paper' can be better reflected in the concept of a 'wrapper' or a collection of materials and resources. The purpose is to assure that publications are representative of the work, effort, and results achieved in the research process.&lt;br /&gt;
&lt;br /&gt;
=== What is a GPF ===&lt;br /&gt;
&lt;br /&gt;
[[Discuss_what_we_will_consider_a_GPF#What_is_a_Geoscience_Paper_of_the_Future.3F | Include here our discussion of what is a GPF]]&lt;br /&gt;
&lt;br /&gt;
=== The challenges of creating GPFs ===&lt;br /&gt;
&lt;br /&gt;
The articles in this issue reflect the current best practice for generating a Geoscience Paper of the Future.&lt;br /&gt;
&lt;br /&gt;
'''Figure discussions''': Do we want to regenerate exactly the same figure automatically? Figures in the paper may be clean versions of an image generated by software. To the extent possible, authors have included clear delineations of provenance. The goal is to ensure that readers can regenerate the figures using documented workflows, data, and code. An important note (Allen, Sandra) is that figures are frequently generated by code, scripts, etc., yet the actual figure is finalized by the user. Mimi's point: is it really worth belaboring how the prettified version of the figure is made? If it is: both of the visualization packages I've used (Matlab and SigmaPlot) have actual code in the background that specifies how to set up the prettification, and this code can be found, copied out, and rerun to generate the exact same figure with all of the prettification in the same place. SigmaPlot uses Visual Basic (I think) in its macros. If explicit code is the important point, this should be doable. But I'm not sure it's strictly necessary to specify exactly where all the prettifications are to get the gist across.&lt;br /&gt;
&lt;br /&gt;
How much of your experimental history does one include?  (Ibrahim).  The experimental process often ends up nowhere.  Should we document all the failed experiments?  Get one DOI for the results of the successful experiment?  Another for failed trials?&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
'''''Documenting: Timing and Intermediate Processes'''''&lt;br /&gt;
When should we document and what are the bounds on what we document?&lt;br /&gt;
For example, should we document and include data and workflows for 'failed' experiments? Or should we assign datasets DOIs before we know the results from using them?  &lt;br /&gt;
The group thinks that good practices may include documenting and sharing data once you have a clear understanding of the outcomes worth reporting. For example, successful experiments should have clear, clean data documented and shared, whereas one strategy for 'failed' experiments could be to bundle the intermediate datasets under one DOI with a more general discussion of the process/methods.&lt;br /&gt;
&lt;br /&gt;
=== Related work ===&lt;br /&gt;
&lt;br /&gt;
[[Discuss_what_we_will_consider_a_GPF#New_Frameworks_to_Create_a_New_Generation_of_Scientific_Articles | Include here the related work we have discussed]]&lt;br /&gt;
&lt;br /&gt;
== Papers to be included ==&lt;br /&gt;
&lt;br /&gt;
Would it be worthwhile to group the papers into broader categories rather than giving specifics about every single paper?&lt;br /&gt;
&lt;br /&gt;
For each submission, we describe:&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations'''&lt;br /&gt;
* '''Keywords of research area'''&lt;br /&gt;
* '''Tentative title'''&lt;br /&gt;
* '''Short abstract'''&lt;br /&gt;
* '''Challenge''' (including &amp;quot;Reproducibility,&amp;quot; &amp;quot;Dark Code,&amp;quot; &amp;quot;Sharing Big Data,&amp;quot; and &amp;quot;Transferability&amp;quot;)&lt;br /&gt;
* '''Relationship to other publications''' (is the article based on a previously published article? is it new content? IF PREVIOUSLY PUBLISHED, PLS PROVIDE A POINTER TO THE PUBLISHED ARTICLE AND SPECIFY WHAT PERCENTAGE OF THE WORK PRESENTED WILL BE NEW)&lt;br /&gt;
* '''Pointer to the wiki page that documents the article'''&lt;br /&gt;
* '''Expected submission date'''&lt;br /&gt;
&lt;br /&gt;
=== [David 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:''' [[Cedric David]]&lt;br /&gt;
* '''Keywords of research area:''' Hydrology, Rivers, Modeling, Testing, Reproducibility. &lt;br /&gt;
* '''Tentative title:''' Going beyond triple-checking, allowing for peace of mind in community model development.&lt;br /&gt;
* '''Short abstract:''' The development of computer models in the general field of geoscience is often done incrementally over many years.  Endeavors that generally start on a single researcher's own machine evolve over time into software that is often much larger than initially anticipated.  Looking back at years of building on their computer code, sometimes without much training in computer science, geoscience software developers can easily experience an overwhelming sense of incompetence when contemplating ways to further community usage of their software.  How does one allow others to use one's code?  How can one foster the survival of one's tool?  How could one possibly ensure the scientific integrity of ongoing developments, including those made by others?  Common issues faced by geoscience developers include selecting a license, learning how to track and document past and ongoing changes, choosing a software repository, and allowing for community development.  This paper provides a brief summary of experience with the first three of these steps of software growth, focusing on the almost decade-long code development of a river routing model.  The core of this study, however, focuses on reproducing previously published experiments.  This step is highly repetitive and can therefore benefit greatly from automation.  Additionally, enabling automated software testing can arguably be considered the final step for sustainable software sharing, by allowing the main software developer to let go of a mental block concerning scientific integrity.  Creating tools to automatically compare the results of an updated version of a software package with those of previous studies can not only save the main developer's own time, it can also empower other researchers to check and demonstrate that their potential additions have retained scientific integrity.&lt;br /&gt;
* '''Challenge:''' Reproducibility; Ensure that updates to an existing model are able to reproduce a series of simulations published previously.&lt;br /&gt;
* '''Relationship to other publications:''' This research is related to past and ongoing development of the Routing Application for Parallel computatIon of Discharge (RAPID).  The primary focus of this paper is to allow automated reproducibility of at least the [http://dx.doi.org/10.1175/2011JHM1345.1 first RAPID publication].  The scientific subject of this GPF differs from the article(s) to be reproduced as its focus is on development of automatic testing methods.  In that regard, the paper is expected to be 95% new. &lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Cedric_David | Page]]&lt;br /&gt;
* '''Expected submission date:'''&lt;br /&gt;
&lt;br /&gt;
=== [Demir 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:''' [[Ibrahim Demir]]&lt;br /&gt;
* '''Keywords of research area:''' hydrological network, optimization, network representation, database query&lt;br /&gt;
* '''Tentative title:''' Analysis and Optimization of Hydrological Network Database Representation Methods for Fast Access and Query in Web-based System&lt;br /&gt;
* '''Short abstract:''' Web-based systems allow users to delineate watersheds in interactive map environments using server-side processing. With the increasing resolution of hydrological networks, optimized methods for storing network representations in databases, and efficient queries and actions on the river network structure, become critical. This paper presents a detailed analysis of widely used methods for representing hydrological networks in relational databases, benchmarking common queries and modifications on the network structure using these methods. The analysis has been applied to the hydrological network of Iowa, utilizing a 90m DEM and 600,000 network nodes. The results indicate that the representation methods provide major improvements in query times and in storage of the network structure in the database. The suggested method allows watershed delineation tools to run on the client side with desktop-like performance. &lt;br /&gt;
* '''Challenge:''' Reproducibility, Transferability; Some of the internal steps to prepare data might require long computation time and different software environments.&lt;br /&gt;
* '''Relationship to other publications:''' The article is based on a new study&lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Ibrahim_Demir | Page]]&lt;br /&gt;
* '''Expected submission date:'''&lt;br /&gt;
&lt;br /&gt;
=== [Fulweiler 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:''' [[Wally Fulweiler]]&lt;br /&gt;
* '''Keywords of research area:'''&lt;br /&gt;
* '''Tentative title:'''&lt;br /&gt;
* '''Short abstract:'''&lt;br /&gt;
* '''Challenge:''' &lt;br /&gt;
* '''Relationship to other publications:''' (is the article based on a previously published article? is it new content?)&lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Wally_Fulweiler | Page]]&lt;br /&gt;
* '''Expected submission date:'''&lt;br /&gt;
&lt;br /&gt;
=== [Loh and Karlstrom 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:'''  [[Lay Kuan Loh]] and [[Leif Karlstrom]]&lt;br /&gt;
* '''Keywords of research area:''' Spatial clustering, Eigenvector selection, Entropy Ranking, Cascades Volcanic Region, [http://geosphere.gsapubs.org/content/3/3/152.abstract Afar Depression], [http://astrogeology.usgs.gov/search/details/Mars/Research/Volcanic/TharsisVents/zip Tharsis province]&lt;br /&gt;
* '''Tentative title:''' Characterization of volcanic vent distributions using spectral clustering with eigenvector selection and entropy ranking&lt;br /&gt;
* '''Short abstract:''' Volcanic vents on the surface of Earth and other planets often appear in groups that exhibit spatial patterning. Such vent distributions reflect complex interplay between time-evolving mechanical controls on the pathways of magma ascent, background tectonic stresses, and unsteady supply of rising magma. With the ultimate aim of connecting surface vent distributions with the dynamics of magma ascent, we have developed a clustering method to quantify spatial patterns in vents. Clustering is typically used in exploratory data analysis to identify groups with similar behavior by partitioning a dataset into clusters that share similar attributes. Traditional clustering algorithms that work well on simple point-cloud type synthetic datasets generally do not scale well to the real-world data we are interested in, where there are poor boundaries between clusters and much ambiguity in cluster assignments. We instead use a spectral clustering algorithm with eigenvector selection based on entropy ranking, building on work by [http://www.sciencedirect.com/science/article/pii/S0925231210001311 Zhao et al 2010], that outperforms traditional spectral clustering algorithms in choosing the right number of clusters for point data. We benchmark this algorithm on synthetic vent data with increasingly complex spatial distributions, to test its ability to accurately cluster vent data with variable spatial density, skewness, number of clusters, and proximity of clusters. We then apply our algorithm to several real-world datasets from the Cascades, the Afar Depression, and Mars. &lt;br /&gt;
* '''Challenge:''' Reproducibility (i.e., quantifying clustering); We plan to study how varying the statistical distribution, density, skewness, background noise, number of clusters, proximity of clusters, and combinations of any of these factors affects the performance of our algorithm. We test it against synthetic and real-world datasets.&lt;br /&gt;
* '''Relationship to other publications:''' New content, but one of the datasets we are studying in the paper (Cascades Volcanic Range) is based on a different paper we are preparing and planning to submit earlier. &lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Leif_Karlstrom | Page]]&lt;br /&gt;
* '''Expected submission date:''' June 2015&lt;br /&gt;
&lt;br /&gt;
=== [Lee 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:''' [[Kyo Lee]], Maziyar Boustani and Chris Mattmann, Jet Propulsion Laboratory&lt;br /&gt;
* '''Keywords of research area:''' North American regional climate, regional climate model evaluation system, Open Climate Workbench&lt;br /&gt;
* '''Tentative title:''' Evaluation of simulated temperature, precipitation, cloud fraction and insolation over the conterminous United States using Regional Climate Model Evaluation System&lt;br /&gt;
* '''Short abstract:''' This study describes the detailed process of evaluating model fidelity in simulating four key climate variables, surface air temperature, precipitation, cloud fraction, and insolation, and their covariability over the conterminous United States. The Regional Climate Model Evaluation System (RCMES), a suite of public databases and open-source software packages, provides both observational datasets and data processors useful for evaluating any climate model. In this paper, we provide a clear and easy-to-follow RCMES workflow to replicate published papers evaluating North American Regional Climate Change Assessment Program (NARCCAP) regional climate model (RCM) hindcast simulations using observations from a variety of sources. &lt;br /&gt;
* '''Challenge:''' Big Data Sharing, Dark Code; Sharing big data, better documenting source code, and encouraging the climate science community to use RCMES&lt;br /&gt;
* '''Relationship to other publications:''' [http://journals.ametsoc.org/doi/abs/10.1175/JCLI-D-12-00452.1 Kim et al. 2013], [http://link.springer.com/article/10.1007/s00382-014-2253-y Lee et al. 2014]&lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Kyo_Lee | Page]]&lt;br /&gt;
* '''Expected submission date:''' End of June 2015&lt;br /&gt;
&lt;br /&gt;
=== [Miller 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:''' [[Kim Miller]]&lt;br /&gt;
* '''Keywords of research area:'''&lt;br /&gt;
* '''Tentative title:'''&lt;br /&gt;
* '''Short abstract:'''&lt;br /&gt;
* '''Challenge:''' &lt;br /&gt;
* '''Relationship to other publications:''' (is the article based on a previously published article? is it new content?)&lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Kim_Miller | Page]]&lt;br /&gt;
* '''Expected submission date:'''&lt;br /&gt;
&lt;br /&gt;
=== [Mills 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:''' [[Heath Mills]], University of Houston Clear Lake; Brandi Kiel Reese, Texas A&amp;amp;M Corpus Christi&lt;br /&gt;
* '''Keywords of research area:'''&lt;br /&gt;
* '''Tentative title:''' Iron and Sulfur Cycling Biogeography Using Advanced Geochemical and Molecular Analyses&lt;br /&gt;
* '''Short abstract:''' My paper will develop and document a new pipeline for analyzing a combined and robust genetic and geochemical data set. New, reproducible methods will be highlighted in this manuscript to help others better analyze similar data sets. There is a general lack of guidance within my field for such challenges. This manuscript will be unique and helpful from an analysis standpoint as well as for the science being presented.&lt;br /&gt;
* '''Challenge:''' Reproducibility; Dark Code&lt;br /&gt;
* '''Relationship to other publications:''' Original Manuscript&lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Heith_Mills | Page]]&lt;br /&gt;
* '''Expected submission date:'''&lt;br /&gt;
&lt;br /&gt;
=== [Oh 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:''' [[Ji-Hyun Oh]], Jet Propulsion Laboratory/University of Southern California&lt;br /&gt;
* '''Keywords of research area:''' Tropical Meteorology, Madden-Julian Oscillation, Momentum budget analysis &lt;br /&gt;
* '''Tentative title:''' Tools for computing momentum budget for the westerly wind event associated with the Madden-Julian Oscillation&lt;br /&gt;
* '''Short abstract:''' As one of the most pronounced modes of tropical intraseasonal variability, the Madden-Julian Oscillation (MJO) prominently connects global weather and climate, and serves as one of the critical predictability sources for extended-range forecasting. The zonal circulation of the MJO is characterized by low-level westerlies (easterlies) in and to the west (east) of the convective center, respectively. The direction of zonal winds in the upper troposphere is opposite to that in the lower troposphere. In addition to the convective signal as an identifier of MJO initiation, certain characteristics of the zonal circulation have been used as a standard metric for monitoring the state of the MJO and for investigating features of the MJO and its impact on other atmospheric phenomena. This paper documents a tool for investigating the generation of low-level westerly winds during the MJO life cycle. The tool is used for momentum budget analysis to understand the respective contributions of the various processes involved in the wind evolution associated with the MJO, using European Centre for Medium-Range Weather Forecasts operational analyses from the Dynamics of the Madden–Julian Oscillation field campaign.&lt;br /&gt;
&lt;br /&gt;
* '''Challenge:''' Reproducibility, Dark Code; This paper will cover how to reproduce two key figures from the paper that I recently submitted to the Journal of the Atmospheric Sciences. This will include detailed procedures for generating the figures, such as how and where to download the data, how to transform the data into the format used as input for my codes, and so on. &lt;br /&gt;
* '''Relationship to other publications:''' This article is related to part of a paper submitted to the Journal of the Atmospheric Sciences. &lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Ji_Hyun | Page]]&lt;br /&gt;
* '''Expected submission date:'''&lt;br /&gt;
&lt;br /&gt;
=== [Pierce 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:''' [[Suzanne Pierce]] and John Gentle (Texas Advanced Computing Center and Jackson School of Geosciences, The University of Texas at Austin)&lt;br /&gt;
&lt;br /&gt;
* '''Keywords of research area:''' Decision Support Systems, Hydrogeology, Participatory Modeling, Data Fusion &lt;br /&gt;
* '''Tentative title:'''&lt;br /&gt;
* '''Short abstract:'''&lt;br /&gt;
&lt;br /&gt;
* '''Challenge:''' Reproducibility, Dark Code; Fully document a new software application and framework using example case study data and tutorials.&lt;br /&gt;
* '''Relationship to other publications:''' This article is new content&lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Suzanne_Pierce | Page]]&lt;br /&gt;
* '''Expected submission date:'''&lt;br /&gt;
&lt;br /&gt;
=== [Pope 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:''' [[Allen Pope]], National Snow and Ice Data Center, University of Colorado, Boulder&lt;br /&gt;
* '''Keywords of research area:''' Glaciology, Remote Sensing, Landsat 8, Polar Science&lt;br /&gt;
* '''Tentative title:''' Data and Code for Estimating and Evaluating Supraglacial Lake Depth With Landsat 8 and other Multispectral Sensors&lt;br /&gt;
* '''Short abstract:''' Supraglacial lakes play a significant role in glacial hydrological systems – for example, transporting water to the glacier bed in Greenland or leading to ice shelf fracture and disintegration in Antarctica. To investigate these important processes, multispectral remote sensing provides multiple methods for estimating supraglacial lake depth – either through single-band or band-ratio methods, both empirical and physically-based. Landsat 8 is the newest satellite in the Landsat series. With new bands, higher dynamic range, and higher radiometric resolution, the Operational Land Imager (OLI) aboard Landsat 8 has a lot of potential. &lt;br /&gt;
&lt;br /&gt;
This paper will document the data and code used in processing in situ reflectance spectra and depth measurements to investigate the ability of Landsat 8 to estimate lake depths using multiple methods, as well as quantify improvements over Landsat 7’s ETM+. A workflow, data, and code are provided to detail promising methods as applied to Landsat 8 OLI imagery of case study areas in Greenland, allowing calculation of regional volume estimates using 2013 and 2014 summer-season imagery. Altimetry from WorldView DEMs is used to validate lake depth estimates. The optimal method for supraglacial lake depth estimation with Landsat 8 is shown to be an average of single-band depths from the red and panchromatic bands. With this best method, preliminary investigation of the seasonal behavior and elevation distribution of lakes is also discussed and documented.&lt;br /&gt;
* '''Challenge:''' Reproducibility, Dark Code&lt;br /&gt;
* '''Relationship to other publications:''' Documenting and explaining the data and code behind the analysis and results presented in another paper.&lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Allen_Pope | Page]]&lt;br /&gt;
* '''Expected submission date:''' Late June 2015&lt;br /&gt;
&lt;br /&gt;
=== [Read and Winslow 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:''' [[Jordan Read]] and [[Luke Winslow]]&lt;br /&gt;
* '''Keywords of research area:'''&lt;br /&gt;
* '''Tentative title:'''&lt;br /&gt;
* '''Short abstract:'''&lt;br /&gt;
* '''Challenge:''' &lt;br /&gt;
* '''Relationship to other publications:''' (is the article based on a previously published article? is it new content?)&lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Jordan_Read | Page]]&lt;br /&gt;
* '''Expected submission date:'''&lt;br /&gt;
&lt;br /&gt;
=== [Tzeng 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:''' [[Mimi Tzeng]], Brian Dzwonkowski (DISL); Kyeong Park (TAMU Galveston)&lt;br /&gt;
* '''Keywords of research area:''' physical oceanography, remote sensing&lt;br /&gt;
* '''Tentative title:''' Fisheries Oceanography of Coastal Alabama (FOCAL): A Subset of a Time-Series of Hydrographic and Current Data from a Permanent Moored Station Outside Mobile Bay (27 Jan to 18 May 2011)&lt;br /&gt;
* '''Short abstract:''' The Fisheries Oceanography in Coastal Alabama (FOCAL) program began in 2006 as a way for scientists at the Dauphin Island Sea Lab (DISL) to study the natural variability of Alabama's nearshore environment as it relates to fisheries production. FOCAL provided a long-term baseline data set that included time-series hydrographic data from a permanent offshore mooring (ADCP, vertical thermistor array, and CTDs at surface and bottom) and shipboard surveys (vertical CTD profiles and water sampling), as well as monthly ichthyoplankton and zooplankton (depth-discrete) sample collections at FOCAL sites. The subset of data presented here is from the mooring and includes a vertical array of thermistors, CTDs at surface and bottom, an ADCP at the bottom, and vertical CTD profiles collected at the mooring during maintenance surveys. The mooring is located at 30 05.410'N 88 12.694'W, 25 km southwest of the entrance to Mobile Bay. Temperature, salinity, density, depth, and current velocity data were collected at 20-minute intervals from 2006 to 2012. Other parameters, such as dissolved oxygen, are available for portions of the time series depending on which instruments were deployed at the time.&lt;br /&gt;
* '''Challenge:''' Dark Code, Reproducibility; My paper will be about the processing of data in a larger dataset from which peer-reviewed papers have been written. The processing I did was not specific to any particular paper. I can point to an example paper that used some of the data I processed from this dataset; however, all of the figures in that paper are composites that also include other data from elsewhere that I had nothing to do with (and it would not be feasible to try to get hold of the other data within our timeframe).&lt;br /&gt;
* '''Relationship to other publications:''' A recent paper that used the part of the FOCAL data I'm documenting as the sample from the larger dataset: Dzwonkowski, Brian, Kyeong Park, Jungwoo Lee, Bret M. Webb, and Arnoldo Valle-Levinson. 2014. &amp;quot;Spatial variability of flow over a river-influenced inner shelf in coastal Alabama during spring.&amp;quot; Continental Shelf Research 74:25-34.&lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Mimi_Tzeng | Page]]&lt;br /&gt;
* '''Expected submission date:'''&lt;br /&gt;
&lt;br /&gt;
=== [Villamizar 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:''' [[Sandra Villamizar]], University of California, Merced&lt;br /&gt;
* '''Keywords of research area:''' river ecohydrology&lt;br /&gt;
* '''Tentative title:''' Producing long-term series of whole-stream metabolism using readily available data.  &lt;br /&gt;
* '''Short abstract:''' Continuous water quality and river discharge data that are readily available through government websites may be used to produce valuable information about key processes within a river ecosystem. In this paper I describe in detail the steps for acquisition and processing of river flow, dissolved oxygen, temperature, and specific conductance data that, combined with atmospheric data and physical properties of the river reach of interest, allow for the production of a long-term series of whole-stream metabolism. This information is key to understanding the structure and function of an ecosystem such as the San Joaquin River in the Central Valley of California, which has been increasingly degraded over the last 60 years by intensive human intervention but has been undergoing a restoration effort since 2010. The key advantage of this tool is that it uses readily available information to produce knowledge about a river ecosystem. This set of scripts, written in R, can be used immediately for any other river for which the key parameters (river flow, dissolved oxygen, temperature, and specific conductivity) are available. The scripts can also be modified by users to fit their particular site conditions.&lt;br /&gt;
 &lt;br /&gt;
* '''Challenge:''' Reproducibility; Dark Code; Document new software/applications. This set of scripts was written in response to the need to generate daily estimates of metabolic rates over long periods of time and at various sites within the San Joaquin River.&lt;br /&gt;
* '''Relationship to other publications:''' This will be a new publication&lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Sandra_Villamizar | Page]]&lt;br /&gt;
* '''Expected submission date:''' To be defined&lt;br /&gt;
&lt;br /&gt;
=== [Yu and Bhatt 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:''' [[Xuan Yu]], Department of Geological Sciences, University of Delaware. Gopal Bhatt, Department of Civil &amp;amp; Environmental Engineering, Pennsylvania State University. &lt;br /&gt;
* '''Keywords of research area:''' coupled processes, integrated hydrologic modeling, PIHM, surface flow, subsurface flow, open science&lt;br /&gt;
* '''Tentative title:''' Learning integrated modeling of surface and subsurface flow from scratch&lt;br /&gt;
* '''Short abstract:''' Integrated modeling of surface and subsurface flow has been of great interest for understanding not only the intimate interconnectedness of hydrological processes, but also land-surface energy balance, biogeochemical and ecological processes, and landscape evolution. Although a growing number of complex hydrologic models have been used for resolving environmental processes, testing hypotheses, and making hydrologic predictions for effective watershed management, very few model implementation resources have been made accessible to the large community of model users. Users must invest a significant amount of time and effort to reproduce and understand the workflow of a hydrologic simulation in a modeling paper. To provide a challenging and stimulating introduction to integrated modeling of surface and subsurface flow, in this paper we revisit the development of the Penn State Integrated Hydrologic Model (PIHM) by reproducing a numerical benchmarking example and a real-world catchment-scale application. Specifically, we document PIHM and its modeling workflow to enable a basic understanding of simulating coupled surface and subsurface flow processes. We provide the model and data to highlight the reciprocal roles between the two. In addition, we incorporate user experience as a third dimension in the modeling workflow to enable deeper communication between model developers and users. The workflow has important implications for smoothing and accelerating open scientific collaboration in geosciences research.&lt;br /&gt;
* '''Challenge:''' Reproducibility; Reproduce published simulations with the latest version of an existing model. Benchmark the modeling application against numerical experiments and field data.&lt;br /&gt;
* '''Relationship to other publications:''' The article is based on a previously published article. &lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Xuan_Yu | Page]]&lt;br /&gt;
* '''Expected submission date:''' End of June 2015&lt;br /&gt;
&lt;br /&gt;
== Special Issue Editors ==&lt;br /&gt;
&lt;br /&gt;
* Co-editor: Chris Duffy and/or Scott Peckham&lt;br /&gt;
* Co-editor: Cedric David&lt;br /&gt;
* Co-editor: possibly Karan Venayagamoorthy&lt;br /&gt;
&lt;br /&gt;
The editors will only accept submissions that follow the [[Develop_proposal_for_special_issue#Special_Issue_Review_Criteria | special issue review criteria]].&lt;br /&gt;
&lt;br /&gt;
The editors will select a set of reviewers to handle the submissions.  Reviewers will include computer scientists, library scientists, and geoscientists.&lt;br /&gt;
&lt;br /&gt;
== Special Issue Review Criteria ==&lt;br /&gt;
&lt;br /&gt;
The reviewers will be asked to provide feedback on the papers according to the following criteria. Note that some papers will have good reasons for limiting the information (e.g. the data is from third parties and not openly available, etc), and in that case they would document those reasons.&lt;br /&gt;
&lt;br /&gt;
* Documentation of the datasets: descriptions of datasets, unique identifiers, repositories.&lt;br /&gt;
* Documentation of software: description of all software used (including pre-processing of data, visualization steps, etc), unique identifiers, repositories.&lt;br /&gt;
* Documentation of the provenance of results: provenance for each figure or result, such as the workflow or the provenance record.&lt;br /&gt;
&lt;br /&gt;
== Tentative Timeline ==&lt;br /&gt;
&lt;br /&gt;
* Journal committed to special issue: April 15, 2015&lt;br /&gt;
* Submissions due to editors: June 30, 2015&lt;br /&gt;
* Reviews due: Sept 15, 2015&lt;br /&gt;
* Decisions out to authors: Sept 30, 2015&lt;br /&gt;
* Revisions due: October 31, 2015&lt;br /&gt;
* Final versions due November 15, 2015&lt;br /&gt;
* Issue published December 31, 2015&lt;br /&gt;
&lt;br /&gt;
&amp;lt;!-- Add any wiki Text above this Line --&amp;gt;&lt;br /&gt;
&amp;lt;!-- Do NOT Edit below this Line --&amp;gt;&lt;br /&gt;
{{#set:&lt;br /&gt;
	Owner=Chris_Duffy|&lt;br /&gt;
	Participants=Yolanda_Gil|&lt;br /&gt;
	Participants=Scott_Peckham|&lt;br /&gt;
	Participants=Cedric_David|&lt;br /&gt;
	Participants=Ibrahim_Demir|&lt;br /&gt;
	Participants=Wally_Fulweiler|&lt;br /&gt;
	Participants=Leif_Karlstrom|&lt;br /&gt;
	Participants=Kyo_Lee|&lt;br /&gt;
	Participants=Kim_Miller|&lt;br /&gt;
	Participants=Heath_Mills|&lt;br /&gt;
	Participants=Ji-Hyun_Oh|&lt;br /&gt;
	Participants=Suzanne_Pierce|&lt;br /&gt;
	Participants=Allen_Pope|&lt;br /&gt;
	Participants=Jordan_Read|&lt;br /&gt;
	Participants=Mimi_Tzeng|&lt;br /&gt;
	Participants=Sandra_Villamizar|&lt;br /&gt;
	Participants=Xuan_Yu|&lt;br /&gt;
	Progress=20|&lt;br /&gt;
	StartDate=2015-03-10|&lt;br /&gt;
	TargetDate=2015-03-16|&lt;br /&gt;
	Type=Low}}&lt;/div&gt;</summary>
		<author><name>Suzanne</name></author>	</entry>

	<entry>
		<id>https://www.organicdatascience.org/gpf/index.php?title=Develop_proposal_for_special_issue&amp;diff=11705</id>
		<title>Develop proposal for special issue</title>
		<link rel="alternate" type="text/html" href="https://www.organicdatascience.org/gpf/index.php?title=Develop_proposal_for_special_issue&amp;diff=11705"/>
				<updated>2015-04-03T17:55:32Z</updated>
		
		<summary type="html">&lt;p&gt;Suzanne: /* [Pierce 2015] */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;[[Category:Task]]&lt;br /&gt;
&lt;br /&gt;
== Background: Why a Special Issue on Geoscience Papers of the Future? ==&lt;br /&gt;
&lt;br /&gt;
[[Discuss_what_we_will_consider_a_GPF#The_Vision | Include here our discussion for the vision]]&lt;br /&gt;
&lt;br /&gt;
Background should be 1-2 pages.&lt;br /&gt;
&lt;br /&gt;
Motivated by the need to fully document research and to make it accessible and reproducible.&lt;br /&gt;
&lt;br /&gt;
=== Motivation: The EarthCube Initiative and the GeoSoft Project ===&lt;br /&gt;
&lt;br /&gt;
[http://www.geosoft-earthcube.org/about Include here background about GeoSoft from the web site]&lt;br /&gt;
&lt;br /&gt;
OSTP memo.  EarthCube reports.&lt;br /&gt;
Other reports that talk about the need for new approaches to editing.&lt;br /&gt;
&lt;br /&gt;
It's possible that small or very large contributions are not well captured in the current publishing paradigms.  Nanopublications.&lt;br /&gt;
&lt;br /&gt;
For example, nano-publications are a possible way to reflect advances in a research process that may not merit a full publication but are nevertheless useful to share with the community. A challenge here is the stigma against publishing units that are too small.&lt;br /&gt;
&lt;br /&gt;
Alternatively, a very large piece of research or work with many parts may be better suited to a GPF style publication.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Perhaps the concept of a 'paper' can be better reflected in the concept of a 'wrapper', a collection of materials and resources. The purpose is to ensure that publications are representative of the work, effort, and results achieved in the research process.&lt;br /&gt;
&lt;br /&gt;
=== What is a GPF ===&lt;br /&gt;
&lt;br /&gt;
[[Discuss_what_we_will_consider_a_GPF#What_is_a_Geoscience_Paper_of_the_Future.3F | Include here our discussion of what is a GPF]]&lt;br /&gt;
&lt;br /&gt;
=== The challenges of creating GPFs ===&lt;br /&gt;
&lt;br /&gt;
The articles in this issue reflect the current best practice for generating a Geoscience Paper of the Future.&lt;br /&gt;
&lt;br /&gt;
'''Figure discussions''': Do we want to reproduce exactly the same figure automatically?  Figures in the paper may be clean versions of an image generated by software.  To the extent possible, authors have included clear delineations of provenance. The goal is to ensure that readers can regenerate the figures using documented workflows, data, and code.  An important note (Allen, Sandra) is that figures are frequently generated by code, scripts, etc., yet the actual figure is finalized by the user.....  Mimi is trying to say: is it really worth belaboring the point about how the prettified version of the figure is made? If it is: both of the visualization packages I've used (Matlab and SigmaPlot) have actual code in the background that specifies how to set up the prettification, and this code can be found, copied out, and rerun to generate the exact same figure with all of the prettification in the same place. SigmaPlot uses Visual Basic (I think) in its macros. If explicit code is the important point, this should be doable. But I'm not sure it's strictly necessary to specify exactly where all the prettifications are to get the gist across.&lt;br /&gt;
&lt;br /&gt;
How much of your experimental history does one include?  (Ibrahim).  The experimental process often ends up nowhere.  Should we document all the failed experiments?  Get one DOI for the results of the successful experiment?  Another for failed trials?&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
'''''Documenting: Timing and Intermediate Processes'''''&lt;br /&gt;
When should we document and what are the bounds on what we document?&lt;br /&gt;
For example, should we document and include data and workflows for 'failed' experiments? Or should we assign datasets DOIs before we know the results from using them?  &lt;br /&gt;
The group thinks that good practices may include documenting and sharing data once you have a clear understanding of the outcomes worth reporting. For example, successful experiments should have clear, clean data documented and shared, whereas one strategy for 'failed' experiments could be bundling the intermediate datasets under one DOI with a more general discussion of the process/methods.&lt;br /&gt;
&lt;br /&gt;
=== Related work ===&lt;br /&gt;
&lt;br /&gt;
[[Discuss_what_we_will_consider_a_GPF#New_Frameworks_to_Create_a_New_Generation_of_Scientific_Articles | Include here the related work we have discussed]]&lt;br /&gt;
&lt;br /&gt;
== Papers to be included ==&lt;br /&gt;
&lt;br /&gt;
Would it be worthwhile to group the papers into broader categories rather than giving specifics about every single paper?&lt;br /&gt;
&lt;br /&gt;
For each submission, we describe:&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations'''&lt;br /&gt;
* '''Keywords of research area'''&lt;br /&gt;
* '''Tentative title'''&lt;br /&gt;
* '''Short abstract'''&lt;br /&gt;
* '''Challenge''' (including &amp;quot;Reproducibility,&amp;quot; &amp;quot;Dark Code,&amp;quot; &amp;quot;Sharing Big Data,&amp;quot; ...)&lt;br /&gt;
* '''Relationship to other publications''' (is the article based on a previously published article? is it new content? IF PREVIOUSLY PUBLISHED, PLS PROVIDE A POINTER TO THE PUBLISHED ARTICLE AND SPECIFY WHAT PERCENTAGE OF THE WORK PRESENTED WILL BE NEW)&lt;br /&gt;
* '''Pointer to the wiki page that documents the article'''&lt;br /&gt;
* '''Expected submission date'''&lt;br /&gt;
&lt;br /&gt;
=== [David 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:''' [[Cedric David]]&lt;br /&gt;
* '''Keywords of research area:''' Hydrology, Rivers, Modeling, Testing, Reproducibility. &lt;br /&gt;
* '''Tentative title:''' Going beyond triple-checking, allowing for peace of mind in community model development.&lt;br /&gt;
* '''Short abstract:''' The development of computer models in the general field of geoscience is often made incrementally over many years.  Endeavors that generally start on a single researcher's own machine evolve over time into software that is often much larger than was initially anticipated.  Looking back at years of building on their computer code, sometimes without much training in computer science, geoscience software developers can easily experience an overwhelming sense of incompetence when contemplating ways to further community usage of their software.  How does one allow others to use their code?  How can one foster survival of their tool?  How could one possibly ensure the scientific integrity of ongoing developments, including those made by others?  Common issues faced by geoscience developers include selecting a license, learning how to track and document past and ongoing changes, choosing a software repository, and allowing for community development.  This paper provides a brief summary of experience with the first three of these steps of software growth by focusing on the almost decade-long code development of a river routing model.  The core of this study, however, focuses on reproducing previously published experiments.  This step is highly repetitive and can therefore benefit greatly from automation.  Additionally, enabling automated software testing can arguably be considered the final step for sustainable software sharing, by allowing the main software developer to let go of a mental block concerning scientific integrity.  Creating tools to automatically compare the results of an updated version of a software package with those of previous studies can not only save the main developer's own time, it can also empower other researchers to check and demonstrate that their potential additions have retained scientific integrity.&lt;br /&gt;
* '''Challenge:''' Reproducibility; Ensure that updates to an existing model are able to reproduce a series of simulations published previously.&lt;br /&gt;
* '''Relationship to other publications:''' This research is related to past and ongoing development of the Routing Application for Parallel computatIon of Discharge (RAPID).  The primary focus of this paper is to allow automated reproducibility of at least the [http://dx.doi.org/10.1175/2011JHM1345.1 first RAPID publication].  The scientific subject of this GPF differs from the article(s) to be reproduced as its focus is on development of automatic testing methods.  In that regard, the paper is expected to be 95% new. &lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Cedric_David | Page]]&lt;br /&gt;
* '''Expected submission date:'''&lt;br /&gt;
&lt;br /&gt;
=== [Demir 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:''' [[Ibrahim Demir]]&lt;br /&gt;
* '''Keywords of research area:''' hydrological network, optimization, network representation, database query&lt;br /&gt;
* '''Tentative title:''' Analysis and Optimization of Hydrological Network Database Representation Methods for Fast Access and Query in Web-based System&lt;br /&gt;
* '''Short abstract:''' Web-based systems allow users to delineate watersheds on interactive map environments using server-side processing. With the increasing resolution of hydrological networks, optimized methods for storing network representations in databases and efficient queries and actions on the river network structure become critical. This paper presents a detailed analysis of widely used methods for representing hydrological networks in relational databases, benchmarking common queries and modifications of the network structure using these methods. The analysis has been applied to the hydrological network of Iowa, utilizing a 90m DEM and 600,000 network nodes. The results indicate that the representation methods provide massive improvements in query times and in storage of the network structure in the database. The suggested method allows watershed delineation tools to run on the client side with desktop-like performance.&lt;br /&gt;
* '''Challenge:''' Reproducibility, Transferability; Some of the internal steps to prepare data might require long computation time and different software environments.&lt;br /&gt;
* '''Relationship to other publications:''' The article is based on a new study&lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Ibrahim_Demir | Page]]&lt;br /&gt;
* '''Expected submission date:'''&lt;br /&gt;
&lt;br /&gt;
=== [Fulweiler 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:''' [[Wally Fulweiler]]&lt;br /&gt;
* '''Keywords of research area:'''&lt;br /&gt;
* '''Tentative title:'''&lt;br /&gt;
* '''Short abstract:'''&lt;br /&gt;
* '''Challenge:''' &lt;br /&gt;
* '''Relationship to other publications:''' (is the article based on a previously published article? is it new content?)&lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Wally_Fulweiler | Page]]&lt;br /&gt;
* '''Expected submission date:'''&lt;br /&gt;
&lt;br /&gt;
=== [Loh and Karlstrom 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:'''  [[Lay Kuan Loh]] and [[Leif Karlstrom]]&lt;br /&gt;
* '''Keywords of research area:''' Spatial clustering, Eigenvector selection, Entropy Ranking, Cascades Volcanic Region, [http://geosphere.gsapubs.org/content/3/3/152.abstract Afar Depression], [http://astrogeology.usgs.gov/search/details/Mars/Research/Volcanic/TharsisVents/zip Tharsis province]&lt;br /&gt;
* '''Tentative title:''' Characterization of volcanic vent distributions using spectral clustering with eigenvector selection and entropy ranking&lt;br /&gt;
* '''Short abstract:''' Volcanic vents on the surface of Earth and other planets often appear in groups that exhibit spatial patterning. Such vent distributions reflect complex interplay between time-evolving mechanical controls on the pathways of magma ascent, background tectonic stresses, and unsteady supply of rising magma. With the ultimate aim of connecting surface vent distributions with the dynamics of magma ascent, we have developed a clustering method to quantify spatial patterns in vents. Clustering is typically used in exploratory data analysis to identify groups with similar behavior by partitioning a dataset into clusters that share similar attributes. Traditional clustering algorithms that work well on simple point-cloud-type synthetic datasets generally do not scale well to the real-world data we are interested in, where there are poor boundaries between clusters and much ambiguity in cluster assignments. We instead use a spectral clustering algorithm with eigenvector selection based on entropy ranking, building on work by [http://www.sciencedirect.com/science/article/pii/S0925231210001311 Zhao et al 2010], that outperforms traditional spectral clustering algorithms in choosing the right number of clusters for point data. We benchmark this algorithm on synthetic vent data with increasingly complex spatial distributions to test its ability to accurately cluster vent data with variable spatial density, skewness, number of clusters, and proximity of clusters. We then apply our algorithm to several real-world datasets from the Cascades, the Afar Depression, and Mars.&lt;br /&gt;
* '''Challenge:''' Reproducibility (i.e., quantifying clustering); We plan to study how varying the statistical distribution, density, skewness, background noise, number of clusters, proximity of clusters, and combinations of any of these factors affects the performance of our algorithm. We test it against synthetic and real-world datasets.&lt;br /&gt;
* '''Relationship to other publications:''' New content, but one of the databases we are studying in the paper (Cascades Volcanic Range) would be based off a different paper we are preparing and planning to submit earlier. &lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Leif_Karlstrom | Page]]&lt;br /&gt;
* '''Expected submission date:''' June 2015&lt;br /&gt;
&lt;br /&gt;
=== [Lee 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:''' [[Kyo Lee]], Maziyar Boustani and Chris Mattmann, Jet Propulsion Laboratory&lt;br /&gt;
* '''Keywords of research area:''' North American regional climate, regional climate model evaluation system, Open Climate Workbench&lt;br /&gt;
* '''Tentative title:''' Evaluation of simulated temperature, precipitation, cloud fraction and insolation over the conterminous United States using Regional Climate Model Evaluation System&lt;br /&gt;
* '''Short abstract:''' This study describes the detailed process of evaluating model fidelity in simulating four key climate variables (surface air temperature, precipitation, cloud fraction, and insolation) and their covariability over the conterminous United States. The Regional Climate Model Evaluation System (RCMES), a suite of public databases and open-source software packages, provides both observational datasets and data processors useful for evaluating any climate model. In this paper, we provide a clear and easy-to-follow RCMES workflow to replicate published papers evaluating North American Regional Climate Change Assessment Program (NARCCAP) regional climate model (RCM) hindcast simulations using observations from a variety of sources.&lt;br /&gt;
* '''Challenge:''' Big Data Sharing, Dark Code; Sharing big data, better documenting source code, and encouraging the climate science community to use RCMES&lt;br /&gt;
* '''Relationship to other publications:''' [http://journals.ametsoc.org/doi/abs/10.1175/JCLI-D-12-00452.1 Kim et al. 2013], [http://link.springer.com/article/10.1007/s00382-014-2253-y Lee et al. 2014]&lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Kyo_Lee | Page]]&lt;br /&gt;
* '''Expected submission date:''' End of June 2015&lt;br /&gt;
&lt;br /&gt;
=== [Miller 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:''' [[Kim Miller]]&lt;br /&gt;
* '''Keywords of research area:'''&lt;br /&gt;
* '''Tentative title:'''&lt;br /&gt;
* '''Short abstract:'''&lt;br /&gt;
* '''Challenge:''' &lt;br /&gt;
* '''Relationship to other publications:''' (is the article based on a previously published article? is it new content?)&lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Kim_Miller | Page]]&lt;br /&gt;
* '''Expected submission date:'''&lt;br /&gt;
&lt;br /&gt;
=== [Mills 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:''' [[Heath Mills]], University of Houston Clear Lake; Brandi Kiel Reese, Texas A&amp;amp;M Corpus Christi&lt;br /&gt;
* '''Keywords of research area:'''&lt;br /&gt;
* '''Tentative title:'''Iron and Sulfur Cycling Biogeography Using Advanced Geochemical and Molecular Analyses&lt;br /&gt;
* '''Short abstract:''' My paper will develop and document a new pipeline to analyze a combined and robust genetic and geochemical data set. New, reproducible methods will be highlighted in this manuscript to help others better analyze similar data sets. There is a general lack of guidance within my field for such challenges. This manuscript will be unique and helpful from an analysis standpoint as well as for the science being presented.&lt;br /&gt;
* '''Challenge:''' Reproducibility; Dark Code&lt;br /&gt;
* '''Relationship to other publications:''' Original Manuscript&lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Heith_Mills | Page]]&lt;br /&gt;
* '''Expected submission date:'''&lt;br /&gt;
&lt;br /&gt;
=== [Oh 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:''' [[Ji-Hyun Oh]] Jet Propulsion Laboratory/University of Southern California&lt;br /&gt;
* '''Keywords of research area:''' Tropical Meteorology, Madden-Julian Oscillation, Momentum budget analysis &lt;br /&gt;
* '''Tentative title:''' Tools for computing momentum budget for the westerly wind event associated with the Madden-Julian Oscillation&lt;br /&gt;
* '''Short abstract:''' As one of the most pronounced modes of tropical intraseasonal variability, the Madden-Julian Oscillation (MJO) prominently connects global weather and climate and serves as one of the critical predictability sources for extended-range forecasting. The zonal circulation of the MJO is characterized by low-level westerlies (easterlies) in and to the west (east) of the convective center, respectively. The direction of zonal winds in the upper troposphere is opposite to that in the lower troposphere. In addition to the convective signal as an identifier of MJO initiation, certain characteristics of the zonal circulation have been used as a standard metric for monitoring the state of the MJO and investigating features of the MJO and its impact on other atmospheric phenomena. This paper documents a tool for investigating the generation of low-level westerly winds during the MJO life cycle. The tool is used for momentum budget analysis to understand the respective contributions of the various processes involved in the wind evolution associated with the MJO, using European Centre for Medium-Range Weather Forecasts operational analyses from the Dynamics of the Madden–Julian Oscillation field campaign.&lt;br /&gt;
&lt;br /&gt;
* '''Challenge:''' Reproducibility, Dark Code; This paper will cover how to reproduce two key figures from the paper that I recently submitted to the Journal of Atmospheric Science. This will include detailed procedures for generating the figures, such as how/where to download the data, how to transform the format of the data to be used as input for my codes, and so on.&lt;br /&gt;
* '''Relationship to other publications:''' This article is related to part of the paper submitted to the Journal of Atmospheric Science.&lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Ji_Hyun | Page]]&lt;br /&gt;
* '''Expected submission date:'''&lt;br /&gt;
&lt;br /&gt;
=== [Pierce 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:''' [[Suzanne Pierce]] and John Gentle (Texas Advanced Computing Center and Jackson School of Geosciences, The University of Texas at Austin)&lt;br /&gt;
&lt;br /&gt;
* '''Keywords of research area:''' Decision Support Systems, Hydrogeology, Participatory Modeling, Data Fusion &lt;br /&gt;
* '''Tentative title:'''&lt;br /&gt;
* '''Short abstract:'''&lt;br /&gt;
&lt;br /&gt;
* '''Challenge:''' Reproducibility, Dark Code; Fully document a new software application and framework using example case study data and tutorials.&lt;br /&gt;
* '''Relationship to other publications:''' This article is new content&lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Suzanne_Pierce | Page]]&lt;br /&gt;
* '''Expected submission date:'''&lt;br /&gt;
&lt;br /&gt;
=== [Pope 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:''' [[Allen Pope]], National Snow and Ice Data Center, University of Colorado, Boulder&lt;br /&gt;
* '''Keywords of research area:''' Glaciology, Remote Sensing, Landsat 8, Polar Science&lt;br /&gt;
* '''Tentative title:''' Data and Code for Estimating and Evaluating Supraglacial Lake Depth With Landsat 8 and other Multispectral Sensors&lt;br /&gt;
* '''Short abstract:''' Supraglacial lakes play a significant role in glacial hydrological systems – for example, transporting water to the glacier bed in Greenland or leading to ice shelf fracture and disintegration in Antarctica. To investigate these important processes, multispectral remote sensing provides multiple methods for estimating supraglacial lake depth – either through single-band or band-ratio methods, both empirical and physically-based. Landsat 8 is the newest satellite in the Landsat series. With new bands, higher dynamic range, and higher radiometric resolution, the Operational Land Imager (OLI) aboard Landsat 8 has a lot of potential. &lt;br /&gt;
&lt;br /&gt;
This paper will document the data and code used in processing in situ reflectance spectra and depth measurements to investigate the ability of Landsat 8 to estimate lake depths using multiple methods, as well as to quantify improvements over Landsat 7's ETM+. A workflow, data, and code are provided to detail promising methods as applied to Landsat 8 OLI imagery of case study areas in Greenland, allowing calculation of regional volume estimates using 2013 and 2014 summer-season imagery. Altimetry from WorldView DEMs is used to validate lake depth estimates. The optimal method for supraglacial lake depth estimation with Landsat 8 is shown to be an average of the single-band depths from the red and panchromatic bands. With this best method, a preliminary investigation of the seasonal behavior and elevation distribution of lakes is also discussed and documented.&lt;br /&gt;
* '''Challenge:''' Reproducibility, Dark Code&lt;br /&gt;
* '''Relationship to other publications:''' Documenting and explaining the data and code behind the analysis and results presented in another paper.&lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Allen_Pope | Page]]&lt;br /&gt;
* '''Expected submission date:''' Late June 2015&lt;br /&gt;
&lt;br /&gt;
=== [Read and Winslow 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:''' [[Jordan Read]] and [[Luke Winslow]]&lt;br /&gt;
* '''Keywords of research area:'''&lt;br /&gt;
* '''Tentative title:'''&lt;br /&gt;
* '''Short abstract:'''&lt;br /&gt;
* '''Challenge:''' &lt;br /&gt;
* '''Relationship to other publications:''' (is the article based on a previously published article? is it new content?)&lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Jordan_Read | Page]]&lt;br /&gt;
* '''Expected submission date:'''&lt;br /&gt;
&lt;br /&gt;
=== [Tzeng 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:''' [[Mimi Tzeng]], Brian Dzwonkowski (DISL); Kyeong Park (TAMU Galveston)&lt;br /&gt;
* '''Keywords of research area:''' physical oceanography, remote sensing&lt;br /&gt;
* '''Tentative title:''' Fisheries Oceanography of Coastal Alabama (FOCAL): A Subset of a Time-Series of Hydrographic and Current Data from a Permanent Moored Station Outside Mobile Bay (27 Jan to 18 May 2011)&lt;br /&gt;
* '''Short abstract:''' The Fisheries Oceanography in Coastal Alabama (FOCAL) program began in 2006 as a way for scientists at Dauphin Island Sea Lab (DISL) to study the natural variability of Alabama's nearshore environment as it relates to fisheries production. FOCAL provided a long-term baseline data set that included time-series hydrographic data from a permanent offshore mooring (ADCP, vertical thermistor array, and CTDs at surface and bottom) and shipboard surveys (vertical CTD profiles and water sampling), as well as monthly ichthyoplankton and zooplankton (depth-discrete) sample collections at FOCAL sites. The subset of data presented here is from the mooring and includes a vertical array of thermistors, CTDs at surface and bottom, an ADCP at the bottom, and vertical CTD profiles collected at the mooring during maintenance surveys. The mooring is located at 30 05.410'N 88 12.694'W, 25 km southwest of the entrance to Mobile Bay. Temperature, salinity, density, depth, and current velocity data were collected at 20-minute intervals from 2006 to 2012. Other parameters, such as dissolved oxygen, are available for portions of the time series, depending on which instruments were deployed at the time.&lt;br /&gt;
* '''Challenge:''' Dark Code, Reproducibility; My paper will be about the processing of data in a larger dataset from which peer-reviewed papers have been written. The processing I did was not specific to any particular paper. I can point to an example paper that used some of the data from this dataset that I processed; however, all of the figures in that paper are composites that also include other data from elsewhere that I had nothing to do with (and it would not be feasible to try to get hold of the other data within our timeframe).&lt;br /&gt;
* '''Relationship to other publications:''' A recent paper that used the part of the FOCAL data I'm documenting as the sample from the larger dataset: Dzwonkowski, Brian, Kyeong Park, Jungwoo Lee, Bret M. Webb, and Arnoldo Valle-Levinson. 2014. &amp;quot;Spatial variability of flow over a river-influenced inner shelf in coastal Alabama during spring.&amp;quot; Continental Shelf Research 74:25-34.&lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Mimi_Tzeng | Page]]&lt;br /&gt;
* '''Expected submission date:'''&lt;br /&gt;
&lt;br /&gt;
=== [Villamizar 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:''' [[Sandra Villamizar]], University of California, Merced&lt;br /&gt;
* '''Keywords of research area:''' river ecohydrology&lt;br /&gt;
* '''Tentative title:''' Producing long-term series of whole-stream metabolism using readily available data.  &lt;br /&gt;
* '''Short abstract:''' Continuous water quality and river discharge data that are readily available through government websites may be used to produce valuable information about key processes within a river ecosystem. In this paper I describe in detail the steps for acquiring and processing river flow, dissolved oxygen, temperature, and specific conductance data that, combined with atmospheric data and physical properties of the river reach of interest, allow for the production of a long-term series of whole-stream metabolism. This information is key to understanding the structure and function of an ecosystem such as the San Joaquin River in the Central Valley of California, which has been increasingly degraded during the last 60 years due to intensive human intervention but has been undergoing a restoration effort since 2010. The key advantage of this tool is that it uses readily available information to produce knowledge about a river ecosystem. This set of scripts, written in R, can be used immediately for any other river for which the key parameters (river flow, dissolved oxygen, temperature, and specific conductance) are available. The scripts can also be modified by users to fit their particular site conditions.&lt;br /&gt;
 &lt;br /&gt;
* '''Challenge:''' Reproducibility; Dark Code; Document new software/applications. This set of scripts was written out of the need to generate daily estimates of metabolic rates for long periods of time and at various sites within the San Joaquin River.&lt;br /&gt;
* '''Relationship to other publications:''' This will be a new publication&lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Sandra_Villamizar | Page]]&lt;br /&gt;
* '''Expected submission date:''' To be defined&lt;br /&gt;
&lt;br /&gt;
=== [Yu and Bhatt 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:''' [[Xuan Yu]], Department of Geological Sciences, University of Delaware. Gopal Bhatt, Department of Civil &amp;amp; Environmental Engineering, Pennsylvania State University. &lt;br /&gt;
* '''Keywords of research area:''' coupled processes, integrated hydrologic modeling, PIHM, surface flow, subsurface flow, open science&lt;br /&gt;
* '''Tentative title:''' Learning integrated modeling of surface and subsurface flow from scratch&lt;br /&gt;
* '''Short abstract:''' Integrated modeling of surface and subsurface flow has been of great interest for understanding not only the intimate interconnectedness of hydrological processes, but also land-surface energy balance, biogeochemical and ecological processes, and landscape evolution. Although a growing number of complex hydrologic models have been used for resolving environmental processes, testing hypotheses, and making hydrologic predictions for effective watershed management, very few model implementation resources have been made accessible to the large community of model users. Users must invest a significant amount of time and effort to reproduce, and to understand, the workflow of a hydrologic simulation in a modeling paper. To provide a challenging and stimulating introduction to integrated modeling of surface and subsurface flow, in this paper we revisit the development of the Penn State Integrated Hydrologic Model (PIHM) by reproducing a numerical benchmarking example and a real-world catchment-scale application. Specifically, we document PIHM and its modeling workflow to enable a basic understanding of simulating coupled surface and subsurface flow processes. We provide the model and data to highlight the reciprocal roles between the two. In addition, we incorporate user experience as a third dimension in the modeling workflow to enable deeper communication between model developers and users. The workflow has important implications for smoothing and accelerating open scientific collaboration in geosciences research.&lt;br /&gt;
* '''Challenge:''' Reproducibility; Reproduce previously published simulations with the latest version of an existing model; benchmark the modeling application against a numerical experiment and field data.&lt;br /&gt;
* '''Relationship to other publications:''' The article is based on a previously published article. &lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Xuan_Yu | Page]]&lt;br /&gt;
* '''Expected submission date:''' End of June 2015&lt;br /&gt;
&lt;br /&gt;
== Special Issue Editors ==&lt;br /&gt;
&lt;br /&gt;
* Co-editor: Chris Duffy and/or Scott Peckham&lt;br /&gt;
* Co-editor: Cedric David&lt;br /&gt;
* Co-editor: possibly Karan Venayagamoorthy&lt;br /&gt;
&lt;br /&gt;
The editors will only accept submissions that follow the [[Develop_proposal_for_special_issue#Special_Issue_Review_Criteria | special issue review criteria]].&lt;br /&gt;
&lt;br /&gt;
The editors will select a set of reviewers to handle the submissions.  Reviewers will include computer scientists, library scientists, and geoscientists.&lt;br /&gt;
&lt;br /&gt;
== Special Issue Review Criteria ==&lt;br /&gt;
&lt;br /&gt;
The reviewers will be asked to provide feedback on the papers according to the following criteria. Note that some papers will have good reasons for limiting the information (e.g., the data is from third parties and not openly available); in that case, they should document those reasons.&lt;br /&gt;
&lt;br /&gt;
* Documentation of the datasets: descriptions of datasets, unique identifiers, repositories.&lt;br /&gt;
* Documentation of software: description of all software used (including pre-processing of data, visualization steps, etc), unique identifiers, repositories.&lt;br /&gt;
* Documentation of the provenance of results: provenance for each figure or result, such as the workflow or the provenance record.&lt;br /&gt;
&lt;br /&gt;
== Tentative Timeline ==&lt;br /&gt;
&lt;br /&gt;
* Journal committed to special issue: April 15, 2015&lt;br /&gt;
* Submissions due to editors: June 30, 2015&lt;br /&gt;
* Reviews due: Sept 15, 2015&lt;br /&gt;
* Decisions out to authors: Sept 30, 2015&lt;br /&gt;
* Revisions due: October 31, 2015&lt;br /&gt;
* Final versions due November 15, 2015&lt;br /&gt;
* Issue published December 31, 2015&lt;br /&gt;
&lt;br /&gt;
&amp;lt;!-- Add any wiki Text above this Line --&amp;gt;&lt;br /&gt;
&amp;lt;!-- Do NOT Edit below this Line --&amp;gt;&lt;br /&gt;
{{#set:&lt;br /&gt;
	Owner=Chris_Duffy|&lt;br /&gt;
	Participants=Yolanda_Gil|&lt;br /&gt;
	Participants=Scott_Peckham|&lt;br /&gt;
	Participants=Cedric_David|&lt;br /&gt;
	Participants=Ibrahim_Demir|&lt;br /&gt;
	Participants=Wally_Fulweiler|&lt;br /&gt;
	Participants=Leif_Karlstrom|&lt;br /&gt;
	Participants=Kyo_Lee|&lt;br /&gt;
	Participants=Kim_Miller|&lt;br /&gt;
	Participants=Heath_Mills|&lt;br /&gt;
	Participants=Ji-Hyun_Oh|&lt;br /&gt;
	Participants=Suzanne_Pierce|&lt;br /&gt;
	Participants=Allen_Pope|&lt;br /&gt;
	Participants=Jordan_Read|&lt;br /&gt;
	Participants=Mimi_Tzeng|&lt;br /&gt;
	Participants=Sandra_Villamizar|&lt;br /&gt;
	Participants=Xuan_Yu|&lt;br /&gt;
	Progress=20|&lt;br /&gt;
	StartDate=2015-03-10|&lt;br /&gt;
	TargetDate=2015-03-16|&lt;br /&gt;
	Type=Low}}&lt;/div&gt;</summary>
		<author><name>Suzanne</name></author>	</entry>

	<entry>
		<id>https://www.organicdatascience.org/gpf/index.php?title=Develop_proposal_for_special_issue&amp;diff=11703</id>
		<title>Develop proposal for special issue</title>
		<link rel="alternate" type="text/html" href="https://www.organicdatascience.org/gpf/index.php?title=Develop_proposal_for_special_issue&amp;diff=11703"/>
				<updated>2015-04-03T17:54:50Z</updated>
		
		<summary type="html">&lt;p&gt;Suzanne: /* [Pierce 2015] */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;[[Category:Task]]&lt;br /&gt;
&lt;br /&gt;
== Background: Why a Special Issue on Geoscience Papers of the Future? ==&lt;br /&gt;
&lt;br /&gt;
[[Discuss_what_we_will_consider_a_GPF#The_Vision | Include here our discussion for the vision]]&lt;br /&gt;
&lt;br /&gt;
Background should be 1-2 pages.&lt;br /&gt;
&lt;br /&gt;
Motivated by the need to fully document research and make it accessible and reproducible.&lt;br /&gt;
&lt;br /&gt;
=== Motivation: The EarthCube Initiative and the GeoSoft Project ===&lt;br /&gt;
&lt;br /&gt;
[http://www.geosoft-earthcube.org/about Include here background about GeoSoft from the web site]&lt;br /&gt;
&lt;br /&gt;
OSTP memo.  EarthCube reports.&lt;br /&gt;
Other reports that talk about the need for new approaches to editing.&lt;br /&gt;
&lt;br /&gt;
It's possible that small or very large contributions are not well captured in the current publishing paradigms.  Nanopublications.&lt;br /&gt;
&lt;br /&gt;
For example, nanopublications are a possible way to reflect advances in a research process that may not merit a full publication but are nonetheless useful to share with the community. A challenge here is the stigma attached to publishing units that are very small.&lt;br /&gt;
&lt;br /&gt;
Alternatively, a very large piece of research or work with many parts may be better suited to a GPF style publication.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Perhaps the concept of a 'paper' can be better reflected in the concept of a 'wrapper': a collection of materials and resources. The purpose is to assure that publications are representative of the work, effort, and results achieved in the research process.&lt;br /&gt;
&lt;br /&gt;
=== What is a GPF ===&lt;br /&gt;
&lt;br /&gt;
[[Discuss_what_we_will_consider_a_GPF#What_is_a_Geoscience_Paper_of_the_Future.3F | Include here our discussion of what is a GPF]]&lt;br /&gt;
&lt;br /&gt;
=== The challenges of creating GPFs ===&lt;br /&gt;
&lt;br /&gt;
The articles in this issue reflect the current best practice for generating a Geoscience Paper of the Future.&lt;br /&gt;
&lt;br /&gt;
'''Figure discussions''': Do we want to reproduce exactly the same figure automatically?  Figures in a paper may be clean versions of an image generated by software.  To the extent possible, authors have included clear delineations of provenance. The goal is to assure that readers can regenerate the figures using documented workflows, data, and code.  An important note (Allen, Sandra) is that figures are frequently generated by code, scripts, etc., yet the actual figure is finalized by the user.  Mimi: is it really worth belaboring the point about how the prettified version of the figure is made? If it is: both of the visualization packages I've used (MATLAB and SigmaPlot) have actual code in the background that specifies how to set up the prettification, and this code can be found, copied out, and rerun to generate the exact same figure with all of the prettification in the same place. SigmaPlot uses Visual Basic (I think) in its macros. If explicit code is the important point, this should be doable. But I'm not sure it's strictly necessary to specify exactly where all the prettifications are to get the gist across.&lt;br /&gt;
&lt;br /&gt;
How much of one's experimental history should be included? (Ibrahim)  The experimental process often ends up nowhere.  Should we document all the failed experiments?  Get one DOI for the results of the successful experiment and another for the failed trials?&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
'''''Documenting: Timing and Intermediate Processes'''''&lt;br /&gt;
When should we document and what are the bounds on what we document?&lt;br /&gt;
For example, should we document and include data and workflows for 'failed' experiments? Or should we assign datasets DOIs before we know the results from using them?  &lt;br /&gt;
The group thinks that good ideas/practices may include documenting and sharing data once you have a clear understanding of which outcomes are worth reporting. For example, successful experiments should have clear, clean data documented and shared, whereas one strategy for 'failed' experiments could be bundling the intermediate datasets under one DOI with a more general discussion of the process/methods.&lt;br /&gt;
&lt;br /&gt;
=== Related work ===&lt;br /&gt;
&lt;br /&gt;
[[Discuss_what_we_will_consider_a_GPF#New_Frameworks_to_Create_a_New_Generation_of_Scientific_Articles | Include here the related work we have discussed]]&lt;br /&gt;
&lt;br /&gt;
== Papers to be included ==&lt;br /&gt;
&lt;br /&gt;
Would it be worthwhile to group the papers into broader categories rather than giving specifics about every single paper?&lt;br /&gt;
&lt;br /&gt;
For each submission, we describe:&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations'''&lt;br /&gt;
* '''Keywords of research area'''&lt;br /&gt;
* '''Tentative title'''&lt;br /&gt;
* '''Short abstract'''&lt;br /&gt;
* '''Challenge'''&lt;br /&gt;
* '''Relationship to other publications''' (is the article based on a previously published article? is it new content? IF PREVIOUSLY PUBLISHED, PLS PROVIDE A POINTER TO THE PUBLISHED ARTICLE AND SPECIFY WHAT PERCENTAGE OF THE WORK PRESENTED WILL BE NEW)&lt;br /&gt;
* '''Pointer to the wiki page that documents the article'''&lt;br /&gt;
* '''Expected submission date'''&lt;br /&gt;
&lt;br /&gt;
=== [David 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:''' [[Cedric David]]&lt;br /&gt;
* '''Keywords of research area:''' Hydrology, Rivers, Modeling, Testing, Reproducibility. &lt;br /&gt;
* '''Tentative title:''' Going beyond triple-checking, allowing for peace of mind in community model development.&lt;br /&gt;
* '''Short abstract:''' The development of computer models in the general field of geoscience is often made incrementally over many years.  Endeavors that generally start on a single researcher's own machine evolve over time into software that is often much larger than was initially anticipated.  Looking back at years of building on their computer code, sometimes without much training in computer science, geoscience software developers can easily experience an overwhelming sense of incompetence when contemplating ways to further community usage of their software.  How does one allow others to use one's code?  How can one foster the survival of one's tool?  How could one possibly ensure the scientific integrity of ongoing developments, including those made by others?  Common issues faced by geoscience developers include selecting a license, learning how to track and document past and ongoing changes, choosing a software repository, and allowing for community development.  This paper provides a brief summary of experience with the first three of these steps of software growth, focusing on the almost decade-long code development of a river routing model.  The core of this study, however, focuses on reproducing previously published experiments.  This step is highly repetitive and can therefore benefit greatly from automation.  Additionally, enabling automated software testing can arguably be considered the final step for sustainable software sharing, by allowing the main software developer to let go of a mental block concerning scientific integrity.  Creating tools to automatically compare the results of an updated version of a software package with those of previous studies can not only save the main developer's own time, it can also empower other researchers to check and demonstrate that their potential additions have retained scientific integrity.&lt;br /&gt;
* '''Challenge:''' Reproducibility; Ensure that updates to an existing model are able to reproduce a series of simulations published previously.&lt;br /&gt;
* '''Relationship to other publications:''' This research is related to past and ongoing development of the Routing Application for Parallel computatIon of Discharge (RAPID).  The primary focus of this paper is to allow automated reproducibility of at least the [http://dx.doi.org/10.1175/2011JHM1345.1 first RAPID publication].  The scientific subject of this GPF differs from the article(s) to be reproduced as its focus is on development of automatic testing methods.  In that regard, the paper is expected to be 95% new. &lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Cedric_David | Page]]&lt;br /&gt;
* '''Expected submission date:'''&lt;br /&gt;
&lt;br /&gt;
=== [Demir 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:''' [[Ibrahim Demir]]&lt;br /&gt;
* '''Keywords of research area:''' hydrological network, optimization, network representation, database query&lt;br /&gt;
* '''Tentative title:''' Analysis and Optimization of Hydrological Network Database Representation Methods for Fast Access and Query in Web-based System&lt;br /&gt;
* '''Short abstract:''' Web-based systems allow users to delineate watersheds in interactive map environments using server-side processing. As the resolution of hydrological networks increases, optimized methods for storing network representations in databases, along with efficient queries and actions on the river network structure, become critical. This paper presents a detailed analysis of widely used methods for representing hydrological networks in relational databases, benchmarking common queries and modifications of the network structure using these methods. The analysis has been applied to the hydrological network of Iowa, utilizing a 90m DEM and 600,000 network nodes. The results indicate that the representation methods provide massive improvements in query times and in the storage of the network structure in the database. The suggested method allows watershed delineation tools to run on the client side with desktop-like performance.&lt;br /&gt;
* '''Challenge:''' Reproducibility, Transferability; Some of the internal steps to prepare data might require long computation time and different software environments.&lt;br /&gt;
* '''Relationship to other publications:''' The article is based on a new study&lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Ibrahim_Demir | Page]]&lt;br /&gt;
* '''Expected submission date:'''&lt;br /&gt;
&lt;br /&gt;
=== [Fulweiler 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:''' [[Wally Fulweiler]]&lt;br /&gt;
* '''Keywords of research area:'''&lt;br /&gt;
* '''Tentative title:'''&lt;br /&gt;
* '''Short abstract:'''&lt;br /&gt;
* '''Challenge:''' &lt;br /&gt;
* '''Relationship to other publications:''' (is the article based on a previously published article? is it new content?)&lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Wally_Fulweiler | Page]]&lt;br /&gt;
* '''Expected submission date:'''&lt;br /&gt;
&lt;br /&gt;
=== [Loh and Karlstrom 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:'''  [[Lay Kuan Loh]] and [[Leif Karlstrom]]&lt;br /&gt;
* '''Keywords of research area:''' Spatial clustering, Eigenvector selection, Entropy Ranking, Cascades Volcanic Region, [http://geosphere.gsapubs.org/content/3/3/152.abstract Afar Depression], [http://astrogeology.usgs.gov/search/details/Mars/Research/Volcanic/TharsisVents/zip Tharsis province]&lt;br /&gt;
* '''Tentative title:''' Characterization of volcanic vent distributions using spectral clustering with eigenvector selection and entropy ranking&lt;br /&gt;
* '''Short abstract:''' Volcanic vents on the surface of Earth and other planets often appear in groups that exhibit spatial patterning. Such vent distributions reflect a complex interplay between time-evolving mechanical controls on the pathways of magma ascent, background tectonic stresses, and unsteady supply of rising magma. With the ultimate aim of connecting surface vent distributions with the dynamics of magma ascent, we have developed a clustering method to quantify spatial patterns in vents. Clustering is typically used in exploratory data analysis to identify groups with similar behavior by partitioning a dataset into clusters that share similar attributes. Traditional clustering algorithms that work well on simple point-cloud-type synthetic datasets generally do not scale well to the real-world data we are interested in, where there are poor boundaries between clusters and much ambiguity in cluster assignments. We instead use a spectral clustering algorithm with eigenvector selection based on entropy ranking, building on work by [http://www.sciencedirect.com/science/article/pii/S0925231210001311 Zhao et al. 2010], which outperforms traditional spectral clustering algorithms in choosing the right number of clusters for point data. We benchmark this algorithm on synthetic vent data with increasingly complex spatial distributions, to test the ability to accurately cluster vent data with variable spatial density, skewness, number of clusters, and proximity of clusters. We then apply our algorithm to several real-world datasets from the Cascades, the Afar Depression, and Mars. &lt;br /&gt;
* '''Challenge:''' Reproducibility (i.e., quantifying clustering); We plan to study how varying the statistical distribution, density, skewness, background noise, number of clusters, proximity of clusters, and combinations of any of these factors affects the performance of our algorithm. We test it against synthetic and real-world datasets.&lt;br /&gt;
* '''Relationship to other publications:''' New content, but one of the datasets we are studying in the paper (Cascades Volcanic Range) is based on a different paper we are preparing and planning to submit earlier.&lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Leif_Karlstrom | Page]]&lt;br /&gt;
* '''Expected submission date:''' June 2015&lt;br /&gt;
&lt;br /&gt;
=== [Lee 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:''' [[Kyo Lee]], Maziyar Boustani and Chris Mattmann, Jet Propulsion Laboratory&lt;br /&gt;
* '''Keywords of research area:''' North American regional climate, regional climate model evaluation system, Open Climate Workbench&lt;br /&gt;
* '''Tentative title:''' Evaluation of simulated temperature, precipitation, cloud fraction and insolation over the conterminous United States using Regional Climate Model Evaluation System&lt;br /&gt;
* '''Short abstract:''' This study describes the detailed process of evaluating model fidelity in simulating four key climate variables (surface air temperature, precipitation, cloud fraction, and insolation) and their covariability over the conterminous United States. The Regional Climate Model Evaluation System (RCMES), a suite of public databases and open-source software packages, provides both observational datasets and data processors useful for evaluating any climate model. In this paper, we provide a clear and easy-to-follow RCMES workflow to replicate published papers evaluating North American Regional Climate Change Assessment Program (NARCCAP) regional climate model (RCM) hindcast simulations using observations from a variety of sources. &lt;br /&gt;
* '''Challenge:''' Big Data Sharing, Dark Code; Sharing big data, better documenting source code, and encouraging the climate science community to use RCMES&lt;br /&gt;
* '''Relationship to other publications:''' [http://journals.ametsoc.org/doi/abs/10.1175/JCLI-D-12-00452.1 Kim et al. 2013], [http://link.springer.com/article/10.1007/s00382-014-2253-y Lee et al. 2014]&lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Kyo_Lee | Page]]&lt;br /&gt;
* '''Expected submission date:''' End of June 2015&lt;br /&gt;
&lt;br /&gt;
=== [Miller 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:''' [[Kim Miller]]&lt;br /&gt;
* '''Keywords of research area:'''&lt;br /&gt;
* '''Tentative title:'''&lt;br /&gt;
* '''Short abstract:'''&lt;br /&gt;
* '''Challenge:''' &lt;br /&gt;
* '''Relationship to other publications:''' (is the article based on a previously published article? is it new content?)&lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Kim_Miller | Page]]&lt;br /&gt;
* '''Expected submission date:'''&lt;br /&gt;
&lt;br /&gt;
=== [Mills 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:''' [[Heath Mills]], University of Houston Clear Lake; Brandi Kiel Reese, Texas A&amp;amp;M Corpus Christi&lt;br /&gt;
* '''Keywords of research area:'''&lt;br /&gt;
* '''Tentative title:''' Iron and Sulfur Cycling Biogeography Using Advanced Geochemical and Molecular Analyses&lt;br /&gt;
* '''Short abstract:''' My paper will develop and document a new pipeline to analyze a combined and robust genetic and geochemical data set. New, reproducible methods will be highlighted in this manuscript to help others better analyze similar data sets. There is a general lack of guidance within my field for such challenges. This manuscript will be unique and helpful from an analysis standpoint as well as for the science being presented.&lt;br /&gt;
* '''Challenge:''' Reproducibility; Dark Code&lt;br /&gt;
* '''Relationship to other publications:''' Original Manuscript&lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Heith_Mills | Page]]&lt;br /&gt;
* '''Expected submission date:'''&lt;br /&gt;
&lt;br /&gt;
=== [Oh 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:''' [[Ji-Hyun Oh]], Jet Propulsion Laboratory/University of Southern California&lt;br /&gt;
* '''Keywords of research area:''' Tropical Meteorology, Madden-Julian Oscillation, Momentum budget analysis &lt;br /&gt;
* '''Tentative title:''' Tools for computing momentum budget for the westerly wind event associated with the Madden-Julian Oscillation&lt;br /&gt;
* '''Short abstract:''' As one of the most pronounced modes of tropical intraseasonal variability, the Madden-Julian Oscillation (MJO) prominently connects global weather and climate and serves as one of the critical predictability sources for extended-range forecasting. The zonal circulation of the MJO is characterized by low-level westerlies (easterlies) in and to the west (east) of the convective center, respectively. The direction of the zonal winds in the upper troposphere is opposite to that in the lower troposphere. In addition to the convective signal as an identifier of MJO initiation, certain characteristics of the zonal circulation have been used as a standard metric for monitoring the state of the MJO and for investigating features of the MJO and its impact on other atmospheric phenomena. This paper documents a tool for investigating the generation of low-level westerly winds during the MJO life cycle. The tool is used for a momentum budget analysis to understand the respective contributions of the various processes involved in the wind evolution associated with the MJO, using European Centre for Medium-Range Weather Forecasts operational analyses during the Dynamics of the Madden–Julian Oscillation field campaign.&lt;br /&gt;
&lt;br /&gt;
* '''Challenge:''' Reproducibility, Dark Code; This paper will cover how to reproduce two key figures from the paper that I recently submitted to the Journal of the Atmospheric Sciences. This will include detailed procedures for generating the figures, such as how and where to download the data, how to transform the format of the data to be used as input for my code, and so on. &lt;br /&gt;
* '''Relationship to other publications:''' This article is related to part of the paper submitted to the Journal of the Atmospheric Sciences. &lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Ji_Hyun | Page]]&lt;br /&gt;
* '''Expected submission date:'''&lt;br /&gt;
&lt;br /&gt;
=== [Pierce 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:'''  [[Suzanne A Pierce]] and John Gentle (Texas Advanced Computing Center and Jackson School of Geosciences, The University of Texas at Austin)&lt;br /&gt;
&lt;br /&gt;
* '''Keywords of research area:''' Decision Support Systems, Hydrogeology, Participatory Modeling, Data Fusion &lt;br /&gt;
* '''Tentative title:'''&lt;br /&gt;
* '''Short abstract:'''&lt;br /&gt;
&lt;br /&gt;
* '''Challenge:''' Reproducibility, Dark Code; Fully document a new software application and framework using example case study data and tutorials.&lt;br /&gt;
* '''Relationship to other publications:''' This article is new content&lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Suzanne_Pierce | Page]]&lt;br /&gt;
* '''Expected submission date:'''&lt;br /&gt;
&lt;br /&gt;
=== [Pope 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:''' [[Allen Pope]], National Snow and Ice Data Center, University of Colorado, Boulder&lt;br /&gt;
* '''Keywords of research area:''' Glaciology, Remote Sensing, Landsat 8, Polar Science&lt;br /&gt;
* '''Tentative title:''' Data and Code for Estimating and Evaluating Supraglacial Lake Depth With Landsat 8 and other Multispectral Sensors&lt;br /&gt;
* '''Short abstract:''' Supraglacial lakes play a significant role in glacial hydrological systems – for example, transporting water to the glacier bed in Greenland or leading to ice shelf fracture and disintegration in Antarctica. To investigate these important processes, multispectral remote sensing provides multiple methods for estimating supraglacial lake depth – either through single-band or band-ratio methods, both empirical and physically-based. Landsat 8 is the newest satellite in the Landsat series. With new bands, higher dynamic range, and higher radiometric resolution, the Operational Land Imager (OLI) aboard Landsat 8 has a lot of potential. &lt;br /&gt;
&lt;br /&gt;
This paper will document the data and code used in processing in situ reflectance spectra and depth measurements to investigate the ability of Landsat 8 to estimate lake depths using multiple methods, and to quantify improvements over Landsat 7’s ETM+. A workflow, data, and code are provided to detail promising methods as applied to Landsat 8 OLI imagery of case study areas in Greenland, allowing calculation of regional volume estimates using 2013 and 2014 summer-season imagery. Altimetry from WorldView DEMs is used to validate lake depth estimates. The optimal method for supraglacial lake depth estimation with Landsat 8 is shown to be an average of the single-band depths from the red and panchromatic bands. Using this method, a preliminary investigation of the seasonal behavior and elevation distribution of lakes is also discussed and documented.&lt;br /&gt;
* '''Challenge:''' Reproducibility, Dark Code&lt;br /&gt;
* '''Relationship to other publications:''' Documenting and explaining the data and code behind the analysis and results presented in another paper.&lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Allen_Pope | Page]]&lt;br /&gt;
* '''Expected submission date:''' Late June 2015&lt;br /&gt;
&lt;br /&gt;
=== [Read and Winslow 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:''' [[Jordan Read]] and [[Luke Winslow]]&lt;br /&gt;
* '''Keywords of research area:'''&lt;br /&gt;
* '''Tentative title:'''&lt;br /&gt;
* '''Short abstract:'''&lt;br /&gt;
* '''Challenge:''' &lt;br /&gt;
* '''Relationship to other publications:''' (is the article based on a previously published article? is it new content?)&lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Jordan_Read | Page]]&lt;br /&gt;
* '''Expected submission date:'''&lt;br /&gt;
&lt;br /&gt;
=== [Tzeng 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:''' [[Mimi Tzeng]], Brian Dzwonkowski (DISL); Kyeong Park (TAMU Galveston)&lt;br /&gt;
* '''Keywords of research area:''' physical oceanography, remote sensing&lt;br /&gt;
* '''Tentative title:''' Fisheries Oceanography of Coastal Alabama (FOCAL): A Subset of a Time-Series of Hydrographic and Current Data from a Permanent Moored Station Outside Mobile Bay (27 Jan to 18 May 2011)&lt;br /&gt;
* '''Short abstract:''' The Fisheries Oceanography in Coastal Alabama (FOCAL) program began in 2006 as a way for scientists at Dauphin Island Sea Lab (DISL) to study the natural variability of Alabama's nearshore environment as it relates to fisheries production. FOCAL provided a long-term baseline data set that included time-series hydrographic data from a permanent offshore mooring (ADCP, vertical thermistor array, and CTDs at surface and bottom) and shipboard surveys (vertical CTD profiles and water sampling), as well as monthly ichthyoplankton and zooplankton (depth-discrete) sample collections at FOCAL sites. The subset of data presented here is from the mooring and includes a vertical array of thermistors, CTDs at surface and bottom, an ADCP at the bottom, and vertical CTD profiles collected at the mooring during maintenance surveys. The mooring is located at 30 05.410'N 88 12.694'W, 25 km southwest of the entrance to Mobile Bay. Temperature, salinity, density, depth, and current velocity data were collected at 20-minute intervals from 2006 to 2012. Other parameters, such as dissolved oxygen, are available for portions of the time series, depending on which instruments were deployed at the time.&lt;br /&gt;
* '''Challenge:''' Dark Code, Reproducibility; My paper will be about the processing of data in a larger dataset from which peer-reviewed papers have been written. The processing I did was not specific to any particular paper. I can point to an example paper that used some of the data I processed; however, all of the figures in that paper are composites that also include other data from elsewhere that I had nothing to do with (and it would not be feasible to obtain the other data within our timeframe).&lt;br /&gt;
* '''Relationship to other publications:''' A recent paper that used the part of the FOCAL data I'm documenting as the sample from the larger dataset: Dzwonkowski, Brian, Kyeong Park, Jungwoo Lee, Bret M. Webb, and Arnoldo Valle-Levinson. 2014. &amp;quot;Spatial variability of flow over a river-influenced inner shelf in coastal Alabama during spring.&amp;quot; Continental Shelf Research 74:25-34.&lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Mimi_Tzeng | Page]]&lt;br /&gt;
* '''Expected submission date:'''&lt;br /&gt;
&lt;br /&gt;
=== [Villamizar 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:''' [[Sandra Villamizar]], University of California, Merced&lt;br /&gt;
* '''Keywords of research area:''' river ecohydrology&lt;br /&gt;
* '''Tentative title:''' Producing long-term series of whole-stream metabolism using readily available data.  &lt;br /&gt;
* '''Short abstract:''' Continuous water quality and river discharge data that are readily available through government websites may be used to produce valuable information about key processes within a river ecosystem. In this paper I describe in detail the steps for acquiring and processing river flow, dissolved oxygen, temperature, and specific conductance data that, combined with atmospheric data and the physical properties of the river reach of interest, allow for the production of a long-term series of whole-stream metabolism. This information is key to understanding the structure and function of an ecosystem such as the San Joaquin River in the Central Valley of California, which has been increasingly degraded over the last 60 years by intensive human intervention and has, since 2010, been the subject of a restoration effort. The key advantage of this tool is that it uses readily available information to produce knowledge about a river ecosystem. This set of scripts, written in R, can be used immediately for any other river for which the key parameters (river flow, dissolved oxygen, temperature, and specific conductance) are available. The scripts can also be modified by users to fit their particular site conditions.&lt;br /&gt;
 &lt;br /&gt;
* '''Challenge:''' Reproducibility; Dark Code; Document new software/applications. This set of scripts was written out of the need to generate daily estimates of metabolic rates over long periods of time and at various sites within the San Joaquin River.  &lt;br /&gt;
* '''Relationship to other publications:''' This will be a new publication&lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Sandra_Villamizar | Page]]&lt;br /&gt;
* '''Expected submission date:''' To be defined&lt;br /&gt;
&lt;br /&gt;
=== [Yu and Bhatt 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:''' [[Xuan Yu]], Department of Geological Sciences, University of Delaware. Gopal Bhatt, Department of Civil &amp;amp; Environmental Engineering, Pennsylvania State University. &lt;br /&gt;
* '''Keywords of research area:''' coupled processes, integrated hydrologic modeling, PIHM, surface flow, subsurface flow, open science&lt;br /&gt;
* '''Tentative title:''' Learning integrated modeling of surface and subsurface flow from scratch&lt;br /&gt;
* '''Short abstract:''' Integrated modeling of surface and subsurface flow has been of great interest for understanding not only the intimate interconnectedness of hydrological processes, but also land-surface energy balance, biogeochemical and ecological processes, and landscape evolution. Although a growing number of complex hydrologic models have been used for resolving environmental processes, testing hypotheses, and making hydrologic predictions for effective watershed management, very few model implementation resources have been made accessible to the broader community of model users. Users must invest a significant amount of time and effort to reproduce, and to understand, the workflow of a hydrologic simulation in a modeling paper. To provide a challenging and stimulating introduction to integrated modeling of surface and subsurface flow, in this paper we revisit the development of the Penn State Integrated Hydrologic Model (PIHM) by reproducing a numerical benchmarking example and a real-world catchment-scale application. Specifically, we document PIHM and its modeling workflow to enable a basic understanding of simulating coupled surface and subsurface flow processes. We provide the model and data to highlight the reciprocal roles between the two. In addition, we incorporate user experience as a third dimension in the modeling workflow to enable deeper communication between model developers and users. The workflow has important implications for smoothing and accelerating open scientific collaboration in geosciences research.&lt;br /&gt;
* '''Challenge:''' Reproducibility; Reproduce previously published simulations with the latest version of an existing model. Benchmark the modeling application against a numerical experiment and field data.&lt;br /&gt;
* '''Relationship to other publications:''' The article is based on a previously published article. &lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Xuan_Yu | Page]]&lt;br /&gt;
* '''Expected submission date:''' End of June 2015&lt;br /&gt;
&lt;br /&gt;
== Special Issue Editors ==&lt;br /&gt;
&lt;br /&gt;
* Co-editor: Chris Duffy and/or Scott Peckham&lt;br /&gt;
* Co-editor: Cedric David&lt;br /&gt;
* Co-editor: possibly Karan Venayagamoorthy&lt;br /&gt;
&lt;br /&gt;
The editors will only accept submissions that follow the [[Develop_proposal_for_special_issue#Special_Issue_Review_Criteria | special issue review criteria]].&lt;br /&gt;
&lt;br /&gt;
The editors will select a set of reviewers to handle the submissions.  Reviewers will include computer scientists, library scientists, and geoscientists.&lt;br /&gt;
&lt;br /&gt;
== Special Issue Review Criteria ==&lt;br /&gt;
&lt;br /&gt;
The reviewers will be asked to provide feedback on the papers according to the following criteria. Note that some papers will have good reasons for limiting the information (e.g., the data are from third parties and not openly available); in that case, the authors should document those reasons.&lt;br /&gt;
&lt;br /&gt;
* Documentation of the datasets: descriptions of datasets, unique identifiers, repositories.&lt;br /&gt;
* Documentation of software: description of all software used (including pre-processing of data, visualization steps, etc), unique identifiers, repositories.&lt;br /&gt;
* Documentation of the provenance of results: provenance for each figure or result, such as the workflow or the provenance record.&lt;br /&gt;
&lt;br /&gt;
== Tentative Timeline ==&lt;br /&gt;
&lt;br /&gt;
* Journal committed to special issue: April 15, 2015&lt;br /&gt;
* Submissions due to editors: June 30, 2015&lt;br /&gt;
* Reviews due: Sept 15, 2015&lt;br /&gt;
* Decisions out to authors: Sept 30, 2015&lt;br /&gt;
* Revisions due: October 31, 2015&lt;br /&gt;
* Final versions due: November 15, 2015&lt;br /&gt;
* Issue published: December 31, 2015&lt;br /&gt;
&lt;br /&gt;
&amp;lt;!-- Add any wiki Text above this Line --&amp;gt;&lt;br /&gt;
&amp;lt;!-- Do NOT Edit below this Line --&amp;gt;&lt;br /&gt;
{{#set:&lt;br /&gt;
	Owner=Chris_Duffy|&lt;br /&gt;
	Participants=Yolanda_Gil|&lt;br /&gt;
	Participants=Scott_Peckham|&lt;br /&gt;
	Participants=Cedric_David|&lt;br /&gt;
	Participants=Ibrahim_Demir|&lt;br /&gt;
	Participants=Wally_Fulweiler|&lt;br /&gt;
	Participants=Leif_Karlstrom|&lt;br /&gt;
	Participants=Kyo_Lee|&lt;br /&gt;
	Participants=Kim_Miller|&lt;br /&gt;
	Participants=Heath_Mills|&lt;br /&gt;
	Participants=Ji-Hyun_Oh|&lt;br /&gt;
	Participants=Suzanne_Pierce|&lt;br /&gt;
	Participants=Allen_Pope|&lt;br /&gt;
	Participants=Jordan_Read|&lt;br /&gt;
	Participants=Mimi_Tzeng|&lt;br /&gt;
	Participants=Sandra_Villamizar|&lt;br /&gt;
	Participants=Xuan_Yu|&lt;br /&gt;
	Progress=20|&lt;br /&gt;
	StartDate=2015-03-10|&lt;br /&gt;
	TargetDate=2015-03-16|&lt;br /&gt;
	Type=Low}}&lt;/div&gt;</summary>
		<author><name>Suzanne</name></author>	</entry>

	<entry>
		<id>https://www.organicdatascience.org/gpf/index.php?title=Develop_proposal_for_special_issue&amp;diff=11700</id>
		<title>Develop proposal for special issue</title>
		<link rel="alternate" type="text/html" href="https://www.organicdatascience.org/gpf/index.php?title=Develop_proposal_for_special_issue&amp;diff=11700"/>
				<updated>2015-04-03T17:53:50Z</updated>
		
		<summary type="html">&lt;p&gt;Suzanne: /* [Pierce 2015] */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;[[Category:Task]]&lt;br /&gt;
&lt;br /&gt;
== Background: Why a Special Issue on Geoscience Papers of the Future? ==&lt;br /&gt;
&lt;br /&gt;
[[Discuss_what_we_will_consider_a_GPF#The_Vision | Include here our discussion for the vision]]&lt;br /&gt;
&lt;br /&gt;
Background should be 1-2 pages.&lt;br /&gt;
&lt;br /&gt;
Motivated by the need to fully document research and make it accessible and reproducible. &lt;br /&gt;
&lt;br /&gt;
=== Motivation: The EarthCube Initiative and the GeoSoft Project ===&lt;br /&gt;
&lt;br /&gt;
[http://www.geosoft-earthcube.org/about Include here background about GeoSoft from the web site]&lt;br /&gt;
&lt;br /&gt;
OSTP memo.  EarthCube reports.&lt;br /&gt;
Other reports that talk about the need for new approaches to editing.&lt;br /&gt;
&lt;br /&gt;
It's possible that small or very large contributions are not well captured in the current publishing paradigms.  Nanopublications.&lt;br /&gt;
&lt;br /&gt;
For example, nano-publications are a possible way to reflect advances in a research process that may not merit a full publication but are nonetheless useful to share with the community. A challenge here is the stigma attached to publishing units that are considered too small.  &lt;br /&gt;
&lt;br /&gt;
Alternatively, a very large piece of research or work with many parts may be better suited to a GPF style publication.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Perhaps the concept of a 'paper' can be better reflected in the concept of a 'wrapper': a collection of materials and resources. The purpose is to ensure that publications are representative of the work, effort, and results achieved in the research process.&lt;br /&gt;
&lt;br /&gt;
=== What is a GPF ===&lt;br /&gt;
&lt;br /&gt;
[[Discuss_what_we_will_consider_a_GPF#What_is_a_Geoscience_Paper_of_the_Future.3F | Include here our discussion of what is a GPF]]&lt;br /&gt;
&lt;br /&gt;
=== The challenges of creating GPFs ===&lt;br /&gt;
&lt;br /&gt;
The articles in this issue reflect the current best practice for generating a Geoscience Paper of the Future.&lt;br /&gt;
&lt;br /&gt;
'''Figure discussions''': Do we want to produce exactly the same figure automatically? Figures in the paper may be clean versions of an image generated by software. To the extent possible, authors have included clear delineations of provenance. The goal is to ensure that readers can regenerate the figures using documented workflows, data, and codes. An important note (Allen, Sandra) is that figures are frequently generated by code, scripts, etc., yet the actual figure is finalized by hand by the user. Mimi is trying to say: is it really worth belaboring the point about how the prettified version of the figure is made? If it is: both of the visualization packages I've used (Matlab and SigmaPlot) have actual code in the background that specifies how to set up the prettification, and this code can be found, copied out, and rerun to generate the exact same figure with all of the prettification in the same place. SigmaPlot uses Visual Basic (I think) in its macros. If explicit code is the important point, this should be doable. But I'm not sure it's strictly necessary to specify exactly where all the prettifications are to get the gist across.&lt;br /&gt;
&lt;br /&gt;
How much of your experimental history does one include?  (Ibrahim).  The experimental process often ends up nowhere.  Should we document all the failed experiments?  Get one DOI for the results of the successful experiment?  Another for failed trials?&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
'''''Documenting: Timing and Intermediate Processes'''''&lt;br /&gt;
When should we document and what are the bounds on what we document?&lt;br /&gt;
For example, should we document and include data and workflows for 'failed' experiments? Or should we assign datasets DOIs before we know the results from using them?  &lt;br /&gt;
The group thinks that good practices may include documenting and sharing data once you have a clear understanding of which outcomes are worth reporting. For example, successful experiments should have clear, clean data documented and shared, whereas one strategy for 'failed' experiments could be to bundle the intermediate datasets under one DOI together with a more general discussion of the process and methods.&lt;br /&gt;
&lt;br /&gt;
=== Related work ===&lt;br /&gt;
&lt;br /&gt;
[[Discuss_what_we_will_consider_a_GPF#New_Frameworks_to_Create_a_New_Generation_of_Scientific_Articles | Include here the related work we have discussed]]&lt;br /&gt;
&lt;br /&gt;
== Papers to be included ==&lt;br /&gt;
&lt;br /&gt;
Would it be worthwhile to group the papers into broader categories rather than giving specifics about every single paper?&lt;br /&gt;
&lt;br /&gt;
For each submission, we describe:&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations'''&lt;br /&gt;
* '''Keywords of research area'''&lt;br /&gt;
* '''Tentative title'''&lt;br /&gt;
* '''Short abstract'''&lt;br /&gt;
* '''Challenge'''&lt;br /&gt;
* '''Relationship to other publications''' (is the article based on a previously published article? is it new content? IF PREVIOUSLY PUBLISHED, PLS PROVIDE A POINTER TO THE PUBLISHED ARTICLE AND SPECIFY WHAT PERCENTAGE OF THE WORK PRESENTED WILL BE NEW)&lt;br /&gt;
* '''Pointer to the wiki page that documents the article'''&lt;br /&gt;
* '''Expected submission date'''&lt;br /&gt;
&lt;br /&gt;
=== [David 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:''' [[Cedric David]]&lt;br /&gt;
* '''Keywords of research area:''' Hydrology, Rivers, Modeling, Testing, Reproducibility. &lt;br /&gt;
* '''Tentative title:''' Going beyond triple-checking, allowing for peace of mind in community model development.&lt;br /&gt;
* '''Short abstract:''' The development of computer models in the general field of geoscience often proceeds incrementally over many years.  Endeavors that generally start on a single researcher's own machine evolve over time into software that is often much larger than initially anticipated.  Looking back at years of building on their computer code, sometimes without much training in computer science, geoscience software developers can easily experience an overwhelming sense of incompetence when contemplating ways to further community usage of their software.  How does one allow others to use one's code?  How can one foster the survival of one's tool?  How could one possibly ensure the scientific integrity of ongoing developments, including those made by others?  Common issues faced by geoscience developers include selecting a license, learning how to track and document past and ongoing changes, choosing a software repository, and allowing for community development.  This paper provides a brief summary of experience with the first three of these steps, focusing on the almost decade-long development of a river routing model.  The core of this study, however, focuses on reproducing previously published experiments.  This step is highly repetitive and can therefore benefit greatly from automation.  Additionally, enabling automated software testing can arguably be considered the final step for sustainable software sharing, by allowing the main software developer to let go of a mental block concerning scientific integrity.  Creating tools to automatically compare the results of an updated version of a software package with those of previous studies can not only save the main developer's own time, it can also empower other researchers in their ability to check and demonstrate that their potential additions have retained scientific integrity.   &lt;br /&gt;
* '''Challenge:''' Reproducibility; Ensure that updates to an existing model are able to reproduce a series of simulations published previously.&lt;br /&gt;
* '''Relationship to other publications:''' This research is related to past and ongoing development of the Routing Application for Parallel computatIon of Discharge (RAPID).  The primary focus of this paper is to allow automated reproducibility of at least the [http://dx.doi.org/10.1175/2011JHM1345.1 first RAPID publication].  The scientific subject of this GPF differs from the article(s) to be reproduced as its focus is on development of automatic testing methods.  In that regard, the paper is expected to be 95% new. &lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Cedric_David | Page]]&lt;br /&gt;
* '''Expected submission date:'''&lt;br /&gt;
&lt;br /&gt;
=== [Demir 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:''' [[Ibrahim Demir]]&lt;br /&gt;
* '''Keywords of research area:''' hydrological network, optimization, network representation, database query&lt;br /&gt;
* '''Tentative title:''' Analysis and Optimization of Hydrological Network Database Representation Methods for Fast Access and Query in Web-based System&lt;br /&gt;
* '''Short abstract:''' Web-based systems allow users to delineate watersheds in interactive map environments using server-side processing. With the increasing resolution of hydrological networks, optimized methods for storing the network representation in databases, and for efficiently querying and modifying the river network structure, become critical. This paper presents a detailed analysis of widely used methods for representing hydrological networks in relational databases, and benchmarks common queries and modifications on the network structure using these methods. The analysis has been applied to the hydrological network of Iowa, utilizing a 90 m DEM and 600,000 network nodes. The results indicate that the representation methods provide massive improvements in query times and in storage of the network structure in the database. The suggested method allows watershed delineation tools to run client-side with desktop-like performance. &lt;br /&gt;
* '''Challenge:''' Reproducibility, Transferability; Some of the internal steps to prepare data might require long computation time and different software environments.&lt;br /&gt;
* '''Relationship to other publications:''' The article is based on a new study&lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Ibrahim_Demir | Page]]&lt;br /&gt;
* '''Expected submission date:'''&lt;br /&gt;
&lt;br /&gt;
=== [Fulweiler 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:''' [[Wally Fulweiler]]&lt;br /&gt;
* '''Keywords of research area:'''&lt;br /&gt;
* '''Tentative title:'''&lt;br /&gt;
* '''Short abstract:'''&lt;br /&gt;
* '''Challenge:''' &lt;br /&gt;
* '''Relationship to other publications:''' (is the article based on a previously published article? is it new content?)&lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Wally_Fulweiler | Page]]&lt;br /&gt;
* '''Expected submission date:'''&lt;br /&gt;
&lt;br /&gt;
=== [Loh and Karlstrom 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:'''  [[Lay Kuan Loh]] and [[Leif Karlstrom]]&lt;br /&gt;
* '''Keywords of research area:''' Spatial clustering, Eigenvector selection, Entropy Ranking, Cascades Volcanic Region, [http://geosphere.gsapubs.org/content/3/3/152.abstract Afar Depression], [http://astrogeology.usgs.gov/search/details/Mars/Research/Volcanic/TharsisVents/zip Tharsis province]&lt;br /&gt;
* '''Tentative title:''' Characterization of volcanic vent distributions using spectral clustering with eigenvector selection and entropy ranking&lt;br /&gt;
* '''Short abstract:''' Volcanic vents on the surface of Earth and other planets often appear in groups that exhibit spatial patterning. Such vent distributions reflect a complex interplay between time-evolving mechanical controls on the pathways of magma ascent, background tectonic stresses, and the unsteady supply of rising magma. With the ultimate aim of connecting surface vent distributions with the dynamics of magma ascent, we have developed a clustering method to quantify spatial patterns in vents. Clustering is typically used in exploratory data analysis to identify groups with similar behavior by partitioning a dataset into clusters that share similar attributes. Traditional clustering algorithms that work well on simple point-cloud-type synthetic datasets generally do not scale well to the real-world data we are interested in, where there are poor boundaries between clusters and much ambiguity in cluster assignments. We instead use a spectral clustering algorithm with eigenvector selection based on entropy ranking, building on work by [http://www.sciencedirect.com/science/article/pii/S0925231210001311 Zhao et al 2010], which outperforms traditional spectral clustering algorithms in choosing the right number of clusters for point data. We benchmark this algorithm on synthetic vent data with increasingly complex spatial distributions, testing its ability to accurately cluster vent data with variable spatial density, skewness, number of clusters, and proximity of clusters. We then apply our algorithm to several real-world datasets from the Cascades, the Afar Depression, and Mars. &lt;br /&gt;
* '''Challenge:''' Reproducibility (i.e., quantifying clustering); We plan to study how varying the statistical distribution, density, skewness, background noise, number of clusters, proximity of clusters, and combinations of any of these factors affects the performance of our algorithm. We test it against synthetic and real-world datasets. &lt;br /&gt;
* '''Relationship to other publications:''' New content, but one of the databases we are studying in the paper (Cascades Volcanic Range) would be based off a different paper we are preparing and planning to submit earlier. &lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Leif_Karlstrom | Page]]&lt;br /&gt;
* '''Expected submission date:''' June 2015&lt;br /&gt;
&lt;br /&gt;
=== [Lee 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:''' [[Kyo Lee]], Maziyar Boustani and Chris Mattmann, Jet Propulsion Laboratory&lt;br /&gt;
* '''Keywords of research area:''' North American regional climate, regional climate model evaluation system, Open Climate Workbench&lt;br /&gt;
* '''Tentative title:''' Evaluation of simulated temperature, precipitation, cloud fraction and insolation over the conterminous United States using Regional Climate Model Evaluation System&lt;br /&gt;
* '''Short abstract:''' This study describes the detailed process of evaluating model fidelity in simulating four key climate variables (surface air temperature, precipitation, cloud fraction, and insolation) and their covariability over the conterminous United States. The Regional Climate Model Evaluation System (RCMES), a suite of public databases and open-source software packages, provides both observational datasets and data processors useful for evaluating any climate model. In this paper, we provide a clear and easy-to-follow RCMES workflow that replicates published papers evaluating North American Regional Climate Change Assessment Program (NARCCAP) regional climate model (RCM) hindcast simulations using observations from a variety of sources. &lt;br /&gt;
* '''Challenge:''' Big Data Sharing, Dark Code; Sharing big data, better documenting source codes, and encouraging the climate science community to use RCMES  &lt;br /&gt;
* '''Relationship to other publications:''' [http://journals.ametsoc.org/doi/abs/10.1175/JCLI-D-12-00452.1 Kim et al. 2013], [http://link.springer.com/article/10.1007/s00382-014-2253-y Lee et al. 2014]&lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Kyo_Lee | Page]]&lt;br /&gt;
* '''Expected submission date:''' End of June 2015&lt;br /&gt;
&lt;br /&gt;
=== [Miller 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:''' [[Kim Miller]]&lt;br /&gt;
* '''Keywords of research area:'''&lt;br /&gt;
* '''Tentative title:'''&lt;br /&gt;
* '''Short abstract:'''&lt;br /&gt;
* '''Challenge:''' &lt;br /&gt;
* '''Relationship to other publications:''' (is the article based on a previously published article? is it new content?)&lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Kim_Miller | Page]]&lt;br /&gt;
* '''Expected submission date:'''&lt;br /&gt;
&lt;br /&gt;
=== [Mills 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:''' [[Heath Mills]], University of Houston Clear Lake; Brandi Kiel Reese, Texas A&amp;amp;M Corpus Christi&lt;br /&gt;
* '''Keywords of research area:'''&lt;br /&gt;
* '''Tentative title:'''Iron and Sulfur Cycling Biogeography Using Advanced Geochemical and Molecular Analyses&lt;br /&gt;
* '''Short abstract:''' My paper will develop and document a new pipeline to analyze a combined and robust genetic and geochemical data set. New, reproducible methods will be highlighted in this manuscript to help others better analyze similar data sets. There is a general lack of guidance within my field for such challenges. This manuscript will be unique and helpful from an analysis standpoint as well as for the science being presented.&lt;br /&gt;
* '''Challenge:''' Reproducibility; Dark Code&lt;br /&gt;
* '''Relationship to other publications:''' Original Manuscript&lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Heith_Mills | Page]]&lt;br /&gt;
* '''Expected submission date:'''&lt;br /&gt;
&lt;br /&gt;
=== [Oh 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:''' [[Ji-Hyun Oh]], Jet Propulsion Laboratory/University of Southern California&lt;br /&gt;
* '''Keywords of research area:''' Tropical Meteorology, Madden-Julian Oscillation, Momentum budget analysis &lt;br /&gt;
* '''Tentative title:''' Tools for computing momentum budget for the westerly wind event associated with the Madden-Julian Oscillation&lt;br /&gt;
* '''Short abstract:''' As one of the most pronounced modes of tropical intraseasonal variability, the Madden-Julian Oscillation (MJO) prominently connects global weather and climate, and serves as one of the critical predictability sources for extended-range forecasting. The zonal circulation of the MJO is characterized by low-level westerlies (easterlies) in and to the west (east) of the convective center. The direction of zonal winds in the upper troposphere is opposite to that in the lower troposphere. In addition to the convective signal as an identifier of MJO initiation, certain characteristics of the zonal circulation have been used as a standard metric for monitoring the state of the MJO and investigating features of the MJO and its impact on other atmospheric phenomena. This paper documents a tool for investigating the generation of low-level westerly winds during the MJO life cycle. The tool is used for momentum budget analysis to understand the respective contributions of the various processes involved in the wind evolution associated with the MJO, using European Centre for Medium-Range Weather Forecasts operational analyses from the Dynamics of the Madden–Julian Oscillation field campaign.&lt;br /&gt;
&lt;br /&gt;
* '''Challenge:''' Reproducibility, Dark Code; This paper will cover how to reproduce two key figures from the paper that I recently submitted to the Journal of the Atmospheric Sciences. This will include detailed procedures for generating the figures, such as how/where to download the data, how to transform the data into the input format for my codes, and so on. &lt;br /&gt;
* '''Relationship to other publications:''' This article is related to part of the paper submitted to the Journal of the Atmospheric Sciences. &lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Ji_Hyun | Page]]&lt;br /&gt;
* '''Expected submission date:'''&lt;br /&gt;
&lt;br /&gt;
=== [Pierce 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:''' [[Suzanne A Pierce]] and John Gentle (Texas Advanced Computing Center and Jackson School of Geosciences, The University of Texas at Austin)&lt;br /&gt;
&lt;br /&gt;
* '''Keywords of research area:''' Decision Support Systems, Hydrogeology, Participatory Modeling, Data Fusion &lt;br /&gt;
* '''Tentative title:'''&lt;br /&gt;
* '''Short abstract:'''&lt;br /&gt;
&lt;br /&gt;
* '''Challenge:''' Reproducibility, Dark Code; Fully document a new software application and framework using example case study data and tutorials.&lt;br /&gt;
* '''Relationship to other publications:''' This article is new content&lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Suzanne_Pierce | Page]]&lt;br /&gt;
* '''Expected submission date:'''&lt;br /&gt;
&lt;br /&gt;
=== [Pope 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:''' [[Allen Pope]], National Snow and Ice Data Center, University of Colorado, Boulder&lt;br /&gt;
* '''Keywords of research area:''' Glaciology, Remote Sensing, Landsat 8, Polar Science&lt;br /&gt;
* '''Tentative title:''' Data and Code for Estimating and Evaluating Supraglacial Lake Depth With Landsat 8 and other Multispectral Sensors&lt;br /&gt;
* '''Short abstract:''' Supraglacial lakes play a significant role in glacial hydrological systems – for example, transporting water to the glacier bed in Greenland or leading to ice shelf fracture and disintegration in Antarctica. To investigate these important processes, multispectral remote sensing provides multiple methods for estimating supraglacial lake depth – either through single-band or band-ratio methods, both empirical and physically-based. Landsat 8 is the newest satellite in the Landsat series. With new bands, higher dynamic range, and higher radiometric resolution, the Operational Land Imager (OLI) aboard Landsat 8 has a lot of potential. &lt;br /&gt;
&lt;br /&gt;
This paper will document the data and code used in processing in situ reflectance spectra and depth measurements to investigate the ability of Landsat 8 to estimate lake depths using multiple methods, as well as to quantify improvements over Landsat 7’s ETM+. A workflow, data, and code are provided to detail promising methods as applied to Landsat 8 OLI imagery of case study areas in Greenland, allowing calculation of regional volume estimates using 2013 and 2014 summer-season imagery. Altimetry from WorldView DEMs is used to validate lake depth estimates. The optimal method for supraglacial lake depth estimation with Landsat 8 is shown to be an average of the single-band depths from the red and panchromatic bands. With this best method, preliminary investigation of the seasonal behavior and elevation distribution of lakes is also discussed and documented.&lt;br /&gt;
* '''Challenge:''' Reproducibility, Dark Code&lt;br /&gt;
* '''Relationship to other publications:''' Documenting and explaining the data and code behind the analysis and results presented in another paper.&lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Allen_Pope | Page]]&lt;br /&gt;
* '''Expected submission date:''' Late June 2015&lt;br /&gt;
&lt;br /&gt;
=== [Read and Winslow 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:''' [[Jordan Read]] and [[Luke Winslow]]&lt;br /&gt;
* '''Keywords of research area:'''&lt;br /&gt;
* '''Tentative title:'''&lt;br /&gt;
* '''Short abstract:'''&lt;br /&gt;
* '''Challenge:''' &lt;br /&gt;
* '''Relationship to other publications:''' (is the article based on a previously published article? is it new content?)&lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Jordan_Read | Page]]&lt;br /&gt;
* '''Expected submission date:'''&lt;br /&gt;
&lt;br /&gt;
=== [Tzeng 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:''' [[Mimi Tzeng]], Brian Dzwonkowski (DISL); Kyeong Park (TAMU Galveston)&lt;br /&gt;
* '''Keywords of research area:''' physical oceanography, remote sensing&lt;br /&gt;
* '''Tentative title:''' Fisheries Oceanography of Coastal Alabama (FOCAL): A Subset of a Time-Series of Hydrographic and Current Data from a Permanent Moored Station Outside Mobile Bay (27 Jan to 18 May 2011)&lt;br /&gt;
* '''Short abstract:''' The Fisheries Oceanography in Coastal Alabama (FOCAL) program began in 2006 as a way for scientists at Dauphin Island Sea Lab (DISL) to study the natural variability of Alabama's nearshore environment as it relates to fisheries production. FOCAL provided a long-term baseline data set that included time-series hydrographic data from a permanent offshore mooring (ADCP, vertical thermistor array and CTDs at surface and bottom) and shipboard surveys (vertical CTD profiles and water sampling), as well as monthly ichthyoplankton and zooplankton (depth-discrete) sample collections at FOCAL sites. The subset of data presented here is from the mooring, and includes a vertical array of thermistors, CTDs at surface and bottom, an ADCP at the bottom, and vertical CTD profiles collected at the mooring during maintenance surveys. The mooring is located at 30 05.410'N 88 12.694'W, 25 km southwest of the entrance to Mobile Bay. Temperature, salinity, density, depth, and current velocity data were collected at 20-minute intervals from 2006 to 2012. Other parameters, such as dissolved oxygen, are available for portions of the time series depending on which instruments were deployed at the time.&lt;br /&gt;
* '''Challenge:''' Dark Code, Reproducibility; My paper will be about the processing of data in a larger dataset from which peer-reviewed papers have been written. The processing I did was not specific to any particular paper. I can point to an example paper that used some of the data I processed from this dataset; however, all of the figures in that paper are composites that also include other data from elsewhere that I had nothing to do with (and it would not be feasible to obtain that other data within our timeframe).&lt;br /&gt;
* '''Relationship to other publications:''' A recent paper that used the part of the FOCAL data I'm documenting as the sample from the larger dataset: Dzwonkowski, Brian, Kyeong Park, Jungwoo Lee, Bret M. Webb, and Arnoldo Valle-Levinson. 2014. &amp;quot;Spatial variability of flow over a river-influenced inner shelf in coastal Alabama during spring.&amp;quot; Continental Shelf Research 74:25-34.&lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Mimi_Tzeng | Page]]&lt;br /&gt;
* '''Expected submission date:'''&lt;br /&gt;
&lt;br /&gt;
=== [Villamizar 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:''' [[Sandra Villamizar]], University of California, Merced&lt;br /&gt;
* '''Keywords of research area:''' river ecohydrology&lt;br /&gt;
* '''Tentative title:''' Producing long-term series of whole-stream metabolism using readily available data.  &lt;br /&gt;
* '''Short abstract:''' Continuous water quality and river discharge data that are readily available through government websites may be used to produce valuable information about key processes within a river ecosystem. In this paper I describe in detail the steps for acquisition and processing of river flow, dissolved oxygen, temperature, and specific conductance data that, combined with atmospheric data and physical properties of the river reach of interest, allow for the production of a long-term series of whole-stream metabolism. This information is key to understanding the structure and function of an ecosystem such as the San Joaquin River in the Central Valley of California, which was increasingly degraded over the last 60 years by intensive human intervention and, since 2010, has been undergoing a restoration effort. The key advantage of this tool is that it uses readily available information to produce knowledge about a river ecosystem. This set of scripts, written in R, can be used immediately for any other river for which the key parameters (river flow, dissolved oxygen, temperature, and specific conductivity) are available. The scripts can also be modified by users to fit their particular site conditions.&lt;br /&gt;
 &lt;br /&gt;
* '''Challenge:''' Document new software/applications. This set of scripts was written out of the need to generate daily estimates of metabolic rates over long periods of time and at various sites within the San Joaquin River.  &lt;br /&gt;
* '''Relationship to other publications:''' This will be a new publication&lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Sandra_Villamizar | Page]]&lt;br /&gt;
* '''Expected submission date:''' To be defined&lt;br /&gt;
&lt;br /&gt;
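Whole-stream metabolism estimates of the kind described in [Villamizar 2015] typically partition the observed change in dissolved oxygen (DO) into atmospheric reaeration and net biological activity. The actual scripts are written in R and derive the reaeration coefficient and oxygen saturation from temperature, conductance, and reach properties; the Python sketch below only illustrates the bookkeeping, with `k` and `do_sat` as assumed constants.

```python
# Hedged sketch of the core bookkeeping in whole-stream metabolism:
# the change in dissolved oxygen between readings is split into
# reaeration, k * (DO_sat - DO), and net biological activity
# (photosynthesis minus respiration).  k and do_sat are illustrative
# constants, not values from the paper's scripts.
def net_metabolism(do_series, do_sat=9.0, k=0.5, dt_hours=1.0):
    """Net biological O2 flux per interval (mg/L/h): dDO/dt - reaeration."""
    fluxes = []
    for do_prev, do_next in zip(do_series, do_series[1:]):
        ddo_dt = (do_next - do_prev) / dt_hours
        reaeration = k * (do_sat - (do_prev + do_next) / 2.0)
        fluxes.append(ddo_dt - reaeration)
    return fluxes
```

Nighttime fluxes estimate respiration alone; subtracting respiration from daytime fluxes then yields gross primary production.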
=== [Yu and Bhatt 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:''' [[Xuan Yu]], Department of Geological Sciences, University of Delaware. Gopal Bhatt, Department of Civil &amp;amp; Environmental Engineering, Pennsylvania State University. &lt;br /&gt;
* '''Keywords of research area:''' coupled processes, integrated hydrologic modeling, PIHM, surface flow, subsurface flow, open science&lt;br /&gt;
* '''Tentative title:''' Learning integrated modeling of surface and subsurface flow from scratch&lt;br /&gt;
* '''Short abstract:''' Integrated modeling of surface and subsurface flow has been of great interest for understanding not only the intimate interconnectedness of hydrological processes, but also land-surface energy balance, biogeochemical and ecological processes, and landscape evolution. Although a growing number of complex hydrologic models have been used for resolving environmental processes, hypothesis testing, and hydrologic prediction for effective watershed management, very few model implementation resources have been made accessible to the broad community of model users. Users have to invest a significant amount of time and effort to reproduce, and to understand, the workflow of the hydrologic simulation in a modeling paper. To provide a challenging and stimulating introduction to integrated modeling of surface and subsurface flow, in this paper we revisit the development of the Penn State Integrated Hydrologic Model (PIHM) by reproducing a numerical benchmarking example and a real-world catchment-scale application. Specifically, we document PIHM and its modeling workflow to enable a basic understanding of simulating coupled surface and subsurface flow processes. We provide the model and data to highlight the reciprocal roles between the two. In addition, we incorporate user experience as a third dimension in the modeling workflow to enable deeper communication between model developers and users. The workflow has important implications for smoothing and accelerating open scientific collaboration in geosciences research.&lt;br /&gt;
* '''Challenge:''' Reproduce previously published simulations with the latest version of an existing model. Benchmark the model application against a numerical experiment and field data.&lt;br /&gt;
* '''Relationship to other publications:''' The article is based on a previously published article. &lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Xuan_Yu | Page]]&lt;br /&gt;
* '''Expected submission date:''' End of June 2015&lt;br /&gt;
&lt;br /&gt;
== Special Issue Editors ==&lt;br /&gt;
&lt;br /&gt;
* Co-editor: Chris Duffy and/or Scott Peckham&lt;br /&gt;
* Co-editor: Cedric David&lt;br /&gt;
* Co-editor: possibly Karan Venayagamoorthy&lt;br /&gt;
&lt;br /&gt;
The editors will only accept submissions that follow the [[Develop_proposal_for_special_issue#Special_Issue_Review_Criteria | special issue review criteria]].&lt;br /&gt;
&lt;br /&gt;
The editors will select a set of reviewers to handle the submissions.  Reviewers will include computer scientists, library scientists, and geoscientists.&lt;br /&gt;
&lt;br /&gt;
== Special Issue Review Criteria ==&lt;br /&gt;
&lt;br /&gt;
The reviewers will be asked to provide feedback on the papers according to the following criteria. Note that some papers will have good reasons for limiting the information (e.g., the data is from third parties and not openly available), and in that case they should document those reasons.&lt;br /&gt;
&lt;br /&gt;
* Documentation of the datasets: descriptions of datasets, unique identifiers, repositories.&lt;br /&gt;
* Documentation of software: description of all software used (including pre-processing of data, visualization steps, etc), unique identifiers, repositories.&lt;br /&gt;
* Documentation of the provenance of results: provenance for each figure or result, such as the workflow or the provenance record.&lt;br /&gt;
&lt;br /&gt;
== Tentative Timeline ==&lt;br /&gt;
&lt;br /&gt;
* Journal committed to special issue: April 15, 2015&lt;br /&gt;
* Submissions due to editors: June 30, 2015&lt;br /&gt;
* Reviews due: Sept 15, 2015&lt;br /&gt;
* Decisions out to authors: Sept 30, 2015&lt;br /&gt;
* Revisions due: October 31, 2015&lt;br /&gt;
* Final versions due November 15, 2015&lt;br /&gt;
* Issue published December 31, 2015&lt;br /&gt;
&lt;br /&gt;
&amp;lt;!-- Add any wiki Text above this Line --&amp;gt;&lt;br /&gt;
&amp;lt;!-- Do NOT Edit below this Line --&amp;gt;&lt;br /&gt;
{{#set:&lt;br /&gt;
	Owner=Chris_Duffy|&lt;br /&gt;
	Participants=Yolanda_Gil|&lt;br /&gt;
	Participants=Scott_Peckham|&lt;br /&gt;
	Participants=Cedric_David|&lt;br /&gt;
	Participants=Ibrahim_Demir|&lt;br /&gt;
	Participants=Wally_Fulweiler|&lt;br /&gt;
	Participants=Leif_Karlstrom|&lt;br /&gt;
	Participants=Kyo_Lee|&lt;br /&gt;
	Participants=Kim_Miller|&lt;br /&gt;
	Participants=Heath_Mills|&lt;br /&gt;
	Participants=Ji-Hyun_Oh|&lt;br /&gt;
	Participants=Suzanne_Pierce|&lt;br /&gt;
	Participants=Allen_Pope|&lt;br /&gt;
	Participants=Jordan_Read|&lt;br /&gt;
	Participants=Mimi_Tzeng|&lt;br /&gt;
	Participants=Sandra_Villamizar|&lt;br /&gt;
	Participants=Xuan_Yu|&lt;br /&gt;
	Progress=20|&lt;br /&gt;
	StartDate=2015-03-10|&lt;br /&gt;
	TargetDate=2015-03-16|&lt;br /&gt;
	Type=Low}}&lt;/div&gt;</summary>
		<author><name>Suzanne</name></author>	</entry>

	<entry>
		<id>https://www.organicdatascience.org/gpf/index.php?title=Develop_proposal_for_special_issue&amp;diff=11697</id>
		<title>Develop proposal for special issue</title>
		<link rel="alternate" type="text/html" href="https://www.organicdatascience.org/gpf/index.php?title=Develop_proposal_for_special_issue&amp;diff=11697"/>
				<updated>2015-04-03T17:52:51Z</updated>
		
		<summary type="html">&lt;p&gt;Suzanne: /* [Pierce 2015] */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;[[Category:Task]]&lt;br /&gt;
&lt;br /&gt;
== Background: Why a Special Issue on Geoscience Papers of the Future? ==&lt;br /&gt;
&lt;br /&gt;
[[Discuss_what_we_will_consider_a_GPF#The_Vision | Include here our discussion for the vision]]&lt;br /&gt;
&lt;br /&gt;
Background should be 1-2 pages.&lt;br /&gt;
&lt;br /&gt;
Motivated by need to fully document and make research accessible and reproducible. &lt;br /&gt;
&lt;br /&gt;
=== Motivation: The EarthCube Initiative and the GeoSoft Project ===&lt;br /&gt;
&lt;br /&gt;
[http://www.geosoft-earthcube.org/about Include here background about GeoSoft from the web site]&lt;br /&gt;
&lt;br /&gt;
OSTP memo.  EarthCube reports.&lt;br /&gt;
Other reports that talk about the need for new approaches to editing.&lt;br /&gt;
&lt;br /&gt;
It's possible that small or very large contributions are not well captured in the current publishing paradigms.  Nanopublications.&lt;br /&gt;
&lt;br /&gt;
For example, nano-publications are a possible way to reflect advances in a research process that may not merit a full publication but are nonetheless useful to share with the community. A challenge here is that there is a stigma attached to publishing units that are very small.  &lt;br /&gt;
&lt;br /&gt;
Alternatively, a very large piece of research or work with many parts may be better suited to a GPF style publication.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Perhaps the concept of a 'paper' can be better reflected in the concept of a 'wrapper', a collection of materials and resources. The purpose is to ensure that publications are representative of the work, effort, and results achieved in the research process.&lt;br /&gt;
&lt;br /&gt;
=== What is a GPF ===&lt;br /&gt;
&lt;br /&gt;
[[Discuss_what_we_will_consider_a_GPF#What_is_a_Geoscience_Paper_of_the_Future.3F | Include here our discussion of what is a GPF]]&lt;br /&gt;
&lt;br /&gt;
=== The challenges of creating GPFs ===&lt;br /&gt;
&lt;br /&gt;
The articles in this issue reflect the current best practice for generating a Geoscience Paper of the Future.&lt;br /&gt;
&lt;br /&gt;
'''Figure discussions''': Do we want to regenerate exactly the same figure automatically?  Figures in the paper may be cleaned-up versions of an image generated by software.  To the extent possible, authors have included clear delineations of provenance. The goal is to ensure that readers can regenerate the figures using documented workflows, data, and codes.  An important note (Allen, Sandra) is that figures are frequently generated by code, scripts, etc., yet the actual figure is finalized by hand by the user. Mimi is trying to say: is it really worth belaboring the point about how the prettified version of the figure is made? If it is: both of the visualization packages I've used (Matlab and SigmaPlot) have actual code in the background that specifies how to set up the prettification, and this code can be found, copied out, and rerun to generate the exact same figure with all of the prettification in the same place. SigmaPlot uses Visual Basic (I think) in its macros. If explicit code is the important point, this should be doable. But I'm not sure it's strictly necessary to specify exactly where all the prettifications are to get the gist across.&lt;br /&gt;
&lt;br /&gt;
How much of your experimental history does one include?  (Ibrahim).  The experimental process often ends up nowhere.  Should we document all the failed experiments?  Get one DOI for the results of the successful experiment?  Another for failed trials?&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
'''''Documenting: Timing and Intermediate Processes'''''&lt;br /&gt;
When should we document and what are the bounds on what we document?&lt;br /&gt;
For example, should we document and include data and workflows for 'failed' experiments? Or should we assign datasets DOIs before we know the results from using them?  &lt;br /&gt;
The group thinks that good ideas/practices may include documenting and sharing data when you have a clear understanding of the outcomes worth reporting. For example, successful experiments should have clear, clean data documented and shared, whereas one strategy for 'failed' experiments could be to bundle the intermediate datasets under one DOI with a more general discussion of the process/methods.&lt;br /&gt;
&lt;br /&gt;
=== Related work ===&lt;br /&gt;
&lt;br /&gt;
[[Discuss_what_we_will_consider_a_GPF#New_Frameworks_to_Create_a_New_Generation_of_Scientific_Articles | Include here the related work we have discussed]]&lt;br /&gt;
&lt;br /&gt;
== Papers to be included ==&lt;br /&gt;
&lt;br /&gt;
Would it be worthwhile to group the papers into broader categories rather than giving specifics about every single paper?&lt;br /&gt;
&lt;br /&gt;
For each submission, we describe:&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations'''&lt;br /&gt;
* '''Keywords of research area'''&lt;br /&gt;
* '''Tentative title'''&lt;br /&gt;
* '''Short abstract'''&lt;br /&gt;
* '''Challenge'''&lt;br /&gt;
* '''Relationship to other publications''' (is the article based on a previously published article? is it new content? IF PREVIOUSLY PUBLISHED, PLS PROVIDE A POINTER TO THE PUBLISHED ARTICLE AND SPECIFY WHAT PERCENTAGE OF THE WORK PRESENTED WILL BE NEW)&lt;br /&gt;
* '''Pointer to the wiki page that documents the article'''&lt;br /&gt;
* '''Expected submission date'''&lt;br /&gt;
&lt;br /&gt;
=== [David 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:''' [[Cedric David]]&lt;br /&gt;
* '''Keywords of research area:''' Hydrology, Rivers, Modeling, Testing, Reproducibility. &lt;br /&gt;
* '''Tentative title:''' Going beyond triple-checking, allowing for peace of mind in community model development.&lt;br /&gt;
* '''Short abstract:''' The development of computer models in the general field of geoscience is often made incrementally over many years.  Endeavors that generally start on a single researcher's own machine evolve over time into software that is often much larger than was initially anticipated.  Looking back at years of building on their computer code, sometimes without much training in computer science, geoscience software developers can easily experience an overwhelming sense of incompetence when contemplating ways to further community usage of their software.  How does one allow others to use one's code?  How can one foster the survival of one's tool?  How could one possibly ensure the scientific integrity of ongoing developments, including those made by others?  Common issues faced by geoscience developers include selecting a license, learning how to track and document past and ongoing changes, choosing a software repository, and allowing for community development.  This paper provides a brief summary of experience with the first three of these steps of software growth, focusing on the almost decade-long code development of a river routing model.  The core of this study, however, focuses on reproducing previously published experiments.  This step is highly repetitive and can therefore benefit greatly from automation.  Additionally, enabling automated software testing can arguably be considered the final step for sustainable software sharing, by allowing the main software developer to let go of a mental block concerning scientific integrity.  Creating tools to automatically compare the results of an updated version of a software with those of previous studies can not only save the main developer's own time, it can also empower other researchers to check and demonstrate that their potential additions have retained scientific integrity.   &lt;br /&gt;
* '''Challenge:''' Reproducibility; Ensure that updates to an existing model are able to reproduce a series of simulations published previously.&lt;br /&gt;
* '''Relationship to other publications:''' This research is related to past and ongoing development of the Routing Application for Parallel computatIon of Discharge (RAPID).  The primary focus of this paper is to allow automated reproducibility of at least the [http://dx.doi.org/10.1175/2011JHM1345.1 first RAPID publication].  The scientific subject of this GPF differs from the article(s) to be reproduced as its focus is on development of automatic testing methods.  In that regard, the paper is expected to be 95% new. &lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Cedric_David | Page]]&lt;br /&gt;
* '''Expected submission date:'''&lt;br /&gt;
&lt;br /&gt;
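The automated comparison described in the [David 2015] abstract can be sketched as a small regression test: run the updated model, then check its outputs against archived results from the earlier publication within a numerical tolerance. This is an illustrative Python sketch, not RAPID's actual test harness; the file format, function names, and tolerance are assumptions.

```python
# Hypothetical regression-test helpers: compare a new simulation's
# discharge series against archived results from a previously published
# run, value by value, within a relative tolerance.
import csv
import math

def load_discharge(path):
    """Read a one-column CSV of discharge values (m^3/s) into a list."""
    with open(path) as f:
        return [float(row[0]) for row in csv.reader(f)]

def outputs_match(new_values, archived_values, rel_tol=1e-6):
    """True if the two series agree value-by-value within rel_tol."""
    if len(new_values) != len(archived_values):
        return False
    return all(math.isclose(a, b, rel_tol=rel_tol)
               for a, b in zip(new_values, archived_values))
```

A check such as `outputs_match(load_discharge("new_run.csv"), load_discharge("published_run.csv"))` (file names hypothetical) can then run automatically after every code change.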
=== [Demir 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:''' [[Ibrahim Demir]]&lt;br /&gt;
* '''Keywords of research area:''' hydrological network, optimization, network representation, database query&lt;br /&gt;
* '''Tentative title:''' Analysis and Optimization of Hydrological Network Database Representation Methods for Fast Access and Query in Web-based System&lt;br /&gt;
* '''Short abstract:''' Web-based systems allow users to delineate watersheds in interactive map environments using server-side processing. With increasing resolution of hydrological networks, optimized methods for storing the network representation in databases, and efficient queries and actions on the river network structure, become critical. This paper presents a detailed analysis of widely used methods for representing hydrological networks in relational databases, benchmarking common queries and modifications of the network structure using these methods. The analysis has been applied to the hydrological network of Iowa, utilizing a 90 m DEM and 600,000 network nodes. The results indicate that the representation methods provide massive improvements in query times and in storage of the network structure in the database. The suggested method allows watershed delineation tools to run client-side with desktop-like performance. &lt;br /&gt;
* '''Challenge:''' Reproducibility, Transferability; Some of the internal steps to prepare data might require long computation time and different software environments.&lt;br /&gt;
* '''Relationship to other publications:''' The article is based on a new study&lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Ibrahim_Demir | Page]]&lt;br /&gt;
* '''Expected submission date:'''&lt;br /&gt;
&lt;br /&gt;
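As an illustration of the kind of representation choice the [Demir 2015] abstract discusses, one common way to store a tree-like river network in a relational database is a materialized path per node, which turns an upstream-drainage query into a single prefix match. The schema and data below are hypothetical, not the paper's actual method.

```python
# Illustrative (hypothetical) materialized-path representation of a river
# network in a relational database: each node stores the path of ids from
# the outlet down to itself, so "all nodes upstream of X" is one LIKE query.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE node (id INTEGER PRIMARY KEY, path TEXT)")
# A tiny network: node 1 is the outlet; 2 and 3 drain into 1; 4 drains into 2.
conn.executemany("INSERT INTO node VALUES (?, ?)",
                 [(1, "/1/"), (2, "/1/2/"), (3, "/1/3/"), (4, "/1/2/4/")])

def upstream_of(node_id):
    """Ids of all nodes whose path extends the target's path (its drainage)."""
    (path,) = conn.execute(
        "SELECT path FROM node WHERE id = ?", (node_id,)).fetchone()
    cur = conn.execute(
        "SELECT id FROM node WHERE path LIKE ? AND id != ?",
        (path + "%", node_id))
    return sorted(row[0] for row in cur)
```

The trade-off, as the abstract suggests benchmarking, is that prefix queries are fast while re-parenting a subtree requires rewriting the paths of all of its descendants.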
=== [Fulweiler 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:''' [[Wally Fulweiler]]&lt;br /&gt;
* '''Keywords of research area:'''&lt;br /&gt;
* '''Tentative title:'''&lt;br /&gt;
* '''Short abstract:'''&lt;br /&gt;
* '''Challenge:''' &lt;br /&gt;
* '''Relationship to other publications:''' (is the article based on a previously published article? is it new content?)&lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Wally_Fulweiler | Page]]&lt;br /&gt;
* '''Expected submission date:'''&lt;br /&gt;
&lt;br /&gt;
=== [Loh and Karlstrom 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:'''  [[Lay Kuan Loh]] and [[Leif Karlstrom]]&lt;br /&gt;
* '''Keywords of research area:''' Spatial clustering, Eigenvector selection, Entropy Ranking, Cascades Volcanic Region, [http://geosphere.gsapubs.org/content/3/3/152.abstract Afar Depression], [http://astrogeology.usgs.gov/search/details/Mars/Research/Volcanic/TharsisVents/zip Tharsis provonce]&lt;br /&gt;
* '''Tentative title:''' Characterization of volcanic vent distributions using spectral clustering with eigenvector selection and entropy ranking&lt;br /&gt;
* '''Short abstract:''' Volcanic vents on the surface of Earth and other planets often appear in groups that exhibit spatial patterning. Such vent distributions reflect complex interplay between time-evolving mechanical controls on the pathways of magma ascent, background tectonic stresses, and unsteady supply of rising magma. With the ultimate aim of connecting surface vent distributions with the dynamics of magma ascent, we have developed a clustering method to quantify spatial patterns in vents. Clustering is typically used in exploratory data analysis to identify groups with similar behavior by partitioning a dataset into clusters that share similar attributes. Traditional clustering algorithms that work well on simple point-cloud type synthetic datasets generally do not scale well to the real-world data we are interested in, where there are poor boundaries between clusters and much ambiguity in cluster assignments. We instead use a spectral clustering algorithm with eigenvector selection based on entropy ranking, building on the work of [http://www.sciencedirect.com/science/article/pii/S0925231210001311 Zhao et al 2010], which outperforms traditional spectral clustering algorithms in choosing the right number of clusters for point data. We benchmark this algorithm on synthetic vent data with increasingly complex spatial distributions, to test the ability to accurately cluster vent data with variable spatial density, skewness, number of clusters, and proximity of clusters. We then apply our algorithm to several real-world datasets from the Cascades, Afar Depression and Mars. &lt;br /&gt;
* '''Challenge:''' Reproducibility (i.e., quantifying clustering); We plan to study how varying the statistical distribution, density, skewness, background noise, number of clusters, proximity of clusters, and combinations of any of these factors affects the performance of our algorithm. We test it against synthetic and real-world datasets. &lt;br /&gt;
* '''Relationship to other publications:''' New content, but one of the datasets we are studying in the paper (Cascades Volcanic Range) is based on a different paper we are preparing and planning to submit earlier. &lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Leif_Karlstrom | Page]]&lt;br /&gt;
* '''Expected submission date:''' June 2015&lt;br /&gt;
&lt;br /&gt;
=== [Lee 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:''' [[Kyo Lee]], Maziyar Boustani and Chris Mattmann, Jet Propulsion Laboratory&lt;br /&gt;
* '''Keywords of research area:''' North American regional climate, regional climate model evaluation system, Open Climate Workbench&lt;br /&gt;
* '''Tentative title:''' Evaluation of simulated temperature, precipitation, cloud fraction and insolation over the conterminous United States using Regional Climate Model Evaluation System&lt;br /&gt;
* '''Short abstract:''' This study describes the detailed process of evaluating model fidelity in simulating four key climate variables, surface air temperature, precipitation, cloud fraction and insolation, and their covariability over the conterminous United States. The Regional Climate Model Evaluation System (RCMES), a suite of public databases and open-source software packages, provides both observational datasets and data processors useful for evaluating any climate model. In this paper, we provide a clear and easy-to-follow RCMES workflow to replicate published papers evaluating North American Regional Climate Change Assessment Program (NARCCAP) regional climate model (RCM) hindcast simulations using observations from a variety of sources. &lt;br /&gt;
* '''Challenge:''' Big Data Sharing, Dark Code; sharing big data, better documenting source code, and encouraging the climate science community to use RCMES  &lt;br /&gt;
* '''Relationship to other publications:''' [http://journals.ametsoc.org/doi/abs/10.1175/JCLI-D-12-00452.1 Kim et al. 2013], [http://link.springer.com/article/10.1007/s00382-014-2253-y Lee et al. 2014]&lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Kyo_Lee | Page]]&lt;br /&gt;
* '''Expected submission date:''' End of June 2015&lt;br /&gt;
&lt;br /&gt;
=== [Miller 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:''' [[Kim Miller]]&lt;br /&gt;
* '''Keywords of research area:'''&lt;br /&gt;
* '''Tentative title:'''&lt;br /&gt;
* '''Short abstract:'''&lt;br /&gt;
* '''Challenge:''' &lt;br /&gt;
* '''Relationship to other publications:''' (is the article based on a previously published article? is it new content?)&lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Kim_Miller | Page]]&lt;br /&gt;
* '''Expected submission date:'''&lt;br /&gt;
&lt;br /&gt;
=== [Mills 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:''' [[Heath Mills]], University of Houston Clear Lake; Brandi Kiel Reese, Texas A&amp;amp;M Corpus Christi&lt;br /&gt;
* '''Keywords of research area:'''&lt;br /&gt;
* '''Tentative title:'''Iron and Sulfur Cycling Biogeography Using Advanced Geochemical and Molecular Analyses&lt;br /&gt;
* '''Short abstract:''' My paper will develop and document a new pipeline for analyzing a combined, robust genetic and geochemical data set. New, reproducible methods will be highlighted in this manuscript to help others better analyze similar data sets; there is a general lack of guidance within my field for such challenges. This manuscript will be helpful from an analysis standpoint as well as for the science being presented.&lt;br /&gt;
* '''Challenge:''' Reproducibility; Dark Code&lt;br /&gt;
* '''Relationship to other publications:''' Original Manuscript&lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Heith_Mills | Page]]&lt;br /&gt;
* '''Expected submission date:'''&lt;br /&gt;
&lt;br /&gt;
=== [Oh 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:''' [[Ji-Hyun Oh]], Jet Propulsion Laboratory / University of Southern California&lt;br /&gt;
* '''Keywords of research area:''' Tropical Meteorology, Madden-Julian Oscillation, Momentum budget analysis &lt;br /&gt;
* '''Tentative title:''' Tools for computing momentum budget for the westerly wind event associated with the Madden-Julian Oscillation&lt;br /&gt;
* '''Short abstract:''' As one of the most pronounced modes of tropical intraseasonal variability, the Madden-Julian Oscillation (MJO) prominently connects global weather and climate and serves as one of the critical sources of predictability for extended-range forecasting. The zonal circulation of the MJO is characterized by low-level westerlies (easterlies) in and to the west (east) of the convective center. The direction of zonal winds in the upper troposphere is opposite to that in the lower troposphere. In addition to the convective signal as an identifier of MJO initiation, certain characteristics of the zonal circulation have been used as a standard metric for monitoring the state of the MJO and investigating features of the MJO and its impact on other atmospheric phenomena. This paper documents a tool for investigating the generation of low-level westerly winds during the MJO life cycle. The tool is used for momentum budget analysis to understand the respective contributions of the various processes involved in the wind evolution associated with the MJO, using European Centre for Medium-Range Weather Forecasts operational analyses during the Dynamics of the Madden–Julian Oscillation field campaign.&lt;br /&gt;
&lt;br /&gt;
* '''Challenge:''' Reproducibility, Dark Code; This paper will cover how to reproduce two key figures from the paper that I recently submitted to the Journal of the Atmospheric Sciences. This will include detailed procedures for generating the figures, such as how and where to download the data and how to transform the data into the input format expected by my codes. &lt;br /&gt;
* '''Relationship to other publications:''' This article is related to part of the paper submitted to the Journal of the Atmospheric Sciences. &lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Ji_Hyun | Page]]&lt;br /&gt;
* '''Expected submission date:'''&lt;br /&gt;
&lt;br /&gt;
=== [Pierce 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:''' [[Suzanne Pierce]] and John Gentle (Texas Advanced Computing Center and Jackson School of Geosciences, The University of Texas at Austin)&lt;br /&gt;
&lt;br /&gt;
* '''Keywords of research area:''' Hydrogeology, Risk &lt;br /&gt;
* '''Tentative title:'''&lt;br /&gt;
* '''Short abstract:'''&lt;br /&gt;
&lt;br /&gt;
* '''Challenge:''' Fully document a new software application and framework using example case study data and tutorials.&lt;br /&gt;
* '''Relationship to other publications:''' This article is new content&lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Suzanne_Pierce | Page]]&lt;br /&gt;
* '''Expected submission date:'''&lt;br /&gt;
&lt;br /&gt;
=== [Pope 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:''' [[Allen Pope]], National Snow and Ice Data Center, University of Colorado, Boulder&lt;br /&gt;
* '''Keywords of research area:''' Glaciology, Remote Sensing, Landsat 8, Polar Science&lt;br /&gt;
* '''Tentative title:''' Data and Code for Estimating and Evaluating Supraglacial Lake Depth With Landsat 8 and other Multispectral Sensors&lt;br /&gt;
* '''Short abstract:''' Supraglacial lakes play a significant role in glacial hydrological systems – for example, transporting water to the glacier bed in Greenland or leading to ice shelf fracture and disintegration in Antarctica. To investigate these important processes, multispectral remote sensing provides multiple methods for estimating supraglacial lake depth – either single-band or band-ratio methods, both empirical and physically based. Landsat 8 is the newest satellite in the Landsat series. With new bands, higher dynamic range, and higher radiometric resolution, the Operational Land Imager (OLI) aboard Landsat 8 shows considerable promise for this task. &lt;br /&gt;
&lt;br /&gt;
This paper will document the data and code used to process in situ reflectance spectra and depth measurements, to investigate the ability of Landsat 8 to estimate lake depths using multiple methods, and to quantify improvements over Landsat 7’s ETM+. A workflow, data, and code are provided to detail promising methods as applied to Landsat 8 OLI imagery of case study areas in Greenland, allowing calculation of regional volume estimates using 2013 and 2014 summer-season imagery. Altimetry from WorldView DEMs is used to validate lake depth estimates. The optimal method for supraglacial lake depth estimation with Landsat 8 is shown to be an average of the single-band depths from the red and panchromatic bands. Using this method, a preliminary investigation of the seasonal behavior and elevation distribution of lakes is also discussed and documented.&lt;br /&gt;
* '''Challenge:''' Reproducibility, Dark Code&lt;br /&gt;
* '''Relationship to other publications:''' Documenting and explaining the data and code behind the analysis and results presented in another paper.&lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Allen_Pope | Page]]&lt;br /&gt;
* '''Expected submission date:''' Late June 2015&lt;br /&gt;
&lt;br /&gt;
=== [Read and Winslow 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:''' [[Jordan Read]] and [[Luke Winslow]]&lt;br /&gt;
* '''Keywords of research area:'''&lt;br /&gt;
* '''Tentative title:'''&lt;br /&gt;
* '''Short abstract:'''&lt;br /&gt;
* '''Challenge:''' &lt;br /&gt;
* '''Relationship to other publications:''' (is the article based on a previously published article? is it new content?)&lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Jordan_Read | Page]]&lt;br /&gt;
* '''Expected submission date:'''&lt;br /&gt;
&lt;br /&gt;
=== [Tzeng 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:''' [[Mimi Tzeng]], Brian Dzwonkowski (DISL); Kyeong Park (TAMU Galveston)&lt;br /&gt;
* '''Keywords of research area:''' physical oceanography, remote sensing&lt;br /&gt;
* '''Tentative title:''' Fisheries Oceanography of Coastal Alabama (FOCAL): A Subset of a Time-Series of Hydrographic and Current Data from a Permanent Moored Station Outside Mobile Bay (27 Jan to 18 May 2011)&lt;br /&gt;
* '''Short abstract:''' The Fisheries Oceanography in Coastal Alabama (FOCAL) program began in 2006 as a way for scientists at the Dauphin Island Sea Lab (DISL) to study the natural variability of Alabama's nearshore environment as it relates to fisheries production. FOCAL provided a long-term baseline data set that included time-series hydrographic data from a permanent offshore mooring (ADCP, a vertical thermistor array, and CTDs at surface and bottom) and shipboard surveys (vertical CTD profiles and water sampling), as well as monthly depth-discrete ichthyoplankton and zooplankton sample collections at FOCAL sites. The subset of data presented here is from the mooring and includes the vertical thermistor array, CTDs at surface and bottom, an ADCP at the bottom, and vertical CTD profiles collected at the mooring during maintenance surveys. The mooring is located at 30 05.410'N 88 12.694'W, 25 km southwest of the entrance to Mobile Bay. Temperature, salinity, density, depth, and current velocity data were collected at 20-minute intervals from 2006 to 2012. Other parameters, such as dissolved oxygen, are available for portions of the time series, depending on which instruments were deployed at the time.&lt;br /&gt;
* '''Challenge:''' My paper will be about the processing of data in a larger dataset from which peer-reviewed papers have been written. The processing I did was not specific to any particular paper. I can point to an example paper that used some of the data I processed; however, all of the figures in that paper are composites that also include other data I had nothing to do with, and it would not be feasible to obtain those other data within our timeframe.&lt;br /&gt;
* '''Relationship to other publications:''' A recent paper that used the part of the FOCAL data I'm documenting as the sample from the larger dataset: Dzwonkowski, Brian, Kyeong Park, Jungwoo Lee, Bret M. Webb, and Arnoldo Valle-Levinson. 2014. &amp;quot;Spatial variability of flow over a river-influenced inner shelf in coastal Alabama during spring.&amp;quot; Continental Shelf Research 74:25-34.&lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Mimi_Tzeng | Page]]&lt;br /&gt;
* '''Expected submission date:'''&lt;br /&gt;
&lt;br /&gt;
=== [Villamizar 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:''' [[Sandra Villamizar]], University of California, Merced&lt;br /&gt;
* '''Keywords of research area:''' river ecohydrology&lt;br /&gt;
* '''Tentative title:''' Producing long-term series of whole-stream metabolism using readily available data.  &lt;br /&gt;
* '''Short abstract:''' Continuous water quality and river discharge data that are readily available through government websites may be used to produce valuable information about key processes within a river ecosystem. In this paper I describe in detail the steps for acquiring and processing river flow, dissolved oxygen, temperature, and specific conductance data that, combined with atmospheric data and physical properties of the river reach of interest, allow for the production of a long-term series of whole-stream metabolism. This information is key to understanding the structure and function of an ecosystem such as the San Joaquin River in the Central Valley of California, which has been increasingly degraded during the last 60 years due to intensive human intervention but has, since 2010, been undergoing a restoration effort. The key advantage of this tool is that it uses readily available information to produce knowledge about a river ecosystem. This set of scripts, written in R, can be used immediately for any other river for which the key parameters (river flow, dissolved oxygen, temperature, and specific conductance) are available. The scripts can also be modified by users to fit their particular site conditions.&lt;br /&gt;
 &lt;br /&gt;
* '''Challenge:''' Document new software/applications. This set of scripts was written out of the need to generate daily estimates of metabolic rates over long periods and at various sites within the San Joaquin River.  &lt;br /&gt;
* '''Relationship to other publications:''' This will be a new publication&lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Sandra_Villamizar | Page]]&lt;br /&gt;
* '''Expected submission date:''' To be defined&lt;br /&gt;
&lt;br /&gt;
=== [Yu and Bhatt 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:''' [[Xuan Yu]], Department of Geological Sciences, University of Delaware. Gopal Bhatt, Department of Civil &amp;amp; Environmental Engineering, Pennsylvania State University. &lt;br /&gt;
* '''Keywords of research area:''' coupled processes, integrated hydrologic modeling, PIHM, surface flow, subsurface flow, open science&lt;br /&gt;
* '''Tentative title:''' Learning integrated modeling of surface and subsurface flow from scratch&lt;br /&gt;
* '''Short abstract:''' Integrated modeling of surface and subsurface flow has been of great interest for understanding not only the intimate interconnectedness of hydrological processes, but also land-surface energy balance, biogeochemical and ecological processes, and landscape evolution. Although a growing number of complex hydrologic models have been used for resolving environmental processes, hypothesis testing, and hydrologic prediction for effective watershed management, very few model implementation resources have been made accessible to the broader community of model users. Users must invest a significant amount of time and effort to reproduce and understand the workflow of a hydrologic simulation described in a modeling paper. To provide a challenging and stimulating introduction to integrated modeling of surface and subsurface flow, in this paper we revisit the development of the Penn State Integrated Hydrologic Model (PIHM) by reproducing a numerical benchmarking example and a real-world catchment-scale application. Specifically, we document PIHM and its modeling workflow to enable a basic understanding of simulating coupled surface and subsurface flow processes. We provide the model and data to highlight the reciprocal roles between the two. In addition, we incorporate user experience as a third dimension in the modeling workflow to enable deeper communication between model developers and users. The workflow has important implications for smoothing and accelerating open scientific collaboration in geosciences research.&lt;br /&gt;
* '''Challenge:''' Reproduce previously published simulations from an existing model using its latest version. Benchmark the modeling application against a numerical experiment and field data.&lt;br /&gt;
* '''Relationship to other publications:''' The article is based on a previously published article. &lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Xuan_Yu | Page]]&lt;br /&gt;
* '''Expected submission date:''' End of June 2015&lt;br /&gt;
&lt;br /&gt;
== Special Issue Editors ==&lt;br /&gt;
&lt;br /&gt;
* Co-editor: Chris Duffy and/or Scott Peckham&lt;br /&gt;
* Co-editor: Cedric David&lt;br /&gt;
* Co-editor: possibly Karan Venayagamoorthy&lt;br /&gt;
&lt;br /&gt;
The editors will only accept submissions that follow the [[Develop_proposal_for_special_issue#Special_Issue_Review_Criteria | special issue review criteria]].&lt;br /&gt;
&lt;br /&gt;
The editors will select a set of reviewers to handle the submissions.  Reviewers will include computer scientists, library scientists, and geoscientists.&lt;br /&gt;
&lt;br /&gt;
== Special Issue Review Criteria ==&lt;br /&gt;
&lt;br /&gt;
The reviewers will be asked to provide feedback on the papers according to the following criteria. Note that some papers will have good reasons for limiting the information provided (e.g., the data come from third parties and are not openly available), and in that case they should document those reasons.&lt;br /&gt;
&lt;br /&gt;
* Documentation of the datasets: descriptions of datasets, unique identifiers, repositories.&lt;br /&gt;
* Documentation of software: description of all software used (including pre-processing of data, visualization steps, etc), unique identifiers, repositories.&lt;br /&gt;
* Documentation of the provenance of results: provenance for each figure or result, such as the workflow or the provenance record.&lt;br /&gt;
&lt;br /&gt;
== Tentative Timeline ==&lt;br /&gt;
&lt;br /&gt;
* Journal committed to special issue: April 15, 2015&lt;br /&gt;
* Submissions due to editors: June 30, 2015&lt;br /&gt;
* Reviews due: Sept 15, 2015&lt;br /&gt;
* Decisions out to authors: Sept 30, 2015&lt;br /&gt;
* Revisions due: October 31, 2015&lt;br /&gt;
* Final versions due: November 15, 2015&lt;br /&gt;
* Issue published: December 31, 2015&lt;br /&gt;
&lt;br /&gt;
&amp;lt;!-- Add any wiki Text above this Line --&amp;gt;&lt;br /&gt;
&amp;lt;!-- Do NOT Edit below this Line --&amp;gt;&lt;br /&gt;
{{#set:&lt;br /&gt;
	Owner=Chris_Duffy|&lt;br /&gt;
	Participants=Yolanda_Gil|&lt;br /&gt;
	Participants=Scott_Peckham|&lt;br /&gt;
	Participants=Cedric_David|&lt;br /&gt;
	Participants=Ibrahim_Demir|&lt;br /&gt;
	Participants=Wally_Fulweiler|&lt;br /&gt;
	Participants=Leif_Karlstrom|&lt;br /&gt;
	Participants=Kyo_Lee|&lt;br /&gt;
	Participants=Kim_Miller|&lt;br /&gt;
	Participants=Heath_Mills|&lt;br /&gt;
	Participants=Ji-Hyun_Oh|&lt;br /&gt;
	Participants=Suzanne_Pierce|&lt;br /&gt;
	Participants=Allen_Pope|&lt;br /&gt;
	Participants=Jordan_Read|&lt;br /&gt;
	Participants=Mimi_Tzeng|&lt;br /&gt;
	Participants=Sandra_Villamizar|&lt;br /&gt;
	Participants=Xuan_Yu|&lt;br /&gt;
	Progress=20|&lt;br /&gt;
	StartDate=2015-03-10|&lt;br /&gt;
	TargetDate=2015-03-16|&lt;br /&gt;
	Type=Low}}&lt;/div&gt;</summary>
		<author><name>Suzanne</name></author>	</entry>

	<entry>
		<id>https://www.organicdatascience.org/gpf/index.php?title=Develop_proposal_for_special_issue&amp;diff=11690</id>
		<title>Develop proposal for special issue</title>
		<link rel="alternate" type="text/html" href="https://www.organicdatascience.org/gpf/index.php?title=Develop_proposal_for_special_issue&amp;diff=11690"/>
				<updated>2015-04-03T17:49:51Z</updated>
		
		<summary type="html">&lt;p&gt;Suzanne: /* [Pierce 2015] */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;[[Category:Task]]&lt;br /&gt;
&lt;br /&gt;
== Background: Why a Special Issue on Geoscience Papers of the Future? ==&lt;br /&gt;
&lt;br /&gt;
[[Discuss_what_we_will_consider_a_GPF#The_Vision | Include here our discussion for the vision]]&lt;br /&gt;
&lt;br /&gt;
Background should be 1-2 pages.&lt;br /&gt;
&lt;br /&gt;
Motivated by the need to fully document research and make it accessible and reproducible. &lt;br /&gt;
&lt;br /&gt;
=== Motivation: The EarthCube Initiative and the GeoSoft Project ===&lt;br /&gt;
&lt;br /&gt;
[http://www.geosoft-earthcube.org/about Include here background about GeoSoft from the web site]&lt;br /&gt;
&lt;br /&gt;
OSTP memo.  EarthCube reports.&lt;br /&gt;
Other reports that talk about the need for new approaches to editing.&lt;br /&gt;
&lt;br /&gt;
It's possible that small or very large contributions are not well captured in the current publishing paradigms.  Nanopublications.&lt;br /&gt;
&lt;br /&gt;
For example, nanopublications are a possible way to reflect advances in a research process that may not merit a full publication but are nonetheless useful to share with the community. A challenge here is the stigma attached to publishing units that are very small.  &lt;br /&gt;
&lt;br /&gt;
Alternatively, a very large piece of research or work with many parts may be better suited to a GPF style publication.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Perhaps the concept of a 'paper' can be better reflected in the concept of a 'wrapper', or a collection of materials and resources. The purpose is to assure that publications are representative of the work, effort, and results achieved in the research process.&lt;br /&gt;
&lt;br /&gt;
=== What is a GPF ===&lt;br /&gt;
&lt;br /&gt;
[[Discuss_what_we_will_consider_a_GPF#What_is_a_Geoscience_Paper_of_the_Future.3F | Include here our discussion of what is a GPF]]&lt;br /&gt;
&lt;br /&gt;
=== The challenges of creating GPFs ===&lt;br /&gt;
&lt;br /&gt;
The articles in this issue reflect the current best practice for generating a Geoscience Paper of the Future.&lt;br /&gt;
&lt;br /&gt;
'''Figure discussions''': Do we want to regenerate exactly the same figure automatically? Figures in the paper may be clean versions of an image generated by software. To the extent possible, authors have included clear delineations of provenance. The goal is to assure that readers may regenerate the figures using documented workflows, data, and codes. An important note (Allen, Sandra) is that figures are frequently generated by code or scripts, yet the actual figure is finalized by hand by the user. Mimi is trying to say: is it really worth belaboring the point about how the prettified version of the figure is made? If it is: both of the visualization packages I've used (Matlab and SigmaPlot) have actual code in the background that specifies how to set up the prettification, and this code can be found, copied out, and rerun to generate the exact same figure with all of the prettification in the same place. SigmaPlot uses Visual Basic (I think) in its macros. If the point about explicit code is important, this should be doable. But I'm not sure it's strictly necessary to specify exactly where all the prettifications are to get the gist across.&lt;br /&gt;
&lt;br /&gt;
How much of one's experimental history should be included? (Ibrahim) The experimental process often leads nowhere. Should we document all the failed experiments? Get one DOI for the results of the successful experiment, and another for the failed trials?&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
'''''Documenting: Timing and Intermediate Processes'''''&lt;br /&gt;
When should we document and what are the bounds on what we document?&lt;br /&gt;
For example, should we document and include data and workflows for 'failed' experiments? Or should we assign datasets DOIs before we know the results from using them?  &lt;br /&gt;
The group thinks that good practices may include documenting and sharing data once you have a clear understanding of the outcomes worth reporting. For example, successful experiments should have clear, clean data documented and shared, whereas one strategy for 'failed' experiments could be to bundle the intermediate datasets under one DOI together with a more general discussion of the process and methods.&lt;br /&gt;
&lt;br /&gt;
=== Related work ===&lt;br /&gt;
&lt;br /&gt;
[[Discuss_what_we_will_consider_a_GPF#New_Frameworks_to_Create_a_New_Generation_of_Scientific_Articles | Include here the related work we have discussed]]&lt;br /&gt;
&lt;br /&gt;
== Papers to be included ==&lt;br /&gt;
&lt;br /&gt;
Would it be worthwhile to group the papers into broader categories rather than giving specifics about every single paper?&lt;br /&gt;
&lt;br /&gt;
For each submission, we describe:&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations'''&lt;br /&gt;
* '''Keywords of research area'''&lt;br /&gt;
* '''Tentative title'''&lt;br /&gt;
* '''Short abstract'''&lt;br /&gt;
* '''Challenge'''&lt;br /&gt;
* '''Relationship to other publications''' (is the article based on a previously published article? is it new content? If previously published, please provide a pointer to the published article and specify what percentage of the work presented will be new)&lt;br /&gt;
* '''Pointer to the wiki page that documents the article'''&lt;br /&gt;
* '''Expected submission date'''&lt;br /&gt;
&lt;br /&gt;
=== [David 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:''' [[Cedric David]]&lt;br /&gt;
* '''Keywords of research area:''' Hydrology, Rivers, Modeling, Testing, Reproducibility. &lt;br /&gt;
* '''Tentative title:''' Going beyond triple-checking, allowing for peace of mind in community model development.&lt;br /&gt;
* '''Short abstract:''' The development of computer models in the general field of geoscience is often done incrementally over many years.  Endeavors that generally start on a single researcher's own machine evolve over time into software that is often much larger than initially anticipated.  Looking back at years of building on their computer code, sometimes without much training in computer science, geoscience software developers can easily experience an overwhelming sense of incompetence when contemplating ways to further community usage of their software.  How does one allow others to use one's code?  How can one foster the survival of one's tool?  How could one possibly ensure the scientific integrity of ongoing developments, including those made by others?  Common issues faced by geoscience developers include selecting a license, learning how to track and document past and ongoing changes, choosing a software repository, and allowing for community development.  This paper provides a brief summary of experience with the first three of these steps, focusing on the almost decade-long development of a river routing model.  The core of this study, however, focuses on reproducing previously published experiments.  This step is highly repetitive and can therefore benefit greatly from automation.  Additionally, enabling automated software testing can arguably be considered the final step for sustainable software sharing, by allowing the main software developer to let go of a mental block concerning scientific integrity.  Creating tools to automatically compare the results of an updated version of a software package with those of previous studies can not only save the main developer's own time, it can also empower other researchers to check and demonstrate that their potential additions have retained scientific integrity.   &lt;br /&gt;
* '''Challenge:''' Ensure that updates to an existing model are able to reproduce a series of simulations published previously.&lt;br /&gt;
* '''Relationship to other publications:''' This research is related to past and ongoing development of the Routing Application for Parallel computatIon of Discharge (RAPID).  The primary focus of this paper is to allow automated reproducibility of at least the [http://dx.doi.org/10.1175/2011JHM1345.1 first RAPID publication].  The scientific subject of this GPF differs from the article(s) to be reproduced as its focus is on development of automatic testing methods.  In that regard, the paper is expected to be 95% new. &lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Cedric_David | Page]]&lt;br /&gt;
* '''Expected submission date:'''&lt;br /&gt;
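The automated reproducibility testing described above can be sketched as a regression test that re-runs the model and compares its output against archived results from a published study. The sketch below is purely illustrative: run_model, the config dictionary, and the tolerances are hypothetical stand-ins, not RAPID's actual interface.

```python
# Hypothetical sketch of an automated reproducibility test: re-run a model
# and compare its output against results archived from a published study.
# run_model() is a placeholder, not RAPID's real API; here it returns a
# deterministic array so the sketch is runnable end to end.
import numpy as np

def run_model(config):
    # Stand-in for invoking the updated model version with a given setup.
    rng = np.random.default_rng(seed=config["seed"])
    return rng.normal(size=10)

def test_reproduces_published_results():
    # Stand-in for loading the archived results of the published experiment.
    reference = np.random.default_rng(seed=42).normal(size=10)
    simulated = run_model({"seed": 42})
    # Tolerances acknowledge floating-point differences across compilers
    # and platforms rather than demanding bitwise identity.
    assert np.allclose(simulated, reference, rtol=1e-6, atol=1e-9)

test_reproduces_published_results()
print("published results reproduced within tolerance")
```

Run after every change to the code, a test like this lets contributors verify for themselves that their additions have not altered previously published results.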
&lt;br /&gt;
=== [Demir 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:''' [[Ibrahim Demir]]&lt;br /&gt;
* '''Keywords of research area:''' hydrological network, optimization, network representation, database query&lt;br /&gt;
* '''Tentative title:''' Analysis and Optimization of Hydrological Network Database Representation Methods for Fast Access and Query in Web-based System&lt;br /&gt;
* '''Short abstract:''' Web-based systems allow users to delineate watersheds in interactive map environments using server-side processing. With the increasing resolution of hydrological networks, optimized methods for storing network representations in databases, and efficient queries and actions on the river network structure, become critical. This paper presents a detailed analysis of widely used methods for representing hydrological networks in relational databases, benchmarking common queries and modifications on the network structure using these methods. The analysis has been applied to the hydrological network of Iowa, utilizing a 90 m DEM and 600,000 network nodes. The results indicate that the representation methods provide massive improvements in query times and in the storage of the network structure in the database. The suggested method allows watershed delineation tools to run client-side with desktop-like performance. &lt;br /&gt;
* '''Challenge:''' Some of the internal steps for preparing the data might require long computation times and different software environments.&lt;br /&gt;
* '''Relationship to other publications:''' The article is based on a new study&lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Ibrahim_Demir | Page]]&lt;br /&gt;
* '''Expected submission date:'''&lt;br /&gt;
&lt;br /&gt;
=== [Fulweiler 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:''' [[Wally Fulweiler]]&lt;br /&gt;
* '''Keywords of research area:'''&lt;br /&gt;
* '''Tentative title:'''&lt;br /&gt;
* '''Short abstract:'''&lt;br /&gt;
* '''Challenge:''' &lt;br /&gt;
* '''Relationship to other publications:''' (is the article based on a previously published article? is it new content?)&lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Wally_Fulweiler | Page]]&lt;br /&gt;
* '''Expected submission date:'''&lt;br /&gt;
&lt;br /&gt;
=== [Loh and Karlstrom 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:'''  [[Lay Kuan Loh]] and [[Leif Karlstrom]]&lt;br /&gt;
* '''Keywords of research area:''' Spatial clustering, Eigenvector selection, Entropy Ranking, Cascades Volcanic Region, [http://geosphere.gsapubs.org/content/3/3/152.abstract Afar Depression], [http://astrogeology.usgs.gov/search/details/Mars/Research/Volcanic/TharsisVents/zip Tharsis province]&lt;br /&gt;
* '''Tentative title:''' Characterization of volcanic vent distributions using spectral clustering with eigenvector selection and entropy ranking&lt;br /&gt;
* '''Short abstract:''' Volcanic vents on the surface of Earth and other planets often appear in groups that exhibit spatial patterning. Such vent distributions reflect a complex interplay between time-evolving mechanical controls on the pathways of magma ascent, background tectonic stresses, and unsteady supply of rising magma. With the ultimate aim of connecting surface vent distributions with the dynamics of magma ascent, we have developed a clustering method to quantify spatial patterns in vents. Clustering is typically used in exploratory data analysis to identify groups with similar behavior by partitioning a dataset into clusters that share similar attributes. Traditional clustering algorithms that work well on simple point-cloud-type synthetic datasets generally do not scale well to the real-world data we are interested in, where there are poor boundaries between clusters and much ambiguity in cluster assignments. We instead use a spectral clustering algorithm with eigenvector selection based on entropy ranking, building on work by [http://www.sciencedirect.com/science/article/pii/S0925231210001311 Zhao et al. 2010], that outperforms traditional spectral clustering algorithms in choosing the right number of clusters for point data. We benchmark this algorithm on synthetic vent data with increasingly complex spatial distributions, to test its ability to accurately cluster vent data with variable spatial density, skewness, number of clusters, and proximity of clusters. We then apply our algorithm to several real-world datasets from the Cascades, the Afar Depression, and Mars. &lt;br /&gt;
* '''Challenge:''' Quantifying clustering. We plan to study how varying the statistical distribution, density, skewness, background noise, number of clusters, proximity of clusters, and combinations of any of these factors affects the performance of our algorithm. We test it against synthetic and real-world datasets. &lt;br /&gt;
* '''Relationship to other publications:''' New content, but one of the databases we are studying in the paper (Cascades Volcanic Range) is based on a different paper that we are preparing and plan to submit earlier. &lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Leif_Karlstrom | Page]]&lt;br /&gt;
* '''Expected submission date:''' June 2015&lt;br /&gt;
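The embedding step described in the abstract can be sketched as follows. This is a minimal, illustrative Python sketch of spectral embedding with entropy-ranked eigenvector selection, not the authors' actual code; the Gaussian affinity, the entropy heuristic on squared eigenvector entries, and all function names are assumptions based on the abstract and on standard spectral clustering practice.

```python
import numpy as np

def affinity(points, sigma=1.0):
    # Gaussian affinity matrix from pairwise squared distances.
    d2 = ((points[:, None, :] - points[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * sigma ** 2))

def entropy_ranked_eigenvectors(W, k):
    # Symmetric normalized Laplacian: L = I - D^(-1/2) W D^(-1/2).
    d = W.sum(axis=1)
    d_inv_sqrt = np.diag(1.0 / np.sqrt(d))
    L = np.eye(len(W)) - d_inv_sqrt @ W @ d_inv_sqrt
    vals, vecs = np.linalg.eigh(L)
    # Rank eigenvectors by the Shannon entropy of their normalized
    # squared entries; a low-entropy eigenvector concentrates its
    # weight on few points and is treated as more informative.
    entropies = []
    for j in range(vecs.shape[1]):
        p = vecs[:, j] ** 2
        p = p / p.sum()
        entropies.append(-(p * np.log(p + 1e-12)).sum())
    order = np.argsort(entropies)
    # Keep the k most informative eigenvectors as the embedding.
    return vecs[:, order[:k]]
```

In practice the rows of the returned embedding would then be grouped (e.g. with k-means), with the number of clusters chosen from the entropy ranking rather than fixed in advance.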
&lt;br /&gt;
=== [Lee 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:''' [[Kyo Lee]], Maziyar Boustani and Chris Mattmann, Jet Propulsion Laboratory&lt;br /&gt;
* '''Keywords of research area:''' North American regional climate, regional climate model evaluation system, Open Climate Workbench&lt;br /&gt;
* '''Tentative title:''' Evaluation of simulated temperature, precipitation, cloud fraction and insolation over the conterminous United States using Regional Climate Model Evaluation System&lt;br /&gt;
* '''Short abstract:''' This study describes the detailed process of evaluating model fidelity in simulating four key climate variables (surface air temperature, precipitation, cloud fraction, and insolation) and their covariability over the conterminous United States. The Regional Climate Model Evaluation System (RCMES), a suite of public databases and open-source software packages, provides both observational datasets and data processors useful for evaluating any climate model. In this paper, we provide a clear and easy-to-follow RCMES workflow to replicate published papers evaluating North American Regional Climate Change Assessment Program (NARCCAP) regional climate model (RCM) hindcast simulations against observations from a variety of sources. &lt;br /&gt;
* '''Challenge:''' Sharing big data, better documenting source code, and encouraging the climate science community to use RCMES  &lt;br /&gt;
* '''Relationship to other publications:''' [http://journals.ametsoc.org/doi/abs/10.1175/JCLI-D-12-00452.1 Kim et al. 2013], [http://link.springer.com/article/10.1007/s00382-014-2253-y Lee et al. 2014]&lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Kyo_Lee | Page]]&lt;br /&gt;
* '''Expected submission date:''' End of June 2015&lt;br /&gt;
&lt;br /&gt;
=== [Miller 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:''' [[Kim Miller]]&lt;br /&gt;
* '''Keywords of research area:'''&lt;br /&gt;
* '''Tentative title:'''&lt;br /&gt;
* '''Short abstract:'''&lt;br /&gt;
* '''Challenge:''' &lt;br /&gt;
* '''Relationship to other publications:''' (is the article based on a previously published article? is it new content?)&lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Kim_Miller | Page]]&lt;br /&gt;
* '''Expected submission date:'''&lt;br /&gt;
&lt;br /&gt;
=== [Mills 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:''' [[Heath Mills]], University of Houston Clear Lake; Brandi Kiel Reese, Texas A&amp;amp;M Corpus Christi&lt;br /&gt;
* '''Keywords of research area:'''&lt;br /&gt;
* '''Tentative title:'''Iron and Sulfur Cycling Biogeography Using Advanced Geochemical and Molecular Analyses&lt;br /&gt;
* '''Short abstract:'''&lt;br /&gt;
* '''Challenge:''' My paper will develop and document a new pipeline to analyze a combined and robust genetic and geochemical data set. New, reproducible methods will be highlighted in this manuscript to help others better analyze similar data sets. There is a general lack of guidance within my field for such challenges. This manuscript will be unique and helpful from an analysis standpoint as well as for the science being presented.&lt;br /&gt;
* '''Relationship to other publications:''' Original Manuscript&lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Heith_Mills | Page]]&lt;br /&gt;
* '''Expected submission date:'''&lt;br /&gt;
&lt;br /&gt;
=== [Oh 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:''' [[Ji-Hyun Oh]], Jet Propulsion Laboratory/University of Southern California&lt;br /&gt;
* '''Keywords of research area:''' Tropical Meteorology, Madden-Julian Oscillation, Momentum budget analysis &lt;br /&gt;
* '''Tentative title:''' Tools for computing momentum budget for the westerly wind event associated with the Madden-Julian Oscillation&lt;br /&gt;
* '''Short abstract:''' As one of the most pronounced modes of tropical intraseasonal variability, the Madden-Julian Oscillation (MJO) prominently connects global weather and climate, and serves as one of the critical sources of predictability for extended-range forecasting. The zonal circulation of the MJO is characterized by low-level westerlies (easterlies) in and to the west (east) of the convective center. The direction of zonal winds in the upper troposphere is opposite to that in the lower troposphere. In addition to the convective signal as an identifier of MJO initiation, certain characteristics of the zonal circulation have been used as a standard metric for monitoring the state of the MJO and for investigating features of the MJO and its impact on other atmospheric phenomena. This paper documents a tool for investigating the generation of low-level westerly winds during the MJO life cycle. The tool is used for momentum budget analysis to understand the respective contributions of the various processes involved in the wind evolution associated with the MJO, using European Centre for Medium-Range Weather Forecasts operational analyses from the Dynamics of the Madden–Julian Oscillation field campaign.&lt;br /&gt;
&lt;br /&gt;
* '''Challenge:''' This paper will cover how to reproduce two key figures from a paper that I recently submitted to the Journal of the Atmospheric Sciences. This will include detailed procedures for generating the figures, such as how and where to download the data, how to transform the data into the input format for my code, and so on. &lt;br /&gt;
* '''Relationship to other publications:''' This article is related to part of a paper submitted to the Journal of the Atmospheric Sciences. &lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Ji_Hyun | Page]]&lt;br /&gt;
* '''Expected submission date:'''&lt;br /&gt;
&lt;br /&gt;
=== [Pierce 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:''' [[Suzanne Pierce ^1,2^]], [[John Gentle^1^]], [[Daniel Noll^2,3^]]&lt;br /&gt;
1 Texas Advanced Computing Center&lt;br /&gt;
2 Jackson School of Geosciences, The University of Texas at Austin&lt;br /&gt;
3 International Fellows, US Department of Energy&lt;br /&gt;
&lt;br /&gt;
* '''Keywords of research area:''' Hydrogeology, Risk &lt;br /&gt;
* '''Tentative title:'''&lt;br /&gt;
* '''Short abstract:'''&lt;br /&gt;
&lt;br /&gt;
* '''Challenge:''' Fully document a new software application and framework using example case study data and tutorials.&lt;br /&gt;
* '''Relationship to other publications:''' This article is new content&lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Suzanne_Pierce | Page]]&lt;br /&gt;
* '''Expected submission date:'''&lt;br /&gt;
&lt;br /&gt;
=== [Pope 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:''' [[Allen Pope]], National Snow and Ice Data Center, University of Colorado, Boulder&lt;br /&gt;
* '''Keywords of research area:''' Glaciology, Remote Sensing, Landsat 8, Polar Science&lt;br /&gt;
* '''Tentative title:''' Data and Code for Estimating and Evaluating Supraglacial Lake Depth With Landsat 8 and other Multispectral Sensors&lt;br /&gt;
* '''Short abstract:''' Supraglacial lakes play a significant role in glacial hydrological systems – for example, transporting water to the glacier bed in Greenland or leading to ice shelf fracture and disintegration in Antarctica. To investigate these important processes, multispectral remote sensing provides multiple methods for estimating supraglacial lake depth – either through single-band or band-ratio methods, both empirical and physically-based. Landsat 8 is the newest satellite in the Landsat series. With new bands, higher dynamic range, and higher radiometric resolution, the Operational Land Imager (OLI) aboard Landsat 8 offers improved potential for estimating supraglacial lake depth. &lt;br /&gt;
&lt;br /&gt;
This paper will document the data and code used in processing in situ reflectance spectra and depth measurements to investigate the ability of Landsat 8 to estimate lake depths using multiple methods, as well as quantify improvements over Landsat 7’s ETM+. A workflow, data, and code are provided to detail promising methods as applied to Landsat 8 OLI imagery of case study areas in Greenland, allowing calculation of regional volume estimates using 2013 and 2014 summer-season imagery. Altimetry from WorldView DEMs is used to validate lake depth estimates. The optimal method for supraglacial lake depth estimation with Landsat 8 is shown to be an average of the single-band depths from the red and panchromatic bands. Using this method, a preliminary investigation of the seasonal behavior and elevation distribution of lakes is also discussed and documented.&lt;br /&gt;
* '''Challenge:''' Reproducibility, Dark Code&lt;br /&gt;
* '''Relationship to other publications:''' Documenting and explaining the data and code behind the analysis and results presented in another paper.&lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Allen_Pope | Page]]&lt;br /&gt;
* '''Expected submission date:''' Late June 2015&lt;br /&gt;
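As a rough illustration of the physically based single-band approach mentioned in the abstract, the sketch below implements the classic exponential attenuation retrieval (after Philpot, 1989), in Python rather than the paper's actual processing code. The parameter values (lake-bottom albedo Ad, optically deep reflectance Rinf, two-way attenuation g) are placeholders; in the paper they would be derived per band from the in situ spectra.

```python
import numpy as np

def single_band_depth(Rw, Ad=0.45, Rinf=0.05, g=0.8):
    # Physically based single-band retrieval: with bottom albedo Ad,
    # optically deep reflectance Rinf, and two-way attenuation g,
    #   z = (ln(Ad - Rinf) - ln(Rw - Rinf)) / g
    Rw = np.asarray(Rw, dtype=float)
    return (np.log(Ad - Rinf) - np.log(Rw - Rinf)) / g

def averaged_depth(R_red, R_pan, **band_params):
    # Average of the red- and panchromatic-band depths, mirroring the
    # "average of single-band depths" estimator described above (the
    # same placeholder parameters are used for both bands here).
    return 0.5 * (single_band_depth(R_red, **band_params)
                  + single_band_depth(R_pan, **band_params))
```

Darker water (reflectance approaching Rinf) maps to greater depth, and reflectance equal to the bottom albedo maps to zero depth, which is the behavior the retrieval relies on.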
&lt;br /&gt;
=== [Read and Winslow 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:''' [[Jordan Read]] and [[Luke Winslow]]&lt;br /&gt;
* '''Keywords of research area:'''&lt;br /&gt;
* '''Tentative title:'''&lt;br /&gt;
* '''Short abstract:'''&lt;br /&gt;
* '''Challenge:''' &lt;br /&gt;
* '''Relationship to other publications:''' (is the article based on a previously published article? is it new content?)&lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Jordan_Read | Page]]&lt;br /&gt;
* '''Expected submission date:'''&lt;br /&gt;
&lt;br /&gt;
=== [Tzeng 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:''' [[Mimi Tzeng]], Brian Dzwonkowski (DISL); Kyeong Park (TAMU Galveston)&lt;br /&gt;
* '''Keywords of research area:''' physical oceanography, remote sensing&lt;br /&gt;
* '''Tentative title:''' Fisheries Oceanography of Coastal Alabama (FOCAL): A Subset of a Time-Series of Hydrographic and Current Data from a Permanent Moored Station Outside Mobile Bay (27 Jan to 18 May 2011)&lt;br /&gt;
* '''Short abstract:''' The Fisheries Oceanography in Coastal Alabama (FOCAL) program began in 2006 as a way for scientists at the Dauphin Island Sea Lab (DISL) to study the natural variability of Alabama's nearshore environment as it relates to fisheries production. FOCAL provided a long-term baseline data set that included time-series hydrographic data from a permanent offshore mooring (ADCP, vertical thermistor array, and CTDs at surface and bottom) and shipboard surveys (vertical CTD profiles and water sampling), as well as monthly ichthyoplankton and zooplankton (depth-discrete) sample collections at FOCAL sites. The subset of data presented here is from the mooring and includes a vertical array of thermistors, CTDs at surface and bottom, an ADCP at the bottom, and vertical CTD profiles collected at the mooring during maintenance surveys. The mooring is located at 30 05.410'N 88 12.694'W, 25 km southwest of the entrance to Mobile Bay. Temperature, salinity, density, depth, and current velocity data were collected at 20-minute intervals from 2006 to 2012. Other parameters, such as dissolved oxygen, are available for portions of the time series depending on which instruments were deployed at the time.&lt;br /&gt;
* '''Challenge:''' My paper will be about the processing of data in a larger dataset, from which peer-reviewed papers have been written. The processing I did was not specific to any particular paper. I can point to an example paper that used some of the data I processed from this dataset; however, all of the figures in that paper are composites that also include other data from elsewhere that I had nothing to do with (and it would not be feasible to obtain that other data within our timeframe).&lt;br /&gt;
* '''Relationship to other publications:''' A recent paper that used the part of the FOCAL data I'm documenting as the sample from the larger dataset: Dzwonkowski, Brian, Kyeong Park, Jungwoo Lee, Bret M. Webb, and Arnoldo Valle-Levinson. 2014. &amp;quot;Spatial variability of flow over a river-influenced inner shelf in coastal Alabama during spring.&amp;quot; Continental Shelf Research 74:25-34.&lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Mimi_Tzeng | Page]]&lt;br /&gt;
* '''Expected submission date:'''&lt;br /&gt;
&lt;br /&gt;
=== [Villamizar 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:''' [[Sandra Villamizar]], University of California, Merced&lt;br /&gt;
* '''Keywords of research area:''' river ecohydrology&lt;br /&gt;
* '''Tentative title:''' Producing long-term series of whole-stream metabolism using readily available data.  &lt;br /&gt;
* '''Short abstract:''' Continuous water quality and river discharge data that are readily available through government websites may be used to produce valuable information about key processes within a river ecosystem. In this paper I describe in detail the steps for acquisition and processing of river flow, dissolved oxygen, temperature, and specific conductance data that, combined with atmospheric data and physical properties of the river reach of interest, allow for the production of a long-term series of whole-stream metabolism. This information is key to understanding the structure and function of an ecosystem such as the San Joaquin River in the Central Valley of California, which has been increasingly degraded over the last 60 years by intensive human intervention and, since 2010, has been undergoing a restoration effort. The key advantage of this tool is that it uses readily available information to produce knowledge about a river ecosystem. This set of scripts, written in R, can be used immediately for any other river for which the key parameters (river flow, dissolved oxygen, temperature, and specific conductivity) are available. The scripts can also be modified by users to fit their particular site conditions.&lt;br /&gt;
 &lt;br /&gt;
* '''Challenge:''' Document new software/applications. This set of scripts was written in response to the need to generate daily estimates of metabolic rates over long periods of time and at various sites within the San Joaquin River.  &lt;br /&gt;
* '''Relationship to other publications:''' This will be a new publication&lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Sandra_Villamizar | Page]]&lt;br /&gt;
* '''Expected submission date:''' To be defined&lt;br /&gt;
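The core calculation behind such scripts is the standard open-water oxygen mass balance (Odum's single-station method). The author's scripts are in R; the sketch below is a language-neutral Python illustration of that balance, not the actual code, and the names K (reaeration coefficient) and Osat (saturation concentration) are assumed for illustration.

```python
import numpy as np

def net_metabolism(O, Osat, K, dt_hours):
    # Open-water, single-station oxygen mass balance:
    #   NEP(t) = dO/dt - K * (Osat - O)
    # Positive NEP indicates net production; negative, net respiration.
    O = np.asarray(O, dtype=float)
    dOdt = np.gradient(O, dt_hours)          # finite-difference dO/dt
    reaeration = K * (np.asarray(Osat) - O)  # gas exchange with the atmosphere
    return dOdt - reaeration
```

Gross primary production and ecosystem respiration would then be separated from NEP by, for example, attributing nighttime values entirely to respiration, as is standard in whole-stream metabolism workflows.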
&lt;br /&gt;
=== [Yu and Bhatt 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:''' [[Xuan Yu]], Department of Geological Sciences, University of Delaware. Gopal Bhatt, Department of Civil &amp;amp; Environmental Engineering, Pennsylvania State University. &lt;br /&gt;
* '''Keywords of research area:''' coupled processes, integrated hydrologic modeling, PIHM, surface flow, subsurface flow, open science&lt;br /&gt;
* '''Tentative title:''' Learning integrated modeling of surface and subsurface flow from scratch&lt;br /&gt;
* '''Short abstract:''' Integrated modeling of surface and subsurface flow has been of great interest for understanding not only the intimate interconnectedness of hydrological processes, but also land-surface energy balance, biogeochemical and ecological processes, and landscape evolution. Although a growing number of complex hydrologic models have been used for resolving environmental processes, testing hypotheses, and making hydrologic predictions for effective watershed management, very few model implementation resources have been made accessible to the broader community of model users. Users must invest a significant amount of time and effort to reproduce and understand the workflow of the hydrologic simulations in a modeling paper. To provide a challenging and stimulating introduction to integrated modeling of surface and subsurface flow, in this paper we revisit the development of the Penn State Integrated Hydrologic Model (PIHM) by reproducing a numerical benchmarking example and a real-world catchment-scale application. Specifically, we document PIHM and its modeling workflow to enable a basic understanding of simulating coupled surface and subsurface flow processes. We provide the model and data to highlight the reciprocal roles between the two. In addition, we incorporate user experience as a third dimension in the modeling workflow to enable deeper communication between model developers and users. The workflow has important implications for smoothing and accelerating open scientific collaboration in geosciences research.&lt;br /&gt;
* '''Challenge:''' Reproduce published simulations from an existing model using its latest version. Benchmark the modeling application against a numerical experiment and field data.&lt;br /&gt;
* '''Relationship to other publications:''' The article is based on a previously published article. &lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Xuan_Yu | Page]]&lt;br /&gt;
* '''Expected submission date:''' End of June 2015&lt;br /&gt;
&lt;br /&gt;
== Special Issue Editors ==&lt;br /&gt;
&lt;br /&gt;
* Co-editor: Chris Duffy and/or Scott Peckham&lt;br /&gt;
* Co-editor: Cedric David&lt;br /&gt;
* Co-editor: possibly Karan Venayagamoorthy&lt;br /&gt;
&lt;br /&gt;
The editors will only accept submissions that follow the [[Develop_proposal_for_special_issue#Special_Issue_Review_Criteria | special issue review criteria]].&lt;br /&gt;
&lt;br /&gt;
The editors will select a set of reviewers to handle the submissions.  Reviewers will include computer scientists, library scientists, and geoscientists.&lt;br /&gt;
&lt;br /&gt;
== Special Issue Review Criteria ==&lt;br /&gt;
&lt;br /&gt;
The reviewers will be asked to provide feedback on the papers according to the following criteria. Note that some papers will have good reasons for limiting the information (e.g. the data is from third parties and not openly available, etc), and in that case they would document those reasons.&lt;br /&gt;
&lt;br /&gt;
* Documentation of the datasets: descriptions of datasets, unique identifiers, repositories.&lt;br /&gt;
* Documentation of software: description of all software used (including pre-processing of data, visualization steps, etc), unique identifiers, repositories.&lt;br /&gt;
* Documentation of the provenance of results: provenance for each figure or result, such as the workflow or the provenance record.&lt;br /&gt;
&lt;br /&gt;
== Tentative Timeline ==&lt;br /&gt;
&lt;br /&gt;
* Journal committed to special issue: April 15, 2015&lt;br /&gt;
* Submissions due to editors: June 30, 2015&lt;br /&gt;
* Reviews due: Sept 15, 2015&lt;br /&gt;
* Decisions out to authors: Sept 30, 2015&lt;br /&gt;
* Revisions due: October 31, 2015&lt;br /&gt;
* Final versions due November 15, 2015&lt;br /&gt;
* Issue published December 31, 2015&lt;br /&gt;
&lt;br /&gt;
&amp;lt;!-- Add any wiki Text above this Line --&amp;gt;&lt;br /&gt;
&amp;lt;!-- Do NOT Edit below this Line --&amp;gt;&lt;br /&gt;
{{#set:&lt;br /&gt;
	Owner=Chris_Duffy|&lt;br /&gt;
	Participants=Yolanda_Gil|&lt;br /&gt;
	Participants=Scott_Peckham|&lt;br /&gt;
	Participants=Cedric_David|&lt;br /&gt;
	Participants=Ibrahim_Demir|&lt;br /&gt;
	Participants=Wally_Fulweiler|&lt;br /&gt;
	Participants=Leif_Karlstrom|&lt;br /&gt;
	Participants=Kyo_Lee|&lt;br /&gt;
	Participants=Kim_Miller|&lt;br /&gt;
	Participants=Heath_Mills|&lt;br /&gt;
	Participants=Ji-Hyun_Oh|&lt;br /&gt;
	Participants=Suzanne_Pierce|&lt;br /&gt;
	Participants=Allen_Pope|&lt;br /&gt;
	Participants=Jordan_Read|&lt;br /&gt;
	Participants=Mimi_Tzeng|&lt;br /&gt;
	Participants=Sandra_Villamizar|&lt;br /&gt;
	Participants=Xuan_Yu|&lt;br /&gt;
	Progress=20|&lt;br /&gt;
	StartDate=2015-03-10|&lt;br /&gt;
	TargetDate=2015-03-16|&lt;br /&gt;
	Type=Low}}&lt;/div&gt;</summary>
		<author><name>Suzanne</name></author>	</entry>

	<entry>
		<id>https://www.organicdatascience.org/gpf/index.php?title=Document_provenance_of_results_by_Suzanne_Pierce&amp;diff=11689</id>
		<title>Document provenance of results by Suzanne Pierce</title>
		<link rel="alternate" type="text/html" href="https://www.organicdatascience.org/gpf/index.php?title=Document_provenance_of_results_by_Suzanne_Pierce&amp;diff=11689"/>
				<updated>2015-04-03T17:41:33Z</updated>
		
		<summary type="html">&lt;p&gt;Suzanne: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;[[Category:Task]]&lt;br /&gt;
&amp;lt;br/&amp;gt;&amp;lt;b&amp;gt;Details on how to do this task:&amp;lt;/b&amp;gt; [[Document the provenance of the results]]&amp;lt;br/&amp;gt;&amp;lt;br/&amp;gt;&lt;br /&gt;
&amp;lt;!-- Add any wiki Text above this Line --&amp;gt;&lt;br /&gt;
&amp;lt;!-- Do NOT Edit below this Line --&amp;gt;&lt;br /&gt;
{{#set:&lt;br /&gt;
	Expertise=Open_science|&lt;br /&gt;
	Expertise=Geosciences|&lt;br /&gt;
	Owner=Suzanne_Pierce|&lt;br /&gt;
	Progress=50|&lt;br /&gt;
	StartDate=2015-03-07|&lt;br /&gt;
	TargetDate=2015-04-15|&lt;br /&gt;
	Type=Low}}&lt;br /&gt;
&lt;br /&gt;
'''Comments on Progress and planned next steps:'''&lt;br /&gt;
&lt;br /&gt;
This task has been set aside for a couple of weeks (as of April 3, 2015).  I anticipate picking it back up next week, on April 8th, and hope to complete it by April 15th. The dates for this task have been modified to reflect that expectation.&lt;br /&gt;
&lt;br /&gt;
One nice thing about participating in the GPF process is the external timeline and shared group interactions. It helps me keep this paper on my day-to-day list of priorities and continue to make progress. Often when you are writing a manuscript on your own, it's easy to set it aside or lose momentum.  The GeoSoft GPF schedule is helping me tackle the many steps that are necessary to publish interactive/software/complex workflow types of research. &lt;br /&gt;
&lt;br /&gt;
'''Selecting Data Sets for GPF and Tutorial:'''&lt;br /&gt;
The timeline slowed a little because we decided to request the original data input files from my former graduate student. He copied all of his files from his thesis and mailed them to me on an external hard drive. This reflects a common practice for data sharing and management in geosciences. Frequently, students complete a research project and they keep the data and files on their own machines or resources. When students leave the university setting this base data is frequently lost. We are fortunate that, in this case, my graduate student has remained in contact with me and he was still able to access the files and send them to me.&lt;br /&gt;
&lt;br /&gt;
It's made it clear to me that I need to set up a data sharing and management plan for my research group.&lt;br /&gt;
&lt;br /&gt;
'''Observations on making data public'''&lt;br /&gt;
1) challenge to select and assure that dataset is absolutely correct&lt;br /&gt;
2) takes time to document dataset, source and context to be sure it is a standalone object.&lt;br /&gt;
3) Coordination among team members is needed to be sure that all the pieces are uploaded&lt;br /&gt;
4) There has to be a clear plan and choices for what is necessary&lt;br /&gt;
5) bundling and project options on figshare should be useful (but I'm still trying to understand how they work)&lt;br /&gt;
6) the free level of access on figshare is very good, but the file size limitations are an issue for many file sets (e.g. a set of candidate solutions over the 1GB limit) - re-structuring uploads and figshare units to accommodate. May need to subscribe to figshare at an appropriate level (= a long-term cost to the researcher to keep data up and available)&lt;br /&gt;
7) Already see benefits because we know exactly where a dataset is (one that we have to search for all the time) - now it has a persistent place (easy to find)&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
'''Tutorial Files, links, workflow'''&lt;br /&gt;
&lt;br /&gt;
OSGeo4w [http://trac.osgeo.org/osgeo4w/]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
'''Publishing/Sharing datasets'''&lt;br /&gt;
&lt;br /&gt;
Citation:&lt;br /&gt;
Pierce, Suzanne (2015): Gridded Shapefile for the Recharge Zones of the Barton Springs segment of the Edwards Aquifer. figshare.&lt;br /&gt;
http://dx.doi.org/10.6084/m9.figshare.1330145 Retrieved 16:56, Mar 10, 2015 (GMT)&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Workflow Sketch&lt;br /&gt;
&lt;br /&gt;
[[File:ConceptualWorkflowSketch_GPF.jpg]]&lt;/div&gt;</summary>
		<author><name>Suzanne</name></author>	</entry>

	<entry>
		<id>https://www.organicdatascience.org/gpf/index.php?title=Document_provenance_of_results_by_Suzanne_Pierce&amp;diff=11688</id>
		<title>Document provenance of results by Suzanne Pierce</title>
		<link rel="alternate" type="text/html" href="https://www.organicdatascience.org/gpf/index.php?title=Document_provenance_of_results_by_Suzanne_Pierce&amp;diff=11688"/>
				<updated>2015-04-03T17:38:15Z</updated>
		
		<summary type="html">&lt;p&gt;Suzanne: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;[[Category:Task]]&lt;br /&gt;
&amp;lt;br/&amp;gt;&amp;lt;b&amp;gt;Details on how to do this task:&amp;lt;/b&amp;gt; [[Document the provenance of the results]]&amp;lt;br/&amp;gt;&amp;lt;br/&amp;gt;&lt;br /&gt;
&amp;lt;!-- Add any wiki Text above this Line --&amp;gt;&lt;br /&gt;
&amp;lt;!-- Do NOT Edit below this Line --&amp;gt;&lt;br /&gt;
{{#set:&lt;br /&gt;
	Expertise=Open_science|&lt;br /&gt;
	Expertise=Geosciences|&lt;br /&gt;
	Owner=Suzanne_Pierce|&lt;br /&gt;
	Progress=50|&lt;br /&gt;
	StartDate=2015-03-07|&lt;br /&gt;
	TargetDate=2015-04-15|&lt;br /&gt;
	Type=Low}}&lt;br /&gt;
&lt;br /&gt;
'''Comments on Progress and planned next steps:'''&lt;br /&gt;
&lt;br /&gt;
This task has been set aside for a couple of weeks (as of April 3, 2015).  I anticipate picking it back up next week, on April 8th, and hope to complete it by April 15th. The dates for this task have been modified to reflect that expectation.&lt;br /&gt;
&lt;br /&gt;
One nice thing about participating in the GPF process is the external timeline and shared group interactions. It helps me keep this paper on my day-to-day list of priorities and continue to make progress. Often when you are writing a manuscript on your own, it's easy to set it aside or lose momentum.  The GeoSoft GPF schedule is helping me tackle the many steps that are necessary to publish interactive/software/complex workflow types of research. &lt;br /&gt;
&lt;br /&gt;
'''Selecting Data Sets for GPF and Tutorial:'''&lt;br /&gt;
&lt;br /&gt;
'''Observations on making data public'''&lt;br /&gt;
1) challenge to select and assure that dataset is absolutely correct&lt;br /&gt;
2) takes time to document dataset, source and context to be sure it is a standalone object.&lt;br /&gt;
3) Coordination among team members is needed to be sure that all the pieces are uploaded&lt;br /&gt;
4) There has to be a clear plan and choices for what is necessary&lt;br /&gt;
5) bundling and project options on figshare should be useful (but I'm still trying to understand how they work)&lt;br /&gt;
6) the free level of access on figshare is very good, but the file size limitations are an issue for many file sets (e.g. a set of candidate solutions over the 1GB limit) - re-structuring uploads and figshare units to accommodate. May need to subscribe to figshare at an appropriate level (= a long-term cost to the researcher to keep data up and available)&lt;br /&gt;
7) Already see benefits because we know exactly where a dataset is (one that we have to search for all the time) - now it has a persistent place (easy to find)&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
'''Tutorial Files, links, workflow'''&lt;br /&gt;
&lt;br /&gt;
OSGeo4w [http://trac.osgeo.org/osgeo4w/]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
'''Publishing/Sharing datasets'''&lt;br /&gt;
&lt;br /&gt;
Citation:&lt;br /&gt;
Pierce, Suzanne (2015): Gridded Shapefile for the Recharge Zones of the Barton Springs segment of the Edwards Aquifer. figshare.&lt;br /&gt;
http://dx.doi.org/10.6084/m9.figshare.1330145 Retrieved 16:56, Mar 10, 2015 (GMT)&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Workflow Sketch&lt;br /&gt;
&lt;br /&gt;
[[File:ConceptualWorkflowSketch_GPF.jpg]]&lt;/div&gt;</summary>
		<author><name>Suzanne</name></author>	</entry>

	<entry>
		<id>https://www.organicdatascience.org/gpf/index.php?title=Document_provenance_of_results_by_Suzanne_Pierce&amp;diff=11687</id>
		<title>Document provenance of results by Suzanne Pierce</title>
		<link rel="alternate" type="text/html" href="https://www.organicdatascience.org/gpf/index.php?title=Document_provenance_of_results_by_Suzanne_Pierce&amp;diff=11687"/>
				<updated>2015-04-03T17:33:49Z</updated>
		
		<summary type="html">&lt;p&gt;Suzanne: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;[[Category:Task]]&lt;br /&gt;
&amp;lt;br/&amp;gt;&amp;lt;b&amp;gt;Details on how to do this task:&amp;lt;/b&amp;gt; [[Document the provenance of the results]]&amp;lt;br/&amp;gt;&amp;lt;br/&amp;gt;&lt;br /&gt;
&amp;lt;!-- Add any wiki Text above this Line --&amp;gt;&lt;br /&gt;
&amp;lt;!-- Do NOT Edit below this Line --&amp;gt;&lt;br /&gt;
{{#set:&lt;br /&gt;
	Expertise=Open_science|&lt;br /&gt;
	Expertise=Geosciences|&lt;br /&gt;
	Owner=Suzanne_Pierce|&lt;br /&gt;
	Progress=50|&lt;br /&gt;
	StartDate=2015-03-07|&lt;br /&gt;
	TargetDate=2015-03-20|&lt;br /&gt;
	Type=Low}}&lt;br /&gt;
&lt;br /&gt;
'''Selecting Data Sets for GPF and Tutorial:'''&lt;br /&gt;
&lt;br /&gt;
'''Observations on making data public'''&lt;br /&gt;
1) challenge to select and assure that dataset is absolutely correct&lt;br /&gt;
2) takes time to document dataset, source and context to be sure it is a standalone object.&lt;br /&gt;
3) Coordination among team members is needed to be sure that all the pieces are uploaded&lt;br /&gt;
4) There has to be a clear plan and choices for what is necessary&lt;br /&gt;
5) bundling and project options on figshare should be useful (but I'm still trying to understand how they work)&lt;br /&gt;
6) the free level of access on figshare is very good, but the file size limitations are an issue for many file sets (e.g. a set of candidate solutions over the 1GB limit) - re-structuring uploads and figshare units to accommodate. May need to subscribe to figshare at an appropriate level (= a long-term cost to the researcher to keep data up and available)&lt;br /&gt;
7) Already see benefits because we know exactly where a dataset is (one that we have to search for all the time) - now it has a persistent place (easy to find)&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
'''Tutorial Files, links, workflow'''&lt;br /&gt;
&lt;br /&gt;
OSGeo4w [http://trac.osgeo.org/osgeo4w/]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
'''Publishing/Sharing datasets'''&lt;br /&gt;
&lt;br /&gt;
Citation:&lt;br /&gt;
Pierce, Suzanne (2015): Gridded Shapefile for the Recharge Zones of the Barton Springs segment of the Edwards Aquifer. figshare.&lt;br /&gt;
http://dx.doi.org/10.6084/m9.figshare.1330145 Retrieved 16:56, Mar 10, 2015 (GMT)&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Workflow Sketch&lt;br /&gt;
&lt;br /&gt;
[[File:ConceptualWorkflowSketch_GPF.jpg]]&lt;/div&gt;</summary>
		<author><name>Suzanne</name></author>	</entry>

	<entry>
		<id>https://www.organicdatascience.org/gpf/index.php?title=File:ConceptualWorkflowSketch_GPF.jpg&amp;diff=11534</id>
		<title>File:ConceptualWorkflowSketch GPF.jpg</title>
		<link rel="alternate" type="text/html" href="https://www.organicdatascience.org/gpf/index.php?title=File:ConceptualWorkflowSketch_GPF.jpg&amp;diff=11534"/>
				<updated>2015-03-20T21:57:56Z</updated>
		
		<summary type="html">&lt;p&gt;Suzanne: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&lt;/div&gt;</summary>
		<author><name>Suzanne</name></author>	</entry>

	<entry>
		<id>https://www.organicdatascience.org/gpf/index.php?title=Document_provenance_of_results_by_Suzanne_Pierce&amp;diff=11533</id>
		<title>Document provenance of results by Suzanne Pierce</title>
		<link rel="alternate" type="text/html" href="https://www.organicdatascience.org/gpf/index.php?title=Document_provenance_of_results_by_Suzanne_Pierce&amp;diff=11533"/>
				<updated>2015-03-20T21:56:38Z</updated>
		
		<summary type="html">&lt;p&gt;Suzanne: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;[[Category:Task]]&lt;br /&gt;
&amp;lt;br/&amp;gt;&amp;lt;b&amp;gt;Details on how to do this task:&amp;lt;/b&amp;gt; [[Document the provenance of the results]]&amp;lt;br/&amp;gt;&amp;lt;br/&amp;gt;&lt;br /&gt;
&amp;lt;!-- Add any wiki Text above this Line --&amp;gt;&lt;br /&gt;
&amp;lt;!-- Do NOT Edit below this Line --&amp;gt;&lt;br /&gt;
{{#set:&lt;br /&gt;
	Expertise=Open_science|&lt;br /&gt;
	Expertise=Geosciences|&lt;br /&gt;
	Owner=Suzanne_Pierce|&lt;br /&gt;
	Progress=0|&lt;br /&gt;
	StartDate=2015-03-07|&lt;br /&gt;
	TargetDate=2015-03-20|&lt;br /&gt;
	Type=Low}}&lt;br /&gt;
&lt;br /&gt;
'''Selecting Data Sets for GPF and Tutorial:'''&lt;br /&gt;
&lt;br /&gt;
'''Observations on making data public'''&lt;br /&gt;
1) It is a challenge to select a dataset and assure that it is absolutely correct&lt;br /&gt;
2) It takes time to document a dataset, its source, and its context to be sure it is a standalone object.&lt;br /&gt;
3) Coordination among team members is needed to be sure that all the pieces are uploaded&lt;br /&gt;
4) There has to be a clear plan, with choices made about what is necessary&lt;br /&gt;
5) The bundling and project options on figshare should be useful (but I'm still trying to understand how they work)&lt;br /&gt;
6) The free level of access on figshare is very good, but the file size limits are an issue for many file sets (e.g. a set of candidate solutions over the 1GB limit) - we are re-structuring uploads and figshare units to accommodate them. We may need to subscribe to figshare at the appropriate level (= a long-term cost to the researcher to keep data up and available)&lt;br /&gt;
7) We already see benefits: we know exactly where a dataset is (one we used to have to search for all the time) - now it has a persistent place (easy to find)&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
'''Tutorial Files, links, workflow'''&lt;br /&gt;
&lt;br /&gt;
OSGeo4W [http://trac.osgeo.org/osgeo4w/]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
'''Publishing/Sharing datasets'''&lt;br /&gt;
&lt;br /&gt;
Citation:&lt;br /&gt;
Pierce, Suzanne (2015): Gridded Shapefile for the Recharge Zones of the Barton Springs segment of the Edwards Aquifer. figshare.&lt;br /&gt;
http://dx.doi.org/10.6084/m9.figshare.1330145 Retrieved 16:56, Mar 10, 2015 (GMT)&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Workflow Sketch&lt;br /&gt;
&lt;br /&gt;
[[File:ConceptualWorkflowSketch_GPF.jpg]]&lt;/div&gt;</summary>
		<author><name>Suzanne</name></author>	</entry>

	<entry>
		<id>https://www.organicdatascience.org/gpf/index.php?title=Write_about_each_author_s_motivation_to_participate&amp;diff=11479</id>
		<title>Write about each author s motivation to participate</title>
		<link rel="alternate" type="text/html" href="https://www.organicdatascience.org/gpf/index.php?title=Write_about_each_author_s_motivation_to_participate&amp;diff=11479"/>
				<updated>2015-03-13T21:24:05Z</updated>
		
		<summary type="html">&lt;p&gt;Suzanne: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;[[Category:Task]]&lt;br /&gt;
&amp;lt;!-- Add any wiki Text above this Line --&amp;gt;&lt;br /&gt;
&amp;lt;!-- Do NOT Edit below this Line --&amp;gt;&lt;br /&gt;
{{#set:&lt;br /&gt;
	Owner=Heath_Mills|&lt;br /&gt;
	Participants=Suzanne_Pierce|&lt;br /&gt;
	Type=Low}}&lt;br /&gt;
- Increase accessibility to research and reproducibility&lt;br /&gt;
&lt;br /&gt;
- Increase visibility of research and the likelihood of use by others&lt;br /&gt;
&lt;br /&gt;
- Career choice to make the effort to be at the cutting edge; GPF is leading the trend in publishing practice&lt;br /&gt;
&lt;br /&gt;
- Follow a set schedule and 'get it done' in a timely manner, with an external group to encourage and support&lt;br /&gt;
&lt;br /&gt;
- Learn more about the process and the types of sharing that are appropriate for a GPF&lt;br /&gt;
&lt;br /&gt;
- Sense of closing the loop and completing a task with good practices&lt;br /&gt;
&lt;br /&gt;
- We need to write articles for our jobs anyway, so doing the GPF helps me do it faster&lt;/div&gt;</summary>
		<author><name>Suzanne</name></author>	</entry>

	<entry>
		<id>https://www.organicdatascience.org/gpf/index.php?title=Write_about_each_author_s_motivation_to_participate&amp;diff=11476</id>
		<title>Write about each author s motivation to participate</title>
		<link rel="alternate" type="text/html" href="https://www.organicdatascience.org/gpf/index.php?title=Write_about_each_author_s_motivation_to_participate&amp;diff=11476"/>
				<updated>2015-03-13T21:18:54Z</updated>
		
		<summary type="html">&lt;p&gt;Suzanne: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;[[Category:Task]]&lt;br /&gt;
&amp;lt;!-- Add any wiki Text above this Line --&amp;gt;&lt;br /&gt;
&amp;lt;!-- Do NOT Edit below this Line --&amp;gt;&lt;br /&gt;
{{#set:&lt;br /&gt;
	Owner=Heath_Mills|&lt;br /&gt;
	Participants=Suzanne_Pierce|&lt;br /&gt;
	Type=Low}}&lt;br /&gt;
- Increase accessibility to research and reproducibility&lt;br /&gt;
&lt;br /&gt;
- Increase visibility of research and the likelihood of use by others&lt;br /&gt;
&lt;br /&gt;
- Career choice to make the effort to be at the cutting edge; GPF is leading the trend in publishing practice&lt;br /&gt;
&lt;br /&gt;
- Follow a set schedule and 'get it done' in a timely manner, with an external group to encourage and support&lt;br /&gt;
&lt;br /&gt;
- Learn more about the process and the types of sharing that are appropriate for a GPF&lt;br /&gt;
&lt;br /&gt;
- Sense of closing the loop and completing a task with good practices&lt;/div&gt;</summary>
		<author><name>Suzanne</name></author>	</entry>

	<entry>
		<id>https://www.organicdatascience.org/gpf/index.php?title=Write_about_each_author_s_motivation_to_participate&amp;diff=11473</id>
		<title>Write about each author s motivation to participate</title>
		<link rel="alternate" type="text/html" href="https://www.organicdatascience.org/gpf/index.php?title=Write_about_each_author_s_motivation_to_participate&amp;diff=11473"/>
				<updated>2015-03-13T21:16:26Z</updated>
		
		<summary type="html">&lt;p&gt;Suzanne: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;[[Category:Task]]&lt;br /&gt;
&amp;lt;!-- Add any wiki Text above this Line --&amp;gt;&lt;br /&gt;
&amp;lt;!-- Do NOT Edit below this Line --&amp;gt;&lt;br /&gt;
{{#set:&lt;br /&gt;
	Owner=Heath_Mills|&lt;br /&gt;
	Participants=Suzanne_Pierce|&lt;br /&gt;
	Type=Low}}&lt;br /&gt;
- Increase accessibility to research and reproducibility&lt;br /&gt;
&lt;br /&gt;
- Increase visibility of research and the likelihood of use by others&lt;br /&gt;
&lt;br /&gt;
- Career choice to make the effort to be at the cutting edge&lt;br /&gt;
&lt;br /&gt;
- Follow a set schedule and 'get it done' in a timely manner, with an external group to encourage and support&lt;br /&gt;
&lt;br /&gt;
- Learn more about the GPF process (sharing, etc.)&lt;br /&gt;
&lt;br /&gt;
- Sense of closing the loop and completing a task with good practices&lt;/div&gt;</summary>
		<author><name>Suzanne</name></author>	</entry>

	<entry>
		<id>https://www.organicdatascience.org/gpf/index.php?title=Write_about_each_author_s_motivation_to_participate&amp;diff=11468</id>
		<title>Write about each author s motivation to participate</title>
		<link rel="alternate" type="text/html" href="https://www.organicdatascience.org/gpf/index.php?title=Write_about_each_author_s_motivation_to_participate&amp;diff=11468"/>
				<updated>2015-03-13T21:13:38Z</updated>
		
		<summary type="html">&lt;p&gt;Suzanne: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;[[Category:Task]]&lt;br /&gt;
&amp;lt;!-- Add any wiki Text above this Line --&amp;gt;&lt;br /&gt;
&amp;lt;!-- Do NOT Edit below this Line --&amp;gt;&lt;br /&gt;
{{#set:&lt;br /&gt;
	Owner=Heath_Mills|&lt;br /&gt;
	Participants=Suzanne_Pierce|&lt;br /&gt;
	Type=Low}}&lt;br /&gt;
- Increase accessibility to research and reproducibility&lt;br /&gt;
- Increase visibility of research and the likelihood of use by others&lt;br /&gt;
- Career choice to make the effort to be at the cutting edge&lt;br /&gt;
- Follow a set schedule and 'get it done' in a timely manner, with an external group to encourage and support&lt;br /&gt;
- Learn more about the GPF process (sharing, etc.)&lt;br /&gt;
- Sense of closing the loop and completing a task with good practices&lt;/div&gt;</summary>
		<author><name>Suzanne</name></author>	</entry>

	<entry>
		<id>https://www.organicdatascience.org/gpf/index.php?title=Write_about_each_author_s_motivation_to_participate&amp;diff=11464</id>
		<title>Write about each author s motivation to participate</title>
		<link rel="alternate" type="text/html" href="https://www.organicdatascience.org/gpf/index.php?title=Write_about_each_author_s_motivation_to_participate&amp;diff=11464"/>
				<updated>2015-03-13T21:12:49Z</updated>
		
		<summary type="html">&lt;p&gt;Suzanne: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;[[Category:Task]]&lt;br /&gt;
&amp;lt;!-- Add any wiki Text above this Line --&amp;gt;&lt;br /&gt;
&amp;lt;!-- Do NOT Edit below this Line --&amp;gt;&lt;br /&gt;
{{#set:&lt;br /&gt;
	Owner=Heath_Mills|&lt;br /&gt;
	Participants=Suzanne_Pierce|&lt;br /&gt;
	Type=Low}}&lt;br /&gt;
- Increase accessibility to research&lt;br /&gt;
- Increase visibility of research and the likelihood of use by others&lt;br /&gt;
- Career choice to make the effort to be at the cutting edge&lt;br /&gt;
- Follow a set schedule and 'get it done' in a timely manner, with an external group to encourage and support&lt;br /&gt;
- Learn more about the GPF process (sharing, etc.)&lt;br /&gt;
-&lt;/div&gt;</summary>
		<author><name>Suzanne</name></author>	</entry>

	<entry>
		<id>https://www.organicdatascience.org/gpf/index.php?title=Write_about_each_author_s_motivation_to_participate&amp;diff=11458</id>
		<title>Write about each author s motivation to participate</title>
		<link rel="alternate" type="text/html" href="https://www.organicdatascience.org/gpf/index.php?title=Write_about_each_author_s_motivation_to_participate&amp;diff=11458"/>
				<updated>2015-03-13T21:06:57Z</updated>
		
		<summary type="html">&lt;p&gt;Suzanne: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;[[Category:Task]]&lt;br /&gt;
&amp;lt;!-- Add any wiki Text above this Line --&amp;gt;&lt;br /&gt;
&amp;lt;!-- Do NOT Edit below this Line --&amp;gt;&lt;br /&gt;
{{#set:&lt;br /&gt;
	Owner=Yolanda_Gil|&lt;br /&gt;
	Type=Low}}&lt;br /&gt;
Suzanne&lt;/div&gt;</summary>
		<author><name>Suzanne</name></author>	</entry>

	<entry>
		<id>https://www.organicdatascience.org/gpf/index.php?title=Develop_proposal_for_special_issue&amp;diff=11307</id>
		<title>Develop proposal for special issue</title>
		<link rel="alternate" type="text/html" href="https://www.organicdatascience.org/gpf/index.php?title=Develop_proposal_for_special_issue&amp;diff=11307"/>
				<updated>2015-03-13T16:56:14Z</updated>
		
		<summary type="html">&lt;p&gt;Suzanne: /* [Pierce 2015] */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;[[Category:Task]]&lt;br /&gt;
&lt;br /&gt;
== Background: Why a Special Issue on Geoscience Papers of the Future? ==&lt;br /&gt;
&lt;br /&gt;
[[Discuss_what_we_will_consider_a_GPF#The_Vision | Include here our discussion for the vision]]&lt;br /&gt;
&lt;br /&gt;
Background should be 1-2 pages.&lt;br /&gt;
&lt;br /&gt;
Motivated by the need to fully document research and make it accessible and reproducible. &lt;br /&gt;
&lt;br /&gt;
=== Motivation: The EarthCube Initiative and the GeoSoft Project ===&lt;br /&gt;
&lt;br /&gt;
[http://www.geosoft-earthcube.org/about Include here background about GeoSoft from the web site]&lt;br /&gt;
&lt;br /&gt;
OSTP memo.  EarthCube reports.&lt;br /&gt;
Other reports that talk about the need for new approaches to editing.&lt;br /&gt;
&lt;br /&gt;
It's possible that small or very large contributions are not well captured in the current publishing paradigms.  Nanopublications.&lt;br /&gt;
&lt;br /&gt;
For example, nano-publications are a possible way to reflect advances in a research process that may not merit a full publication but are nonetheless useful to share with the community. A challenge here is the stigma attached to publishing units that are very small.  &lt;br /&gt;
&lt;br /&gt;
Alternatively, a very large piece of research or work with many parts may be better suited to a GPF style publication.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Perhaps the concept of a 'paper' would be better reflected as a 'wrapper': a collection of materials and resources. The purpose is to assure that publications are representative of the work, effort, and results achieved in the research process.&lt;br /&gt;
&lt;br /&gt;
=== What is a GPF ===&lt;br /&gt;
&lt;br /&gt;
[[Discuss_what_we_will_consider_a_GPF#What_is_a_Geoscience_Paper_of_the_Future.3F | Include here our discussion of what is a GPF]]&lt;br /&gt;
&lt;br /&gt;
=== The challenges of creating GPFs ===&lt;br /&gt;
&lt;br /&gt;
The articles in this issue reflect the current best practice for generating a Geoscience Paper of the Future.&lt;br /&gt;
&lt;br /&gt;
[['''Figure discussions''']]  Do we want to reproduce exactly the same figure automatically?  Figures in the paper may be clean versions of an image generated by software.  To the extent possible, authors have included clear delineations of provenance. The goal is to assure that readers can regenerate the figures using documented workflows, data, and code.  An important note (Allen, Sandra) is that figures are frequently generated by code, scripts, etc., yet the actual figure is finalized by the user.....  Mimi is trying to say: is it really worth belaboring the point about how the prettified version of the figure is made? If it is: both of the visualization packages I've used (Matlab and SigmaPlot) have actual code in the background that specifies how to set up the prettification, and this code can be found, copied out, and rerun to generate the exact same figure with all of the prettification in the same place. SigmaPlot uses Visual Basic (I think) in its macros. If the point about explicit code is important, this should be doable. But I'm not sure it's strictly necessary to specify exactly where all the prettifications are to get the gist across.&lt;br /&gt;
&lt;br /&gt;
How much of your experimental history does one include?  (Ibrahim).  The experimental process often ends up nowhere.  Should we document all the failed experiments?  Get one DOI for the results of the successful experiment?  Another for failed trials?&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
'''''Documenting: Timing and Intermediate Processes'''''&lt;br /&gt;
When should we document, and what are the bounds on what we document?&lt;br /&gt;
For example, should we document and include data and workflows for 'failed' experiments? Or should we assign datasets DOIs before we know the results from using them?  &lt;br /&gt;
The group thinks that good practices may include documenting and sharing data once you have a clear understanding of the outcomes worth reporting. For example, successful experiments should have clear, clean data documented and shared, whereas one strategy for 'failed' experiments could be to bundle the intermediate datasets under one DOI with a more general discussion of the process/methods.&lt;br /&gt;
&lt;br /&gt;
=== Related work ===&lt;br /&gt;
&lt;br /&gt;
[[Discuss_what_we_will_consider_a_GPF#New_Frameworks_to_Create_a_New_Generation_of_Scientific_Articles | Include here the related work we have discussed]]&lt;br /&gt;
&lt;br /&gt;
== Papers to be included ==&lt;br /&gt;
&lt;br /&gt;
Would it be worthwhile to group the papers into broader categories rather than giving specifics about every single paper?&lt;br /&gt;
&lt;br /&gt;
For each submission, we describe:&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations'''&lt;br /&gt;
* '''Keywords of research area'''&lt;br /&gt;
* '''Tentative title'''&lt;br /&gt;
* '''Short abstract'''&lt;br /&gt;
* '''Relationship to other publications''' (is the article based on a previously published article? is it new content?)&lt;br /&gt;
* '''Pointer to the wiki page that documents the article'''&lt;br /&gt;
* '''Expected submission date'''&lt;br /&gt;
&lt;br /&gt;
=== [David 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:''' [[Cedric David]]&lt;br /&gt;
* '''Keywords of research area:'''&lt;br /&gt;
* '''Tentative title:'''&lt;br /&gt;
* '''Short abstract:'''&lt;br /&gt;
* '''Relationship to other publications:''' &lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Cedric_David | Page]]&lt;br /&gt;
* '''Expected submission date:'''&lt;br /&gt;
&lt;br /&gt;
=== [Demir 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:''' [[Ibrahim Demir]]&lt;br /&gt;
* '''Keywords of research area:'''&lt;br /&gt;
* '''Tentative title:'''&lt;br /&gt;
* '''Short abstract:'''&lt;br /&gt;
* '''Relationship to other publications:''' (is the article based on a previously published article? is it new content?)&lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Ibrahim_Demir | Page]]&lt;br /&gt;
* '''Expected submission date:'''&lt;br /&gt;
&lt;br /&gt;
=== [Fulweiler 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:''' [[Wally Fulweiler]]&lt;br /&gt;
* '''Keywords of research area:'''&lt;br /&gt;
* '''Tentative title:'''&lt;br /&gt;
* '''Short abstract:'''&lt;br /&gt;
* '''Relationship to other publications:''' (is the article based on a previously published article? is it new content?)&lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Wally_Fulweiler | Page]]&lt;br /&gt;
* '''Expected submission date:'''&lt;br /&gt;
&lt;br /&gt;
=== [Karlstrom and Lay 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:''' [[Leif Karlstrom]] and [[Lay Kuan Loh]]&lt;br /&gt;
* '''Keywords of research area:'''&lt;br /&gt;
* '''Tentative title:'''&lt;br /&gt;
* '''Short abstract:'''&lt;br /&gt;
* '''Relationship to other publications:''' (is the article based on a previously published article? is it new content?)&lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Leif_Karlstron | Page]]&lt;br /&gt;
* '''Expected submission date:'''&lt;br /&gt;
&lt;br /&gt;
=== [Lee 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:''' [[Kyo Lee]]&lt;br /&gt;
* '''Keywords of research area:'''&lt;br /&gt;
* '''Tentative title:'''&lt;br /&gt;
* '''Short abstract:'''&lt;br /&gt;
* '''Relationship to other publications:''' (is the article based on a previously published article? is it new content?)&lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Kyo_Lee | Page]]&lt;br /&gt;
* '''Expected submission date:'''&lt;br /&gt;
&lt;br /&gt;
=== [Miller 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:''' [[Kim Miller]]&lt;br /&gt;
* '''Keywords of research area:'''&lt;br /&gt;
* '''Tentative title:'''&lt;br /&gt;
* '''Short abstract:'''&lt;br /&gt;
* '''Relationship to other publications:''' (is the article based on a previously published article? is it new content?)&lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Kim_Miller | Page]]&lt;br /&gt;
* '''Expected submission date:'''&lt;br /&gt;
&lt;br /&gt;
=== [Mills 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:''' [[Heath Mills]]&lt;br /&gt;
* '''Keywords of research area:'''&lt;br /&gt;
* '''Tentative title:'''&lt;br /&gt;
* '''Short abstract:'''&lt;br /&gt;
* '''Relationship to other publications:''' (is the article based on a previously published article? is it new content?)&lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Heith_Mills | Page]]&lt;br /&gt;
* '''Expected submission date:'''&lt;br /&gt;
&lt;br /&gt;
=== [Oh 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:''' [[Ji-Hyun Oh]]&lt;br /&gt;
* '''Keywords of research area:'''&lt;br /&gt;
* '''Tentative title:'''&lt;br /&gt;
* '''Short abstract:'''&lt;br /&gt;
* '''Relationship to other publications:''' (is the article based on a previously published article? is it new content?)&lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Ji_Hyun | Page]]&lt;br /&gt;
* '''Expected submission date:'''&lt;br /&gt;
&lt;br /&gt;
=== [Pierce 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:''' [[Suzanne Pierce]]&lt;br /&gt;
* '''Keywords of research area:'''&lt;br /&gt;
* '''Tentative title:'''&lt;br /&gt;
* '''Short abstract:'''&lt;br /&gt;
* '''Challenge:''' Fully document a new software application and framework using example case study data and tutorials.&lt;br /&gt;
* '''Relationship to other publications:''' (is the article based on a previously published article? is it new content?)&lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Suzanne_Pierce | Page]]&lt;br /&gt;
* '''Expected submission date:'''&lt;br /&gt;
&lt;br /&gt;
=== [Pope 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:''' [[Allen Pope]]&lt;br /&gt;
* '''Keywords of research area:'''&lt;br /&gt;
* '''Tentative title:'''&lt;br /&gt;
* '''Short abstract:'''&lt;br /&gt;
* '''Relationship to other publications:''' (is the article based on a previously published article? is it new content?)&lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Allen_Pope | Page]]&lt;br /&gt;
* '''Expected submission date:'''&lt;br /&gt;
&lt;br /&gt;
=== [Read and Winslow 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:''' [[Jordan Read]] and [[Luke Winslow]]&lt;br /&gt;
* '''Keywords of research area:'''&lt;br /&gt;
* '''Tentative title:'''&lt;br /&gt;
* '''Short abstract:'''&lt;br /&gt;
* '''Relationship to other publications:''' (is the article based on a previously published article? is it new content?)&lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Jordan_Read | Page]]&lt;br /&gt;
* '''Expected submission date:'''&lt;br /&gt;
&lt;br /&gt;
=== [Tzeng 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:''' [[Mimi Tzeng]]&lt;br /&gt;
* '''Keywords of research area:'''&lt;br /&gt;
* '''Tentative title:'''&lt;br /&gt;
* '''Short abstract:'''&lt;br /&gt;
* '''Relationship to other publications:''' (is the article based on a previously published article? is it new content?)&lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Mimi_Tzeng | Page]]&lt;br /&gt;
* '''Expected submission date:'''&lt;br /&gt;
&lt;br /&gt;
=== [Villamizar 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:''' [[Sandra Villamizar]]&lt;br /&gt;
* '''Keywords of research area:'''&lt;br /&gt;
* '''Tentative title:'''&lt;br /&gt;
* '''Short abstract:'''&lt;br /&gt;
* '''Relationship to other publications:''' (is the article based on a previously published article? is it new content?)&lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Sandra_Villamizar | Page]]&lt;br /&gt;
* '''Expected submission date:'''&lt;br /&gt;
&lt;br /&gt;
=== [Yu 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:''' [[Xuan Yu]]&lt;br /&gt;
* '''Keywords of research area:'''&lt;br /&gt;
* '''Tentative title:'''&lt;br /&gt;
* '''Short abstract:'''&lt;br /&gt;
* '''Relationship to other publications:''' (is the article based on a previously published article? is it new content?)&lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Xuan_Yu | Page]]&lt;br /&gt;
* '''Expected submission date:'''&lt;br /&gt;
&lt;br /&gt;
== Special Issue Editors ==&lt;br /&gt;
&lt;br /&gt;
* Co-editor: &lt;br /&gt;
* Co-editor:&lt;br /&gt;
* Co-editor: &lt;br /&gt;
&lt;br /&gt;
The editors will only accept submissions that follow the [[Develop_proposal_for_special_issue#Special_Issue_Review_Criteria | special issue review criteria]].&lt;br /&gt;
&lt;br /&gt;
The editors will select a set of reviewers to handle the submissions.  Reviewers will include computer scientists, library scientists, and geoscientists.&lt;br /&gt;
&lt;br /&gt;
== Special Issue Review Criteria ==&lt;br /&gt;
&lt;br /&gt;
The reviewers will be asked to provide feedback on the papers according to the following criteria:&lt;br /&gt;
&lt;br /&gt;
* Documentation of the datasets: descriptions of datasets, unique identifiers, repositories.&lt;br /&gt;
* Documentation of software: description of all software used (including pre-processing of data, visualization steps, etc), unique identifiers, repositories.&lt;br /&gt;
* Documentation of the provenance of results: provenance for each figure or result, such as the workflow or the provenance record.&lt;br /&gt;
&lt;br /&gt;
== Tentative Timeline ==&lt;br /&gt;
&lt;br /&gt;
* Journal committed to special issue: April 15, 2015&lt;br /&gt;
* Submissions due to editors: June 30, 2015&lt;br /&gt;
* Reviews due: Sept 15, 2015&lt;br /&gt;
* Decisions out to authors: Sept 30, 2015&lt;br /&gt;
* Revisions due: October 31, 2015&lt;br /&gt;
* Final versions due November 15, 2015&lt;br /&gt;
* Issue published December 31, 2015&lt;br /&gt;
&lt;br /&gt;
&amp;lt;!-- Add any wiki Text above this Line --&amp;gt;&lt;br /&gt;
&amp;lt;!-- Do NOT Edit below this Line --&amp;gt;&lt;br /&gt;
{{#set:&lt;br /&gt;
	Owner=Yolanda_Gil|&lt;br /&gt;
	Participants=Xuan_Yu|&lt;br /&gt;
	Participants=Chris_Duffy|&lt;br /&gt;
	Participants=Scott_Peckham|&lt;br /&gt;
	Participants=Cedric_David|&lt;br /&gt;
	Participants=Ibrahim_Demir|&lt;br /&gt;
	Participants=Wally_Fulweiler|&lt;br /&gt;
	Participants=Leif_Karlstrom|&lt;br /&gt;
	Participants=Kyo_Lee|&lt;br /&gt;
	Participants=Kim_Miller|&lt;br /&gt;
	Participants=Heath_Mills|&lt;br /&gt;
	Participants=Ji-Hyun_Oh|&lt;br /&gt;
	Participants=Suzanne_Pierce|&lt;br /&gt;
	Participants=Allen_Pope|&lt;br /&gt;
	Participants=Jordan_Read|&lt;br /&gt;
	Participants=Mimi_Tzeng|&lt;br /&gt;
	Participants=Sandra_Villamizar|&lt;br /&gt;
	Progress=20|&lt;br /&gt;
	StartDate=2015-03-10|&lt;br /&gt;
	TargetDate=2015-03-16|&lt;br /&gt;
	Type=Low}}&lt;/div&gt;</summary>
		<author><name>Suzanne</name></author>	</entry>

	<entry>
		<id>https://www.organicdatascience.org/gpf/index.php?title=Develop_proposal_for_special_issue&amp;diff=11305</id>
		<title>Develop proposal for special issue</title>
		<link rel="alternate" type="text/html" href="https://www.organicdatascience.org/gpf/index.php?title=Develop_proposal_for_special_issue&amp;diff=11305"/>
				<updated>2015-03-13T16:44:15Z</updated>
		
		<summary type="html">&lt;p&gt;Suzanne: /* The challenges of creating GPFs */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;[[Category:Task]]&lt;br /&gt;
&lt;br /&gt;
== Background: Why a Special Issue on Geoscience Papers of the Future? ==&lt;br /&gt;
&lt;br /&gt;
[[Discuss_what_we_will_consider_a_GPF#The_Vision | Include here our discussion for the vision]]&lt;br /&gt;
&lt;br /&gt;
Background should be 1-2 pages.&lt;br /&gt;
&lt;br /&gt;
Motivated by the need to fully document research and make it accessible and reproducible. &lt;br /&gt;
&lt;br /&gt;
=== Motivation: The EarthCube Initiative and the GeoSoft Project ===&lt;br /&gt;
&lt;br /&gt;
[http://www.geosoft-earthcube.org/about Include here background about GeoSoft from the web site]&lt;br /&gt;
&lt;br /&gt;
OSTP memo.  EarthCube reports.&lt;br /&gt;
Other reports that talk about the need for new approaches to editing.&lt;br /&gt;
&lt;br /&gt;
It's possible that small or very large contributions are not well captured in the current publishing paradigms.  Nanopublications.&lt;br /&gt;
&lt;br /&gt;
For example, nano-publications are a possible way to reflect advances in a research process that may not merit a full publication but are nonetheless useful to share with the community. A challenge here is the stigma attached to publishing units that are very small.  &lt;br /&gt;
&lt;br /&gt;
Alternatively, a very large piece of research or work with many parts may be better suited to a GPF style publication.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Perhaps the concept of a 'paper' would be better reflected as a 'wrapper': a collection of materials and resources. The purpose is to assure that publications are representative of the work, effort, and results achieved in the research process.&lt;br /&gt;
&lt;br /&gt;
=== What is a GPF ===&lt;br /&gt;
&lt;br /&gt;
[[Discuss_what_we_will_consider_a_GPF#What_is_a_Geoscience_Paper_of_the_Future.3F | Include here our discussion of what is a GPF]]&lt;br /&gt;
&lt;br /&gt;
=== The challenges of creating GPFs ===&lt;br /&gt;
&lt;br /&gt;
The articles in this issue reflect the current best practice for generating a Geoscience Paper of the Future.&lt;br /&gt;
&lt;br /&gt;
[['''Figure discussions''']]  Do we want to reproduce exactly the same figure automatically?  Figures in the paper may be clean versions of an image generated by software.  To the extent possible, authors have included clear delineations of provenance. The goal is to assure that readers can regenerate the figures using documented workflows, data, and code.  An important note (Allen, Sandra) is that figures are frequently generated by code, scripts, etc., yet the actual figure is finalized by the user.....  Mimi is trying to say: is it really worth belaboring the point about how the prettified version of the figure is made? If it is: both of the visualization packages I've used (Matlab and SigmaPlot) have actual code in the background that specifies how to set up the prettification, and this code can be found, copied out, and rerun to generate the exact same figure with all of the prettification in the same place. SigmaPlot uses Visual Basic (I think) in its macros. If the point about explicit code is important, this should be doable. But I'm not sure it's strictly necessary to specify exactly where all the prettifications are to get the gist across.&lt;br /&gt;
&lt;br /&gt;
How much of your experimental history does one include?  (Ibrahim).  The experimental process often ends up nowhere.  Should we document all the failed experiments?  Get one DOI for the results of the successful experiment?  Another for failed trials?&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
'''''Documenting: Timing and Intermediate Processes'''''&lt;br /&gt;
When should we document, and what are the bounds on what we document?&lt;br /&gt;
For example, should we document and include data and workflows for 'failed' experiments? Or should we assign datasets DOIs before we know the results from using them?  &lt;br /&gt;
The group thinks that good practices may include documenting and sharing data once you have a clear understanding of the outcomes worth reporting. For example, successful experiments should have clear, clean data documented and shared, whereas one strategy for 'failed' experiments could be to bundle the intermediate datasets under one DOI with a more general discussion of the process/methods.&lt;br /&gt;
&lt;br /&gt;
=== Related work ===&lt;br /&gt;
&lt;br /&gt;
[[Discuss_what_we_will_consider_a_GPF#New_Frameworks_to_Create_a_New_Generation_of_Scientific_Articles | Include here the related work we have discussed]]&lt;br /&gt;
&lt;br /&gt;
== Papers to be included ==&lt;br /&gt;
&lt;br /&gt;
For each submission, we describe:&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations'''&lt;br /&gt;
* '''Keywords of research area'''&lt;br /&gt;
* '''Tentative title'''&lt;br /&gt;
* '''Short abstract'''&lt;br /&gt;
* '''Relationship to other publications''' (is the article based on a previously published article? is it new content?)&lt;br /&gt;
* '''Pointer to the wiki page that documents the article'''&lt;br /&gt;
* '''Expected submission date'''&lt;br /&gt;
&lt;br /&gt;
=== [David 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:''' [[Cedric David]]&lt;br /&gt;
* '''Keywords of research area:'''&lt;br /&gt;
* '''Tentative title:'''&lt;br /&gt;
* '''Short abstract:'''&lt;br /&gt;
* '''Relationship to other publications:''' &lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Cedric_David | Page]]&lt;br /&gt;
* '''Expected submission date:'''&lt;br /&gt;
&lt;br /&gt;
=== [Demir 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:''' [[Ibrahim Demir]]&lt;br /&gt;
* '''Keywords of research area:'''&lt;br /&gt;
* '''Tentative title:'''&lt;br /&gt;
* '''Short abstract:'''&lt;br /&gt;
* '''Relationship to other publications:''' (is the article based on a previously published article? is it new content?)&lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Ibrahim_Demir | Page]]&lt;br /&gt;
* '''Expected submission date:'''&lt;br /&gt;
&lt;br /&gt;
=== [Fulweiler 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:''' [[Wally Fulweiler]]&lt;br /&gt;
* '''Keywords of research area:'''&lt;br /&gt;
* '''Tentative title:'''&lt;br /&gt;
* '''Short abstract:'''&lt;br /&gt;
* '''Relationship to other publications:''' (is the article based on a previously published article? is it new content?)&lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Wally_Fulweiler | Page]]&lt;br /&gt;
* '''Expected submission date:'''&lt;br /&gt;
&lt;br /&gt;
=== [Karlstrom and Loh 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:''' [[Leif Karlstrom]] and [[Lay Kuan Loh]]&lt;br /&gt;
* '''Keywords of research area:'''&lt;br /&gt;
* '''Tentative title:'''&lt;br /&gt;
* '''Short abstract:'''&lt;br /&gt;
* '''Relationship to other publications:''' (is the article based on a previously published article? is it new content?)&lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Leif_Karlstron | Page]]&lt;br /&gt;
* '''Expected submission date:'''&lt;br /&gt;
&lt;br /&gt;
=== [Lee 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:''' [[Kyo Lee]]&lt;br /&gt;
* '''Keywords of research area:'''&lt;br /&gt;
* '''Tentative title:'''&lt;br /&gt;
* '''Short abstract:'''&lt;br /&gt;
* '''Relationship to other publications:''' (is the article based on a previously published article? is it new content?)&lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Kyo_Lee | Page]]&lt;br /&gt;
* '''Expected submission date:'''&lt;br /&gt;
&lt;br /&gt;
=== [Miller 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:''' [[Kim Miller]]&lt;br /&gt;
* '''Keywords of research area:'''&lt;br /&gt;
* '''Tentative title:'''&lt;br /&gt;
* '''Short abstract:'''&lt;br /&gt;
* '''Relationship to other publications:''' (is the article based on a previously published article? is it new content?)&lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Kim_Miller | Page]]&lt;br /&gt;
* '''Expected submission date:'''&lt;br /&gt;
&lt;br /&gt;
=== [Mills 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:''' [[Heath Mills]]&lt;br /&gt;
* '''Keywords of research area:'''&lt;br /&gt;
* '''Tentative title:'''&lt;br /&gt;
* '''Short abstract:'''&lt;br /&gt;
* '''Relationship to other publications:''' (is the article based on a previously published article? is it new content?)&lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Heith_Mills | Page]]&lt;br /&gt;
* '''Expected submission date:'''&lt;br /&gt;
&lt;br /&gt;
=== [Oh 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:''' [[Ji-Hyun Oh]]&lt;br /&gt;
* '''Keywords of research area:'''&lt;br /&gt;
* '''Tentative title:'''&lt;br /&gt;
* '''Short abstract:'''&lt;br /&gt;
* '''Relationship to other publications:''' (is the article based on a previously published article? is it new content?)&lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Ji_Hyun | Page]]&lt;br /&gt;
* '''Expected submission date:'''&lt;br /&gt;
&lt;br /&gt;
=== [Pierce 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:''' [[Suzanne Pierce]]&lt;br /&gt;
* '''Keywords of research area:'''&lt;br /&gt;
* '''Tentative title:'''&lt;br /&gt;
* '''Short abstract:'''&lt;br /&gt;
* '''Relationship to other publications:''' (is the article based on a previously published article? is it new content?)&lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Suzanne_Pierce | Page]]&lt;br /&gt;
* '''Expected submission date:'''&lt;br /&gt;
&lt;br /&gt;
=== [Pope 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:''' [[Allen Pope]]&lt;br /&gt;
* '''Keywords of research area:'''&lt;br /&gt;
* '''Tentative title:'''&lt;br /&gt;
* '''Short abstract:'''&lt;br /&gt;
* '''Relationship to other publications:''' (is the article based on a previously published article? is it new content?)&lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Allen_Pope | Page]]&lt;br /&gt;
* '''Expected submission date:'''&lt;br /&gt;
&lt;br /&gt;
=== [Read and Winslow 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:''' [[Jordan Read]] and [[Luke Winslow]]&lt;br /&gt;
* '''Keywords of research area:'''&lt;br /&gt;
* '''Tentative title:'''&lt;br /&gt;
* '''Short abstract:'''&lt;br /&gt;
* '''Relationship to other publications:''' (is the article based on a previously published article? is it new content?)&lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Jordan_Read | Page]]&lt;br /&gt;
* '''Expected submission date:'''&lt;br /&gt;
&lt;br /&gt;
=== [Tzeng 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:''' [[Mimi Tzeng]]&lt;br /&gt;
* '''Keywords of research area:'''&lt;br /&gt;
* '''Tentative title:'''&lt;br /&gt;
* '''Short abstract:'''&lt;br /&gt;
* '''Relationship to other publications:''' (is the article based on a previously published article? is it new content?)&lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Mimi_Tzeng | Page]]&lt;br /&gt;
* '''Expected submission date:'''&lt;br /&gt;
&lt;br /&gt;
=== [Villamizar 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:''' [[Sandra Villamizar]]&lt;br /&gt;
* '''Keywords of research area:'''&lt;br /&gt;
* '''Tentative title:'''&lt;br /&gt;
* '''Short abstract:'''&lt;br /&gt;
* '''Relationship to other publications:''' (is the article based on a previously published article? is it new content?)&lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Sandra_Villamizar | Page]]&lt;br /&gt;
* '''Expected submission date:'''&lt;br /&gt;
&lt;br /&gt;
=== [Yu 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:''' [[Xuan Yu]]&lt;br /&gt;
* '''Keywords of research area:'''&lt;br /&gt;
* '''Tentative title:'''&lt;br /&gt;
* '''Short abstract:'''&lt;br /&gt;
* '''Relationship to other publications:''' (is the article based on a previously published article? is it new content?)&lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Xuan_Yu | Page]]&lt;br /&gt;
* '''Expected submission date:'''&lt;br /&gt;
&lt;br /&gt;
== Special Issue Editors ==&lt;br /&gt;
&lt;br /&gt;
* Co-editor: &lt;br /&gt;
* Co-editor:&lt;br /&gt;
* Co-editor: &lt;br /&gt;
&lt;br /&gt;
The editors will only accept submissions that follow the [[Develop_proposal_for_special_issue#Special_Issue_Review_Criteria | special issue review criteria]].&lt;br /&gt;
&lt;br /&gt;
The editors will select a set of reviewers to handle the submissions.  Reviewers will include computer scientists, library scientists, and geoscientists.&lt;br /&gt;
&lt;br /&gt;
== Special Issue Review Criteria ==&lt;br /&gt;
&lt;br /&gt;
The reviewers will be asked to provide feedback on the papers according to the following criteria:&lt;br /&gt;
&lt;br /&gt;
* Documentation of the datasets: descriptions of datasets, unique identifiers, repositories.&lt;br /&gt;
* Documentation of software: description of all software used (including data pre-processing, visualization steps, etc.), unique identifiers, repositories.&lt;br /&gt;
* Documentation of the provenance of results: provenance for each figure or result, such as the workflow or the provenance record.&lt;br /&gt;
&lt;br /&gt;
== Tentative Timeline ==&lt;br /&gt;
&lt;br /&gt;
* Journal committed to special issue: April 15, 2015&lt;br /&gt;
* Submissions due to editors: June 30, 2015&lt;br /&gt;
* Reviews due: September 15, 2015&lt;br /&gt;
* Decisions out to authors: September 30, 2015&lt;br /&gt;
* Revisions due: October 31, 2015&lt;br /&gt;
* Final versions due: November 15, 2015&lt;br /&gt;
* Issue published: December 31, 2015&lt;br /&gt;
&lt;br /&gt;
&amp;lt;!-- Add any wiki Text above this Line --&amp;gt;&lt;br /&gt;
&amp;lt;!-- Do NOT Edit below this Line --&amp;gt;&lt;br /&gt;
{{#set:&lt;br /&gt;
	Owner=Yolanda_Gil|&lt;br /&gt;
	Participants=Xuan_Yu|&lt;br /&gt;
	Participants=Chris_Duffy|&lt;br /&gt;
	Participants=Scott_Peckham|&lt;br /&gt;
	Participants=Cedric_David|&lt;br /&gt;
	Participants=Ibrahim_Demir|&lt;br /&gt;
	Participants=Wally_Fulweiler|&lt;br /&gt;
	Participants=Leif_Karlstrom|&lt;br /&gt;
	Participants=Kyo_Lee|&lt;br /&gt;
	Participants=Kim_Miller|&lt;br /&gt;
	Participants=Heath_Mills|&lt;br /&gt;
	Participants=Ji-Hyun_Oh|&lt;br /&gt;
	Participants=Suzanne_Pierce|&lt;br /&gt;
	Participants=Allen_Pope|&lt;br /&gt;
	Participants=Jordan_Read|&lt;br /&gt;
	Participants=Mimi_Tzeng|&lt;br /&gt;
	Participants=Sandra_Villamizar|&lt;br /&gt;
	Progress=20|&lt;br /&gt;
	StartDate=2015-03-10|&lt;br /&gt;
	TargetDate=2015-03-16|&lt;br /&gt;
	Type=Low}}&lt;/div&gt;</summary>
		<author><name>Suzanne</name></author>	</entry>

	<entry>
		<id>https://www.organicdatascience.org/gpf/index.php?title=Develop_proposal_for_special_issue&amp;diff=11303</id>
		<title>Develop proposal for special issue</title>
		<link rel="alternate" type="text/html" href="https://www.organicdatascience.org/gpf/index.php?title=Develop_proposal_for_special_issue&amp;diff=11303"/>
				<updated>2015-03-13T16:42:53Z</updated>
		
		<summary type="html">&lt;p&gt;Suzanne: /* The challenges of creating GPFs */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;[[Category:Task]]&lt;br /&gt;
&lt;br /&gt;
== Background: Why a Special Issue on Geoscience Papers of the Future? ==&lt;br /&gt;
&lt;br /&gt;
[[Discuss_what_we_will_consider_a_GPF#The_Vision | Include here our discussion for the vision]]&lt;br /&gt;
&lt;br /&gt;
Motivated by the need to fully document research and make it accessible and reproducible. &lt;br /&gt;
&lt;br /&gt;
=== Motivation: The EarthCube Initiative and the GeoSoft Project ===&lt;br /&gt;
&lt;br /&gt;
[http://www.geosoft-earthcube.org/about Include here background about GeoSoft from the web site]&lt;br /&gt;
&lt;br /&gt;
OSTP memo.  EarthCube reports.&lt;br /&gt;
Other reports that talk about the need for new approaches to editing.&lt;br /&gt;
&lt;br /&gt;
It's possible that very small or very large contributions are not well captured in current publishing paradigms.  Nanopublications.&lt;br /&gt;
&lt;br /&gt;
For example, nanopublications are a possible way to reflect advances in a research process that may not merit a full publication but are nevertheless useful to share with the community. A challenge here is the stigma attached to publishing units that are very small.  &lt;br /&gt;
&lt;br /&gt;
Alternatively, a very large piece of research, or work with many parts, may be better suited to a GPF-style publication.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Perhaps the concept of a 'paper' is better captured as a 'wrapper': a collection of materials and resources. The purpose is to ensure that publications are representative of the work, effort, and results achieved in the research process.&lt;br /&gt;
&lt;br /&gt;
=== What is a GPF? ===&lt;br /&gt;
&lt;br /&gt;
[[Discuss_what_we_will_consider_a_GPF#What_is_a_Geoscience_Paper_of_the_Future.3F | Include here our discussion of what is a GPF]]&lt;br /&gt;
&lt;br /&gt;
=== The challenges of creating GPFs ===&lt;br /&gt;
&lt;br /&gt;
The articles in this issue reflect the current best practice for generating a Geoscience Paper of the Future.&lt;br /&gt;
&lt;br /&gt;
Figure discussions.  Do we want to regenerate exactly the same figure automatically?  Figures in the paper may be clean versions of an image generated by software.  To the extent possible, authors have included clear delineations of provenance. The goal is to ensure that readers can regenerate the figures using documented workflows, data, and code.  An important note (Allen, Sandra) is that figures are frequently generated by code, scripts, etc., yet the actual figure is often finalized with manual user adjustments.  Mimi's point: is it really worth belaboring how the prettified version of the figure is made? If it is: both of the visualization packages I've used (MATLAB and SigmaPlot) have actual code in the background that specifies the prettification, and this code can be found, copied out, and rerun to generate the exact same figure with all of the prettification in the same place. SigmaPlot uses Visual Basic (I think) in its macros. So if explicit code is the important point, this should be doable. But I'm not sure it's strictly necessary to specify exactly where all the prettifications are to get the gist across.&lt;br /&gt;
&lt;br /&gt;
How much of one's experimental history should be included? (Ibrahim)  Experiments often lead nowhere.  Should we document all the failed experiments?  Assign one DOI to the results of the successful experiment and another to the failed trials?&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Documenting: Timing and Intermediate Processes&lt;br /&gt;
When should we document and what are the bounds on what we document?&lt;br /&gt;
For example, should we document and include data and workflows for 'failed' experiments? Should we assign DOIs to datasets before we know the results of using them?  &lt;br /&gt;
The group's view is that good practice may be to document and share data once there is a clear understanding of which outcomes are worth reporting. For example, successful experiments should have clear, clean data documented and shared, whereas one strategy for 'failed' experiments could be to bundle the intermediate datasets under a single DOI together with a more general discussion of the process and methods.&lt;br /&gt;
&lt;br /&gt;
=== Related work ===&lt;br /&gt;
&lt;br /&gt;
[[Discuss_what_we_will_consider_a_GPF#New_Frameworks_to_Create_a_New_Generation_of_Scientific_Articles | Include here the related work we have discussed]]&lt;br /&gt;
&lt;br /&gt;
== Papers to be included ==&lt;br /&gt;
&lt;br /&gt;
For each submission, we describe:&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations'''&lt;br /&gt;
* '''Keywords of research area'''&lt;br /&gt;
* '''Tentative title'''&lt;br /&gt;
* '''Short abstract'''&lt;br /&gt;
* '''Relationship to other publications''' (is the article based on a previously published article? is it new content?)&lt;br /&gt;
* '''Pointer to the wiki page that documents the article'''&lt;br /&gt;
* '''Expected submission date'''&lt;br /&gt;
&lt;br /&gt;
=== [David 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:''' [[Cedric David]]&lt;br /&gt;
* '''Keywords of research area:'''&lt;br /&gt;
* '''Tentative title:'''&lt;br /&gt;
* '''Short abstract:'''&lt;br /&gt;
* '''Relationship to other publications:''' &lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Cedric_David | Page]]&lt;br /&gt;
* '''Expected submission date:'''&lt;br /&gt;
&lt;br /&gt;
=== [Demir 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:''' [[Ibrahim Demir]]&lt;br /&gt;
* '''Keywords of research area:'''&lt;br /&gt;
* '''Tentative title:'''&lt;br /&gt;
* '''Short abstract:'''&lt;br /&gt;
* '''Relationship to other publications:''' (is the article based on a previously published article? is it new content?)&lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Ibrahim_Demir | Page]]&lt;br /&gt;
* '''Expected submission date:'''&lt;br /&gt;
&lt;br /&gt;
=== [Fulweiler 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:''' [[Wally Fulweiler]]&lt;br /&gt;
* '''Keywords of research area:'''&lt;br /&gt;
* '''Tentative title:'''&lt;br /&gt;
* '''Short abstract:'''&lt;br /&gt;
* '''Relationship to other publications:''' (is the article based on a previously published article? is it new content?)&lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Wally_Fulweiler | Page]]&lt;br /&gt;
* '''Expected submission date:'''&lt;br /&gt;
&lt;br /&gt;
=== [Karlstrom and Loh 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:''' [[Leif Karlstrom]] and [[Lay Kuan Loh]]&lt;br /&gt;
* '''Keywords of research area:'''&lt;br /&gt;
* '''Tentative title:'''&lt;br /&gt;
* '''Short abstract:'''&lt;br /&gt;
* '''Relationship to other publications:''' (is the article based on a previously published article? is it new content?)&lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Leif_Karlstron | Page]]&lt;br /&gt;
* '''Expected submission date:'''&lt;br /&gt;
&lt;br /&gt;
=== [Lee 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:''' [[Kyo Lee]]&lt;br /&gt;
* '''Keywords of research area:'''&lt;br /&gt;
* '''Tentative title:'''&lt;br /&gt;
* '''Short abstract:'''&lt;br /&gt;
* '''Relationship to other publications:''' (is the article based on a previously published article? is it new content?)&lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Kyo_Lee | Page]]&lt;br /&gt;
* '''Expected submission date:'''&lt;br /&gt;
&lt;br /&gt;
=== [Miller 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:''' [[Kim Miller]]&lt;br /&gt;
* '''Keywords of research area:'''&lt;br /&gt;
* '''Tentative title:'''&lt;br /&gt;
* '''Short abstract:'''&lt;br /&gt;
* '''Relationship to other publications:''' (is the article based on a previously published article? is it new content?)&lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Kim_Miller | Page]]&lt;br /&gt;
* '''Expected submission date:'''&lt;br /&gt;
&lt;br /&gt;
=== [Mills 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:''' [[Heath Mills]]&lt;br /&gt;
* '''Keywords of research area:'''&lt;br /&gt;
* '''Tentative title:'''&lt;br /&gt;
* '''Short abstract:'''&lt;br /&gt;
* '''Relationship to other publications:''' (is the article based on a previously published article? is it new content?)&lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Heith_Mills | Page]]&lt;br /&gt;
* '''Expected submission date:'''&lt;br /&gt;
&lt;br /&gt;
=== [Oh 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:''' [[Ji-Hyun Oh]]&lt;br /&gt;
* '''Keywords of research area:'''&lt;br /&gt;
* '''Tentative title:'''&lt;br /&gt;
* '''Short abstract:'''&lt;br /&gt;
* '''Relationship to other publications:''' (is the article based on a previously published article? is it new content?)&lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Ji_Hyun | Page]]&lt;br /&gt;
* '''Expected submission date:'''&lt;br /&gt;
&lt;br /&gt;
=== [Pierce 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:''' [[Suzanne Pierce]]&lt;br /&gt;
* '''Keywords of research area:'''&lt;br /&gt;
* '''Tentative title:'''&lt;br /&gt;
* '''Short abstract:'''&lt;br /&gt;
* '''Relationship to other publications:''' (is the article based on a previously published article? is it new content?)&lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Suzanne_Pierce | Page]]&lt;br /&gt;
* '''Expected submission date:'''&lt;br /&gt;
&lt;br /&gt;
=== [Pope 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:''' [[Allen Pope]]&lt;br /&gt;
* '''Keywords of research area:'''&lt;br /&gt;
* '''Tentative title:'''&lt;br /&gt;
* '''Short abstract:'''&lt;br /&gt;
* '''Relationship to other publications:''' (is the article based on a previously published article? is it new content?)&lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Allen_Pope | Page]]&lt;br /&gt;
* '''Expected submission date:'''&lt;br /&gt;
&lt;br /&gt;
=== [Read and Winslow 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:''' [[Jordan Read]] and [[Luke Winslow]]&lt;br /&gt;
* '''Keywords of research area:'''&lt;br /&gt;
* '''Tentative title:'''&lt;br /&gt;
* '''Short abstract:'''&lt;br /&gt;
* '''Relationship to other publications:''' (is the article based on a previously published article? is it new content?)&lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Jordan_Read | Page]]&lt;br /&gt;
* '''Expected submission date:'''&lt;br /&gt;
&lt;br /&gt;
=== [Tzeng 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:''' [[Mimi Tzeng]]&lt;br /&gt;
* '''Keywords of research area:'''&lt;br /&gt;
* '''Tentative title:'''&lt;br /&gt;
* '''Short abstract:'''&lt;br /&gt;
* '''Relationship to other publications:''' (is the article based on a previously published article? is it new content?)&lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Mimi_Tzeng | Page]]&lt;br /&gt;
* '''Expected submission date:'''&lt;br /&gt;
&lt;br /&gt;
=== [Villamizar 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:''' [[Sandra Villamizar]]&lt;br /&gt;
* '''Keywords of research area:'''&lt;br /&gt;
* '''Tentative title:'''&lt;br /&gt;
* '''Short abstract:'''&lt;br /&gt;
* '''Relationship to other publications:''' (is the article based on a previously published article? is it new content?)&lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Sandra_Villamizar | Page]]&lt;br /&gt;
* '''Expected submission date:'''&lt;br /&gt;
&lt;br /&gt;
=== [Yu 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:''' [[Xuan Yu]]&lt;br /&gt;
* '''Keywords of research area:'''&lt;br /&gt;
* '''Tentative title:'''&lt;br /&gt;
* '''Short abstract:'''&lt;br /&gt;
* '''Relationship to other publications:''' (is the article based on a previously published article? is it new content?)&lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Xuan_Yu | Page]]&lt;br /&gt;
* '''Expected submission date:'''&lt;br /&gt;
&lt;br /&gt;
== Special Issue Editors ==&lt;br /&gt;
&lt;br /&gt;
* Co-editor: &lt;br /&gt;
* Co-editor:&lt;br /&gt;
* Co-editor: &lt;br /&gt;
&lt;br /&gt;
The editors will only accept submissions that follow the [[Develop_proposal_for_special_issue#Special_Issue_Review_Criteria | special issue review criteria]].&lt;br /&gt;
&lt;br /&gt;
The editors will select a set of reviewers to handle the submissions.  Reviewers will include computer scientists, library scientists, and geoscientists.&lt;br /&gt;
&lt;br /&gt;
== Special Issue Review Criteria ==&lt;br /&gt;
&lt;br /&gt;
The reviewers will be asked to provide feedback on the papers according to the following criteria:&lt;br /&gt;
&lt;br /&gt;
* Documentation of the datasets: descriptions of datasets, unique identifiers, repositories.&lt;br /&gt;
* Documentation of software: description of all software used (including data pre-processing, visualization steps, etc.), unique identifiers, repositories.&lt;br /&gt;
* Documentation of the provenance of results: provenance for each figure or result, such as the workflow or the provenance record.&lt;br /&gt;
&lt;br /&gt;
== Tentative Timeline ==&lt;br /&gt;
&lt;br /&gt;
* Journal committed to special issue: April 15, 2015&lt;br /&gt;
* Submissions due to editors: June 30, 2015&lt;br /&gt;
* Reviews due: September 15, 2015&lt;br /&gt;
* Decisions out to authors: September 30, 2015&lt;br /&gt;
* Revisions due: October 31, 2015&lt;br /&gt;
* Final versions due: November 15, 2015&lt;br /&gt;
* Issue published: December 31, 2015&lt;br /&gt;
&lt;br /&gt;
&amp;lt;!-- Add any wiki Text above this Line --&amp;gt;&lt;br /&gt;
&amp;lt;!-- Do NOT Edit below this Line --&amp;gt;&lt;br /&gt;
{{#set:&lt;br /&gt;
	Owner=Yolanda_Gil|&lt;br /&gt;
	Participants=Xuan_Yu|&lt;br /&gt;
	Participants=Chris_Duffy|&lt;br /&gt;
	Participants=Scott_Peckham|&lt;br /&gt;
	Participants=Cedric_David|&lt;br /&gt;
	Participants=Ibrahim_Demir|&lt;br /&gt;
	Participants=Wally_Fulweiler|&lt;br /&gt;
	Participants=Leif_Karlstrom|&lt;br /&gt;
	Participants=Kyo_Lee|&lt;br /&gt;
	Participants=Kim_Miller|&lt;br /&gt;
	Participants=Heath_Mills|&lt;br /&gt;
	Participants=Ji-Hyun_Oh|&lt;br /&gt;
	Participants=Suzanne_Pierce|&lt;br /&gt;
	Participants=Allen_Pope|&lt;br /&gt;
	Participants=Jordan_Read|&lt;br /&gt;
	Participants=Mimi_Tzeng|&lt;br /&gt;
	Participants=Sandra_Villamizar|&lt;br /&gt;
	Progress=20|&lt;br /&gt;
	StartDate=2015-03-10|&lt;br /&gt;
	TargetDate=2015-03-16|&lt;br /&gt;
	Type=Low}}&lt;/div&gt;</summary>
		<author><name>Suzanne</name></author>	</entry>

	<entry>
		<id>https://www.organicdatascience.org/gpf/index.php?title=Develop_proposal_for_special_issue&amp;diff=11297</id>
		<title>Develop proposal for special issue</title>
		<link rel="alternate" type="text/html" href="https://www.organicdatascience.org/gpf/index.php?title=Develop_proposal_for_special_issue&amp;diff=11297"/>
				<updated>2015-03-13T16:34:24Z</updated>
		
		<summary type="html">&lt;p&gt;Suzanne: /* Motivation: The EarthCube Initiative and the GeoSoft Project */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;[[Category:Task]]&lt;br /&gt;
&lt;br /&gt;
== Background: Why a Special Issue on Geoscience Papers of the Future? ==&lt;br /&gt;
&lt;br /&gt;
[[Discuss_what_we_will_consider_a_GPF#The_Vision | Include here our discussion for the vision]]&lt;br /&gt;
&lt;br /&gt;
Motivated by the need to fully document research and make it accessible and reproducible. &lt;br /&gt;
&lt;br /&gt;
=== Motivation: The EarthCube Initiative and the GeoSoft Project ===&lt;br /&gt;
&lt;br /&gt;
[http://www.geosoft-earthcube.org/about Include here background about GeoSoft from the web site]&lt;br /&gt;
&lt;br /&gt;
OSTP memo.  EarthCube reports.&lt;br /&gt;
Other reports that talk about the need for new approaches to editing.&lt;br /&gt;
&lt;br /&gt;
It's possible that very small or very large contributions are not well captured in current publishing paradigms.  Nanopublications.&lt;br /&gt;
&lt;br /&gt;
For example, nanopublications are a possible way to reflect advances in a research process that may not merit a full publication but are nevertheless useful to share with the community. A challenge here is the stigma attached to publishing units that are very small.  &lt;br /&gt;
&lt;br /&gt;
Alternatively, a very large piece of research, or work with many parts, may be better suited to a GPF-style publication.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Perhaps the concept of a 'paper' is better captured as a 'wrapper': a collection of materials and resources. The purpose is to ensure that publications are representative of the work, effort, and results achieved in the research process.&lt;br /&gt;
&lt;br /&gt;
=== What is a GPF? ===&lt;br /&gt;
&lt;br /&gt;
[[Discuss_what_we_will_consider_a_GPF#What_is_a_Geoscience_Paper_of_the_Future.3F | Include here our discussion of what is a GPF]]&lt;br /&gt;
&lt;br /&gt;
=== The challenges of creating GPFs ===&lt;br /&gt;
&lt;br /&gt;
The articles in this issue reflect the current best practice for generating a Geoscience Paper of the Future.&lt;br /&gt;
&lt;br /&gt;
Figures in the paper may be clean versions of an image generated by software.  To the extent possible, authors have included clear delineations of provenance. The goal is to ensure that readers can regenerate the figures using documented workflows, data, and code.  &lt;br /&gt;
&lt;br /&gt;
An important note (Allen, Sandra) is that figures are frequently generated by code, scripts, etc., yet the actual figure is often finalized with manual user adjustments.&lt;br /&gt;
&lt;br /&gt;
Mimi's point: is it really worth belaboring how the prettified version of the figure is made? If it is: both of the visualization packages I've used (MATLAB and SigmaPlot) have actual code in the background that specifies the prettification, and this code can be found, copied out, and rerun to generate the exact same figure with all of the prettification in the same place. SigmaPlot uses Visual Basic (I think) in its macros. So if explicit code is the important point, this should be doable. But I'm not sure it's strictly necessary to specify exactly where all the prettifications are to get the gist across.&lt;br /&gt;
&lt;br /&gt;
=== Related work ===&lt;br /&gt;
&lt;br /&gt;
[[Discuss_what_we_will_consider_a_GPF#New_Frameworks_to_Create_a_New_Generation_of_Scientific_Articles | Include here the related work we have discussed]]&lt;br /&gt;
&lt;br /&gt;
== Papers to be included ==&lt;br /&gt;
&lt;br /&gt;
For each submission, we describe:&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations'''&lt;br /&gt;
* '''Keywords of research area'''&lt;br /&gt;
* '''Tentative title'''&lt;br /&gt;
* '''Short abstract'''&lt;br /&gt;
* '''Relationship to other publications''' (is the article based on a previously published article? is it new content?)&lt;br /&gt;
* '''Pointer to the wiki page that documents the article'''&lt;br /&gt;
* '''Expected submission date'''&lt;br /&gt;
&lt;br /&gt;
=== [David 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:''' [[Cedric David]]&lt;br /&gt;
* '''Keywords of research area:'''&lt;br /&gt;
* '''Tentative title:'''&lt;br /&gt;
* '''Short abstract:'''&lt;br /&gt;
* '''Relationship to other publications:''' &lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Cedric_David | Page]]&lt;br /&gt;
* '''Expected submission date:'''&lt;br /&gt;
&lt;br /&gt;
=== [Demir 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:''' [[Ibrahim Demir]]&lt;br /&gt;
* '''Keywords of research area:'''&lt;br /&gt;
* '''Tentative title:'''&lt;br /&gt;
* '''Short abstract:'''&lt;br /&gt;
* '''Relationship to other publications:''' (is the article based on a previously published article? is it new content?)&lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Ibrahim_Demir | Page]]&lt;br /&gt;
* '''Expected submission date:'''&lt;br /&gt;
&lt;br /&gt;
=== [Fulweiler 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:''' [[Wally Fulweiler]]&lt;br /&gt;
* '''Keywords of research area:'''&lt;br /&gt;
* '''Tentative title:'''&lt;br /&gt;
* '''Short abstract:'''&lt;br /&gt;
* '''Relationship to other publications:''' (is the article based on a previously published article? is it new content?)&lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Wally_Fulweiler | Page]]&lt;br /&gt;
* '''Expected submission date:'''&lt;br /&gt;
&lt;br /&gt;
=== [Karlstrom and Loh 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:''' [[Leif Karlstrom]] and [[Lay Kuan Loh]]&lt;br /&gt;
* '''Keywords of research area:'''&lt;br /&gt;
* '''Tentative title:'''&lt;br /&gt;
* '''Short abstract:'''&lt;br /&gt;
* '''Relationship to other publications:''' (is the article based on a previously published article? is it new content?)&lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Leif_Karlstron | Page]]&lt;br /&gt;
* '''Expected submission date:'''&lt;br /&gt;
&lt;br /&gt;
=== [Lee 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:''' [[Kyo Lee]]&lt;br /&gt;
* '''Keywords of research area:'''&lt;br /&gt;
* '''Tentative title:'''&lt;br /&gt;
* '''Short abstract:'''&lt;br /&gt;
* '''Relationship to other publications:''' (is the article based on a previously published article? is it new content?)&lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Kyo_Lee | Page]]&lt;br /&gt;
* '''Expected submission date:'''&lt;br /&gt;
&lt;br /&gt;
=== [Miller 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:''' [[Kim Miller]]&lt;br /&gt;
* '''Keywords of research area:'''&lt;br /&gt;
* '''Tentative title:'''&lt;br /&gt;
* '''Short abstract:'''&lt;br /&gt;
* '''Relationship to other publications:''' (is the article based on a previously published article? is it new content?)&lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Kim_Miller | Page]]&lt;br /&gt;
* '''Expected submission date:'''&lt;br /&gt;
&lt;br /&gt;
=== [Mills 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:''' [[Heath Mills]]&lt;br /&gt;
* '''Keywords of research area:'''&lt;br /&gt;
* '''Tentative title:'''&lt;br /&gt;
* '''Short abstract:'''&lt;br /&gt;
* '''Relationship to other publications:''' (is the article based on a previously published article? is it new content?)&lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Heith_Mills | Page]]&lt;br /&gt;
* '''Expected submission date:'''&lt;br /&gt;
&lt;br /&gt;
=== [Oh 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:''' [[Ji-Hyun Oh]]&lt;br /&gt;
* '''Keywords of research area:'''&lt;br /&gt;
* '''Tentative title:'''&lt;br /&gt;
* '''Short abstract:'''&lt;br /&gt;
* '''Relationship to other publications:''' (is the article based on a previously published article? is it new content?)&lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Ji_Hyun | Page]]&lt;br /&gt;
* '''Expected submission date:'''&lt;br /&gt;
&lt;br /&gt;
=== [Pierce 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:''' [[Suzanne Pierce]]&lt;br /&gt;
* '''Keywords of research area:'''&lt;br /&gt;
* '''Tentative title:'''&lt;br /&gt;
* '''Short abstract:'''&lt;br /&gt;
* '''Relationship to other publications:''' (is the article based on a previously published article? is it new content?)&lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Suzanne_Pierce | Page]]&lt;br /&gt;
* '''Expected submission date:'''&lt;br /&gt;
&lt;br /&gt;
=== [Pope 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:''' [[Allen Pope]]&lt;br /&gt;
* '''Keywords of research area:'''&lt;br /&gt;
* '''Tentative title:'''&lt;br /&gt;
* '''Short abstract:'''&lt;br /&gt;
* '''Relationship to other publications:''' (is the article based on a previously published article? is it new content?)&lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Allen_Pope | Page]]&lt;br /&gt;
* '''Expected submission date:'''&lt;br /&gt;
&lt;br /&gt;
=== [Read and Winslow 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:''' [[Jordan Read]] and [[Luke Winslow]]&lt;br /&gt;
* '''Keywords of research area:'''&lt;br /&gt;
* '''Tentative title:'''&lt;br /&gt;
* '''Short abstract:'''&lt;br /&gt;
* '''Relationship to other publications:''' (is the article based on a previously published article? is it new content?)&lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Jordan_Read | Page]]&lt;br /&gt;
* '''Expected submission date:'''&lt;br /&gt;
&lt;br /&gt;
=== [Tzeng 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:''' [[Mimi Tzeng]]&lt;br /&gt;
* '''Keywords of research area:'''&lt;br /&gt;
* '''Tentative title:'''&lt;br /&gt;
* '''Short abstract:'''&lt;br /&gt;
* '''Relationship to other publications:''' (is the article based on a previously published article? is it new content?)&lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Mimi_Tzeng | Page]]&lt;br /&gt;
* '''Expected submission date:'''&lt;br /&gt;
&lt;br /&gt;
=== [Villamizar 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:''' [[Sandra Villamizar]]&lt;br /&gt;
* '''Keywords of research area:'''&lt;br /&gt;
* '''Tentative title:'''&lt;br /&gt;
* '''Short abstract:'''&lt;br /&gt;
* '''Relationship to other publications:''' (is the article based on a previously published article? is it new content?)&lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Sandra_Villamizar | Page]]&lt;br /&gt;
* '''Expected submission date:'''&lt;br /&gt;
&lt;br /&gt;
=== [Yu 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:''' [[Xuan Yu]]&lt;br /&gt;
* '''Keywords of research area:'''&lt;br /&gt;
* '''Tentative title:'''&lt;br /&gt;
* '''Short abstract:'''&lt;br /&gt;
* '''Relationship to other publications:''' (is the article based on a previously published article? is it new content?)&lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Xuan_Yu | Page]]&lt;br /&gt;
* '''Expected submission date:'''&lt;br /&gt;
&lt;br /&gt;
== Special Issue Editors ==&lt;br /&gt;
&lt;br /&gt;
* Co-editor: &lt;br /&gt;
* Co-editor:&lt;br /&gt;
* Co-editor: &lt;br /&gt;
&lt;br /&gt;
The editors will only accept submissions that satisfy the [[Develop_proposal_for_special_issue#Special_Issue_Review_Criteria | special issue review criteria]].&lt;br /&gt;
&lt;br /&gt;
The editors will select a set of reviewers to handle the submissions.  Reviewers will include computer scientists, library scientists, and geoscientists.&lt;br /&gt;
&lt;br /&gt;
== Special Issue Review Criteria ==&lt;br /&gt;
&lt;br /&gt;
The reviewers will be asked to provide feedback on the papers according to the following criteria:&lt;br /&gt;
&lt;br /&gt;
* Documentation of the datasets: descriptions of datasets, unique identifiers, repositories.&lt;br /&gt;
* Documentation of software: description of all software used (including pre-processing of data, visualization steps, etc.), unique identifiers, repositories.&lt;br /&gt;
* Documentation of the provenance of results: provenance for each figure or result, such as the workflow or the provenance record.&lt;br /&gt;
&lt;br /&gt;
== Tentative Timeline ==&lt;br /&gt;
&lt;br /&gt;
* Journal committed to special issue: April 15, 2015&lt;br /&gt;
* Submissions due to editors: June 30, 2015&lt;br /&gt;
* Reviews due: September 15, 2015&lt;br /&gt;
* Decisions out to authors: September 30, 2015&lt;br /&gt;
* Revisions due: October 31, 2015&lt;br /&gt;
* Final versions due: November 15, 2015&lt;br /&gt;
* Issue published: December 31, 2015&lt;br /&gt;
&lt;br /&gt;
&amp;lt;!-- Add any wiki Text above this Line --&amp;gt;&lt;br /&gt;
&amp;lt;!-- Do NOT Edit below this Line --&amp;gt;&lt;br /&gt;
{{#set:&lt;br /&gt;
	Owner=Yolanda_Gil|&lt;br /&gt;
	Participants=Xuan_Yu|&lt;br /&gt;
	Participants=Chris_Duffy|&lt;br /&gt;
	Participants=Scott_Peckham|&lt;br /&gt;
	Participants=Cedric_David|&lt;br /&gt;
	Participants=Ibrahim_Demir|&lt;br /&gt;
	Participants=Wally_Fulweiler|&lt;br /&gt;
	Participants=Leif_Karlstrom|&lt;br /&gt;
	Participants=Kyo_Lee|&lt;br /&gt;
	Participants=Kim_Miller|&lt;br /&gt;
	Participants=Heath_Mills|&lt;br /&gt;
	Participants=Ji-Hyun_Oh|&lt;br /&gt;
	Participants=Suzanne_Pierce|&lt;br /&gt;
	Participants=Allen_Pope|&lt;br /&gt;
	Participants=Jordan_Read|&lt;br /&gt;
	Participants=Mimi_Tzeng|&lt;br /&gt;
	Participants=Sandra_Villamizar|&lt;br /&gt;
	Progress=20|&lt;br /&gt;
	StartDate=2015-03-10|&lt;br /&gt;
	TargetDate=2015-03-16|&lt;br /&gt;
	Type=Low}}&lt;/div&gt;</summary>
		<author><name>Suzanne</name></author>	</entry>

	<entry>
		<id>https://www.organicdatascience.org/gpf/index.php?title=Develop_proposal_for_special_issue&amp;diff=11296</id>
		<title>Develop proposal for special issue</title>
		<link rel="alternate" type="text/html" href="https://www.organicdatascience.org/gpf/index.php?title=Develop_proposal_for_special_issue&amp;diff=11296"/>
				<updated>2015-03-13T16:33:46Z</updated>
		
		<summary type="html">&lt;p&gt;Suzanne: /* Motivation: The EarthCube Initiative and the GeoSoft Project */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;[[Category:Task]]&lt;br /&gt;
&lt;br /&gt;
== Background: Why a Special Issue on Geoscience Papers of the Future? ==&lt;br /&gt;
&lt;br /&gt;
[[Discuss_what_we_will_consider_a_GPF#The_Vision | Include here our discussion for the vision]]&lt;br /&gt;
&lt;br /&gt;
Motivated by the need to fully document research and to make it accessible and reproducible.&lt;br /&gt;
&lt;br /&gt;
=== Motivation: The EarthCube Initiative and the GeoSoft Project ===&lt;br /&gt;
&lt;br /&gt;
[http://www.geosoft-earthcube.org/about Include here background about GeoSoft from the web site]&lt;br /&gt;
&lt;br /&gt;
OSTP memo.  EarthCube reports.&lt;br /&gt;
Other reports that talk about the need for new approaches to editing.&lt;br /&gt;
&lt;br /&gt;
Contributions that are very small or very large may not be well captured by current publishing paradigms.&lt;br /&gt;
&lt;br /&gt;
For example, nanopublications are a possible way to capture advances in a research process that may not merit a full publication but are nonetheless useful to share with the community. One challenge is the stigma attached to publishing very small units of work.&lt;br /&gt;
&lt;br /&gt;
Alternatively, a very large piece of research or work with many parts may be better suited to a GPF-style publication.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Perhaps the concept of a 'paper' would be better expressed as a 'wrapper': a collection of materials and resources.&lt;br /&gt;
&lt;br /&gt;
=== What is a GPF ===&lt;br /&gt;
&lt;br /&gt;
[[Discuss_what_we_will_consider_a_GPF#What_is_a_Geoscience_Paper_of_the_Future.3F | Include here our discussion of what is a GPF]]&lt;br /&gt;
&lt;br /&gt;
=== The challenges of creating GPFs ===&lt;br /&gt;
&lt;br /&gt;
The articles in this issue reflect the current best practice for generating a Geoscience Paper of the Future.&lt;br /&gt;
&lt;br /&gt;
Figures in the paper may be cleaned-up versions of images generated by software.  To the extent possible, authors have included a clear delineation of provenance. The goal is to ensure that readers can regenerate the figures using the documented workflows, data, and code.&lt;br /&gt;
&lt;br /&gt;
An important note (Allen, Sandra) is that figures are frequently generated by code, scripts, etc., yet the final figure is often polished manually by the user.&lt;br /&gt;
&lt;br /&gt;
Mimi is trying to say: is it really worth belaboring the point about how the prettified version of the figure is made? If so: both of the visualization packages I've used (Matlab and SigmaPlot) keep actual code in the background that specifies the prettification, and this code can be found, copied out, and rerun to generate the exact same figure with all of the prettification in the same place. SigmaPlot uses Visual Basic (I think) in its macros. If explicit code is the important point, this should be doable. But I'm not sure it's strictly necessary to specify exactly where all the prettifications are to get the gist across.&lt;br /&gt;
&lt;br /&gt;
=== Related work ===&lt;br /&gt;
&lt;br /&gt;
[[Discuss_what_we_will_consider_a_GPF#New_Frameworks_to_Create_a_New_Generation_of_Scientific_Articles | Include here the related work we have discussed]]&lt;br /&gt;
&lt;br /&gt;
== Papers to be included ==&lt;br /&gt;
&lt;br /&gt;
For each submission, we describe:&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations'''&lt;br /&gt;
* '''Keywords of research area'''&lt;br /&gt;
* '''Tentative title'''&lt;br /&gt;
* '''Short abstract'''&lt;br /&gt;
* '''Relationship to other publications''' (is the article based on a previously published article? is it new content?)&lt;br /&gt;
* '''Pointer to the wiki page that documents the article'''&lt;br /&gt;
* '''Expected submission date'''&lt;br /&gt;
&lt;br /&gt;
=== [David 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:''' [[Cedric David]]&lt;br /&gt;
* '''Keywords of research area:'''&lt;br /&gt;
* '''Tentative title:'''&lt;br /&gt;
* '''Short abstract:'''&lt;br /&gt;
* '''Relationship to other publications:''' &lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Cedric_David | Page]]&lt;br /&gt;
* '''Expected submission date:'''&lt;br /&gt;
&lt;br /&gt;
=== [Demir 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:''' [[Ibrahim Demir]]&lt;br /&gt;
* '''Keywords of research area:'''&lt;br /&gt;
* '''Tentative title:'''&lt;br /&gt;
* '''Short abstract:'''&lt;br /&gt;
* '''Relationship to other publications:''' (is the article based on a previously published article? is it new content?)&lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Ibrahim_Demir | Page]]&lt;br /&gt;
* '''Expected submission date:'''&lt;br /&gt;
&lt;br /&gt;
=== [Fulweiler 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:''' [[Wally Fulweiler]]&lt;br /&gt;
* '''Keywords of research area:'''&lt;br /&gt;
* '''Tentative title:'''&lt;br /&gt;
* '''Short abstract:'''&lt;br /&gt;
* '''Relationship to other publications:''' (is the article based on a previously published article? is it new content?)&lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Wally_Fulweiler | Page]]&lt;br /&gt;
* '''Expected submission date:'''&lt;br /&gt;
&lt;br /&gt;
=== [Karlstrom and Loh 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:''' [[Leif Karlstrom]] and [[Lay Kuan Loh]]&lt;br /&gt;
* '''Keywords of research area:'''&lt;br /&gt;
* '''Tentative title:'''&lt;br /&gt;
* '''Short abstract:'''&lt;br /&gt;
* '''Relationship to other publications:''' (is the article based on a previously published article? is it new content?)&lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Leif_Karlstron | Page]]&lt;br /&gt;
* '''Expected submission date:'''&lt;br /&gt;
&lt;br /&gt;
=== [Lee 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:''' [[Kyo Lee]]&lt;br /&gt;
* '''Keywords of research area:'''&lt;br /&gt;
* '''Tentative title:'''&lt;br /&gt;
* '''Short abstract:'''&lt;br /&gt;
* '''Relationship to other publications:''' (is the article based on a previously published article? is it new content?)&lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Kyo_Lee | Page]]&lt;br /&gt;
* '''Expected submission date:'''&lt;br /&gt;
&lt;br /&gt;
=== [Miller 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:''' [[Kim Miller]]&lt;br /&gt;
* '''Keywords of research area:'''&lt;br /&gt;
* '''Tentative title:'''&lt;br /&gt;
* '''Short abstract:'''&lt;br /&gt;
* '''Relationship to other publications:''' (is the article based on a previously published article? is it new content?)&lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Kim_Miller | Page]]&lt;br /&gt;
* '''Expected submission date:'''&lt;br /&gt;
&lt;br /&gt;
=== [Mills 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:''' [[Heath Mills]]&lt;br /&gt;
* '''Keywords of research area:'''&lt;br /&gt;
* '''Tentative title:'''&lt;br /&gt;
* '''Short abstract:'''&lt;br /&gt;
* '''Relationship to other publications:''' (is the article based on a previously published article? is it new content?)&lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Heith_Mills | Page]]&lt;br /&gt;
* '''Expected submission date:'''&lt;br /&gt;
&lt;br /&gt;
=== [Oh 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:''' [[Ji-Hyun Oh]]&lt;br /&gt;
* '''Keywords of research area:'''&lt;br /&gt;
* '''Tentative title:'''&lt;br /&gt;
* '''Short abstract:'''&lt;br /&gt;
* '''Relationship to other publications:''' (is the article based on a previously published article? is it new content?)&lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Ji_Hyun | Page]]&lt;br /&gt;
* '''Expected submission date:'''&lt;br /&gt;
&lt;br /&gt;
=== [Pierce 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:''' [[Suzanne Pierce]]&lt;br /&gt;
* '''Keywords of research area:'''&lt;br /&gt;
* '''Tentative title:'''&lt;br /&gt;
* '''Short abstract:'''&lt;br /&gt;
* '''Relationship to other publications:''' (is the article based on a previously published article? is it new content?)&lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Suzanne_Pierce | Page]]&lt;br /&gt;
* '''Expected submission date:'''&lt;br /&gt;
&lt;br /&gt;
=== [Pope 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:''' [[Allen Pope]]&lt;br /&gt;
* '''Keywords of research area:'''&lt;br /&gt;
* '''Tentative title:'''&lt;br /&gt;
* '''Short abstract:'''&lt;br /&gt;
* '''Relationship to other publications:''' (is the article based on a previously published article? is it new content?)&lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Allen_Pope | Page]]&lt;br /&gt;
* '''Expected submission date:'''&lt;br /&gt;
&lt;br /&gt;
=== [Read and Winslow 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:''' [[Jordan Read]] and [[Luke Winslow]]&lt;br /&gt;
* '''Keywords of research area:'''&lt;br /&gt;
* '''Tentative title:'''&lt;br /&gt;
* '''Short abstract:'''&lt;br /&gt;
* '''Relationship to other publications:''' (is the article based on a previously published article? is it new content?)&lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Jordan_Read | Page]]&lt;br /&gt;
* '''Expected submission date:'''&lt;br /&gt;
&lt;br /&gt;
=== [Tzeng 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:''' [[Mimi Tzeng]]&lt;br /&gt;
* '''Keywords of research area:'''&lt;br /&gt;
* '''Tentative title:'''&lt;br /&gt;
* '''Short abstract:'''&lt;br /&gt;
* '''Relationship to other publications:''' (is the article based on a previously published article? is it new content?)&lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Mimi_Tzeng | Page]]&lt;br /&gt;
* '''Expected submission date:'''&lt;br /&gt;
&lt;br /&gt;
=== [Villamizar 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:''' [[Sandra Villamizar]]&lt;br /&gt;
* '''Keywords of research area:'''&lt;br /&gt;
* '''Tentative title:'''&lt;br /&gt;
* '''Short abstract:'''&lt;br /&gt;
* '''Relationship to other publications:''' (is the article based on a previously published article? is it new content?)&lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Sandra_Villamizar | Page]]&lt;br /&gt;
* '''Expected submission date:'''&lt;br /&gt;
&lt;br /&gt;
=== [Yu 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:''' [[Xuan Yu]]&lt;br /&gt;
* '''Keywords of research area:'''&lt;br /&gt;
* '''Tentative title:'''&lt;br /&gt;
* '''Short abstract:'''&lt;br /&gt;
* '''Relationship to other publications:''' (is the article based on a previously published article? is it new content?)&lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Xuan_Yu | Page]]&lt;br /&gt;
* '''Expected submission date:'''&lt;br /&gt;
&lt;br /&gt;
== Special Issue Editors ==&lt;br /&gt;
&lt;br /&gt;
* Co-editor: &lt;br /&gt;
* Co-editor:&lt;br /&gt;
* Co-editor: &lt;br /&gt;
&lt;br /&gt;
The editors will only accept submissions that satisfy the [[Develop_proposal_for_special_issue#Special_Issue_Review_Criteria | special issue review criteria]].&lt;br /&gt;
&lt;br /&gt;
The editors will select a set of reviewers to handle the submissions.  Reviewers will include computer scientists, library scientists, and geoscientists.&lt;br /&gt;
&lt;br /&gt;
== Special Issue Review Criteria ==&lt;br /&gt;
&lt;br /&gt;
The reviewers will be asked to provide feedback on the papers according to the following criteria:&lt;br /&gt;
&lt;br /&gt;
* Documentation of the datasets: descriptions of datasets, unique identifiers, repositories.&lt;br /&gt;
* Documentation of software: description of all software used (including pre-processing of data, visualization steps, etc.), unique identifiers, repositories.&lt;br /&gt;
* Documentation of the provenance of results: provenance for each figure or result, such as the workflow or the provenance record.&lt;br /&gt;
&lt;br /&gt;
== Tentative Timeline ==&lt;br /&gt;
&lt;br /&gt;
* Journal committed to special issue: April 15, 2015&lt;br /&gt;
* Submissions due to editors: June 30, 2015&lt;br /&gt;
* Reviews due: September 15, 2015&lt;br /&gt;
* Decisions out to authors: September 30, 2015&lt;br /&gt;
* Revisions due: October 31, 2015&lt;br /&gt;
* Final versions due: November 15, 2015&lt;br /&gt;
* Issue published: December 31, 2015&lt;br /&gt;
&lt;br /&gt;
&amp;lt;!-- Add any wiki Text above this Line --&amp;gt;&lt;br /&gt;
&amp;lt;!-- Do NOT Edit below this Line --&amp;gt;&lt;br /&gt;
{{#set:&lt;br /&gt;
	Owner=Yolanda_Gil|&lt;br /&gt;
	Participants=Xuan_Yu|&lt;br /&gt;
	Participants=Chris_Duffy|&lt;br /&gt;
	Participants=Scott_Peckham|&lt;br /&gt;
	Participants=Cedric_David|&lt;br /&gt;
	Participants=Ibrahim_Demir|&lt;br /&gt;
	Participants=Wally_Fulweiler|&lt;br /&gt;
	Participants=Leif_Karlstrom|&lt;br /&gt;
	Participants=Kyo_Lee|&lt;br /&gt;
	Participants=Kim_Miller|&lt;br /&gt;
	Participants=Heath_Mills|&lt;br /&gt;
	Participants=Ji-Hyun_Oh|&lt;br /&gt;
	Participants=Suzanne_Pierce|&lt;br /&gt;
	Participants=Allen_Pope|&lt;br /&gt;
	Participants=Jordan_Read|&lt;br /&gt;
	Participants=Mimi_Tzeng|&lt;br /&gt;
	Participants=Sandra_Villamizar|&lt;br /&gt;
	Progress=20|&lt;br /&gt;
	StartDate=2015-03-10|&lt;br /&gt;
	TargetDate=2015-03-16|&lt;br /&gt;
	Type=Low}}&lt;/div&gt;</summary>
		<author><name>Suzanne</name></author>	</entry>

	<entry>
		<id>https://www.organicdatascience.org/gpf/index.php?title=Develop_proposal_for_special_issue&amp;diff=11295</id>
		<title>Develop proposal for special issue</title>
		<link rel="alternate" type="text/html" href="https://www.organicdatascience.org/gpf/index.php?title=Develop_proposal_for_special_issue&amp;diff=11295"/>
				<updated>2015-03-13T16:32:12Z</updated>
		
		<summary type="html">&lt;p&gt;Suzanne: /* Motivation: The EarthCube Initiative and the GeoSoft Project */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;[[Category:Task]]&lt;br /&gt;
&lt;br /&gt;
== Background: Why a Special Issue on Geoscience Papers of the Future? ==&lt;br /&gt;
&lt;br /&gt;
[[Discuss_what_we_will_consider_a_GPF#The_Vision | Include here our discussion for the vision]]&lt;br /&gt;
&lt;br /&gt;
Motivated by the need to fully document research and to make it accessible and reproducible.&lt;br /&gt;
&lt;br /&gt;
=== Motivation: The EarthCube Initiative and the GeoSoft Project ===&lt;br /&gt;
&lt;br /&gt;
[http://www.geosoft-earthcube.org/about Include here background about GeoSoft from the web site]&lt;br /&gt;
&lt;br /&gt;
OSTP memo.  EarthCube reports.&lt;br /&gt;
Other reports that talk about the need for new approaches to editing.&lt;br /&gt;
&lt;br /&gt;
Contributions that are very small or very large may not be well captured by current publishing paradigms.&lt;br /&gt;
&lt;br /&gt;
For example, nanopublications are a possible way to capture advances in a research process that may not merit a full publication but are nonetheless useful to share with the community.&lt;br /&gt;
&lt;br /&gt;
Alternatively, a very large piece of research or work with many parts may be better suited to a GPF-style publication.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Perhaps the concept of a 'paper' would be better expressed as a 'wrapper': a collection of materials and resources.&lt;br /&gt;
&lt;br /&gt;
=== What is a GPF ===&lt;br /&gt;
&lt;br /&gt;
[[Discuss_what_we_will_consider_a_GPF#What_is_a_Geoscience_Paper_of_the_Future.3F | Include here our discussion of what is a GPF]]&lt;br /&gt;
&lt;br /&gt;
=== The challenges of creating GPFs ===&lt;br /&gt;
&lt;br /&gt;
The articles in this issue reflect the current best practice for generating a Geoscience Paper of the Future.&lt;br /&gt;
&lt;br /&gt;
Figures in the paper may be cleaned-up versions of images generated by software.  To the extent possible, authors have included a clear delineation of provenance. The goal is to ensure that readers can regenerate the figures using the documented workflows, data, and code.&lt;br /&gt;
&lt;br /&gt;
An important note (Allen, Sandra) is that figures are frequently generated by code, scripts, etc., yet the final figure is often polished manually by the user.&lt;br /&gt;
&lt;br /&gt;
Mimi is trying to say: is it really worth belaboring the point about how the prettified version of the figure is made? If so: both of the visualization packages I've used (Matlab and SigmaPlot) keep actual code in the background that specifies the prettification, and this code can be found, copied out, and rerun to generate the exact same figure with all of the prettification in the same place. SigmaPlot uses Visual Basic (I think) in its macros. If explicit code is the important point, this should be doable. But I'm not sure it's strictly necessary to specify exactly where all the prettifications are to get the gist across.&lt;br /&gt;
&lt;br /&gt;
=== Related work ===&lt;br /&gt;
&lt;br /&gt;
[[Discuss_what_we_will_consider_a_GPF#New_Frameworks_to_Create_a_New_Generation_of_Scientific_Articles | Include here the related work we have discussed]]&lt;br /&gt;
&lt;br /&gt;
== Papers to be included ==&lt;br /&gt;
&lt;br /&gt;
For each submission, we describe:&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations'''&lt;br /&gt;
* '''Keywords of research area'''&lt;br /&gt;
* '''Tentative title'''&lt;br /&gt;
* '''Short abstract'''&lt;br /&gt;
* '''Relationship to other publications''' (is the article based on a previously published article? is it new content?)&lt;br /&gt;
* '''Pointer to the wiki page that documents the article'''&lt;br /&gt;
* '''Expected submission date'''&lt;br /&gt;
&lt;br /&gt;
=== [David 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:''' [[Cedric David]]&lt;br /&gt;
* '''Keywords of research area:'''&lt;br /&gt;
* '''Tentative title:'''&lt;br /&gt;
* '''Short abstract:'''&lt;br /&gt;
* '''Relationship to other publications:''' &lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Cedric_David | Page]]&lt;br /&gt;
* '''Expected submission date:'''&lt;br /&gt;
&lt;br /&gt;
=== [Demir 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:''' [[Ibrahim Demir]]&lt;br /&gt;
* '''Keywords of research area:'''&lt;br /&gt;
* '''Tentative title:'''&lt;br /&gt;
* '''Short abstract:'''&lt;br /&gt;
* '''Relationship to other publications:''' (is the article based on a previously published article? is it new content?)&lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Ibrahim_Demir | Page]]&lt;br /&gt;
* '''Expected submission date:'''&lt;br /&gt;
&lt;br /&gt;
=== [Fulweiler 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:''' [[Wally Fulweiler]]&lt;br /&gt;
* '''Keywords of research area:'''&lt;br /&gt;
* '''Tentative title:'''&lt;br /&gt;
* '''Short abstract:'''&lt;br /&gt;
* '''Relationship to other publications:''' (is the article based on a previously published article? is it new content?)&lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Wally_Fulweiler | Page]]&lt;br /&gt;
* '''Expected submission date:'''&lt;br /&gt;
&lt;br /&gt;
=== [Karlstrom and Loh 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:''' [[Leif Karlstrom]] and [[Lay Kuan Loh]]&lt;br /&gt;
* '''Keywords of research area:'''&lt;br /&gt;
* '''Tentative title:'''&lt;br /&gt;
* '''Short abstract:'''&lt;br /&gt;
* '''Relationship to other publications:''' (is the article based on a previously published article? is it new content?)&lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Leif_Karlstron | Page]]&lt;br /&gt;
* '''Expected submission date:'''&lt;br /&gt;
&lt;br /&gt;
=== [Lee 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:''' [[Kyo Lee]]&lt;br /&gt;
* '''Keywords of research area:'''&lt;br /&gt;
* '''Tentative title:'''&lt;br /&gt;
* '''Short abstract:'''&lt;br /&gt;
* '''Relationship to other publications:''' (is the article based on a previously published article? is it new content?)&lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Kyo_Lee | Page]]&lt;br /&gt;
* '''Expected submission date:'''&lt;br /&gt;
&lt;br /&gt;
=== [Miller 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:''' [[Kim Miller]]&lt;br /&gt;
* '''Keywords of research area:'''&lt;br /&gt;
* '''Tentative title:'''&lt;br /&gt;
* '''Short abstract:'''&lt;br /&gt;
* '''Relationship to other publications:''' (is the article based on a previously published article? is it new content?)&lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Kim_Miller | Page]]&lt;br /&gt;
* '''Expected submission date:'''&lt;br /&gt;
&lt;br /&gt;
=== [Mills 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:''' [[Heath Mills]]&lt;br /&gt;
* '''Keywords of research area:'''&lt;br /&gt;
* '''Tentative title:'''&lt;br /&gt;
* '''Short abstract:'''&lt;br /&gt;
* '''Relationship to other publications:''' (is the article based on a previously published article? is it new content?)&lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Heith_Mills | Page]]&lt;br /&gt;
* '''Expected submission date:'''&lt;br /&gt;
&lt;br /&gt;
=== [Oh 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:''' [[Ji-Hyun Oh]]&lt;br /&gt;
* '''Keywords of research area:'''&lt;br /&gt;
* '''Tentative title:'''&lt;br /&gt;
* '''Short abstract:'''&lt;br /&gt;
* '''Relationship to other publications:''' (is the article based on a previously published article? is it new content?)&lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Ji_Hyun | Page]]&lt;br /&gt;
* '''Expected submission date:'''&lt;br /&gt;
&lt;br /&gt;
=== [Pierce 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:''' [[Suzanne Pierce]]&lt;br /&gt;
* '''Keywords of research area:'''&lt;br /&gt;
* '''Tentative title:'''&lt;br /&gt;
* '''Short abstract:'''&lt;br /&gt;
* '''Relationship to other publications:''' (is the article based on a previously published article? is it new content?)&lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Suzanne_Pierce | Page]]&lt;br /&gt;
* '''Expected submission date:'''&lt;br /&gt;
&lt;br /&gt;
=== [Pope 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:''' [[Allen Pope]]&lt;br /&gt;
* '''Keywords of research area:'''&lt;br /&gt;
* '''Tentative title:'''&lt;br /&gt;
* '''Short abstract:'''&lt;br /&gt;
* '''Relationship to other publications:''' (is the article based on a previously published article? is it new content?)&lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Allen_Pope | Page]]&lt;br /&gt;
* '''Expected submission date:'''&lt;br /&gt;
&lt;br /&gt;
=== [Read and Winslow 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:''' [[Jordan Read]] and [[Luke Winslow]]&lt;br /&gt;
* '''Keywords of research area:'''&lt;br /&gt;
* '''Tentative title:'''&lt;br /&gt;
* '''Short abstract:'''&lt;br /&gt;
* '''Relationship to other publications:''' (is the article based on a previously published article? is it new content?)&lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Jordan_Read | Page]]&lt;br /&gt;
* '''Expected submission date:'''&lt;br /&gt;
&lt;br /&gt;
=== [Tzeng 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:''' [[Mimi Tzeng]]&lt;br /&gt;
* '''Keywords of research area:'''&lt;br /&gt;
* '''Tentative title:'''&lt;br /&gt;
* '''Short abstract:'''&lt;br /&gt;
* '''Relationship to other publications:''' (is the article based on a previously published article? is it new content?)&lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Mimi_Tzeng | Page]]&lt;br /&gt;
* '''Expected submission date:'''&lt;br /&gt;
&lt;br /&gt;
=== [Villamizar 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:''' [[Sandra Villamizar]]&lt;br /&gt;
* '''Keywords of research area:'''&lt;br /&gt;
* '''Tentative title:'''&lt;br /&gt;
* '''Short abstract:'''&lt;br /&gt;
* '''Relationship to other publications:''' (is the article based on a previously published article? is it new content?)&lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Sandra_Villamizar | Page]]&lt;br /&gt;
* '''Expected submission date:'''&lt;br /&gt;
&lt;br /&gt;
=== [Yu 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:''' [[Xuan Yu]]&lt;br /&gt;
* '''Keywords of research area:'''&lt;br /&gt;
* '''Tentative title:'''&lt;br /&gt;
* '''Short abstract:'''&lt;br /&gt;
* '''Relationship to other publications:''' (is the article based on a previously published article? is it new content?)&lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Xuan_Yu | Page]]&lt;br /&gt;
* '''Expected submission date:'''&lt;br /&gt;
&lt;br /&gt;
== Special Issue Editors ==&lt;br /&gt;
&lt;br /&gt;
* Co-editor: &lt;br /&gt;
* Co-editor:&lt;br /&gt;
* Co-editor: &lt;br /&gt;
&lt;br /&gt;
The editors will only accept submissions that follow the [[Develop_proposal_for_special_issue#Special_Issue_Review_Criteria | special issue review criteria]].&lt;br /&gt;
&lt;br /&gt;
The editors will select a set of reviewers to handle the submissions.  Reviewers will include computer scientists, library scientists, and geoscientists.&lt;br /&gt;
&lt;br /&gt;
== Special Issue Review Criteria ==&lt;br /&gt;
&lt;br /&gt;
The reviewers will be asked to provide feedback on the papers according to the following criteria:&lt;br /&gt;
&lt;br /&gt;
* Documentation of the datasets: descriptions of datasets, unique identifiers, repositories.&lt;br /&gt;
* Documentation of software: description of all software used (including pre-processing of data, visualization steps, etc.), unique identifiers, repositories.&lt;br /&gt;
* Documentation of the provenance of results: provenance for each figure or result, such as the workflow or the provenance record.&lt;br /&gt;
&lt;br /&gt;
== Tentative Timeline ==&lt;br /&gt;
&lt;br /&gt;
* Journal committed to special issue: April 15, 2015&lt;br /&gt;
* Submissions due to editors: June 30, 2015&lt;br /&gt;
* Reviews due: Sept 15, 2015&lt;br /&gt;
* Decisions out to authors: Sept 30, 2015&lt;br /&gt;
* Revisions due: October 31, 2015&lt;br /&gt;
* Final versions due: November 15, 2015&lt;br /&gt;
* Issue published: December 31, 2015&lt;br /&gt;
&lt;br /&gt;
&amp;lt;!-- Add any wiki Text above this Line --&amp;gt;&lt;br /&gt;
&amp;lt;!-- Do NOT Edit below this Line --&amp;gt;&lt;br /&gt;
{{#set:&lt;br /&gt;
	Owner=Yolanda_Gil|&lt;br /&gt;
	Participants=Xuan_Yu|&lt;br /&gt;
	Participants=Chris_Duffy|&lt;br /&gt;
	Participants=Scott_Peckham|&lt;br /&gt;
	Participants=Cedric_David|&lt;br /&gt;
	Participants=Ibrahim_Demir|&lt;br /&gt;
	Participants=Wally_Fulweiler|&lt;br /&gt;
	Participants=Leif_Karlstrom|&lt;br /&gt;
	Participants=Kyo_Lee|&lt;br /&gt;
	Participants=Kim_Miller|&lt;br /&gt;
	Participants=Heath_Mills|&lt;br /&gt;
	Participants=Ji-Hyun_Oh|&lt;br /&gt;
	Participants=Suzanne_Pierce|&lt;br /&gt;
	Participants=Allen_Pope|&lt;br /&gt;
	Participants=Jordan_Read|&lt;br /&gt;
	Participants=Mimi_Tzeng|&lt;br /&gt;
	Participants=Sandra_Villamizar|&lt;br /&gt;
	Progress=20|&lt;br /&gt;
	StartDate=2015-03-10|&lt;br /&gt;
	TargetDate=2015-03-16|&lt;br /&gt;
	Type=Low}}&lt;/div&gt;</summary>
		<author><name>Suzanne</name></author>	</entry>

	<entry>
		<id>https://www.organicdatascience.org/gpf/index.php?title=Develop_proposal_for_special_issue&amp;diff=11293</id>
		<title>Develop proposal for special issue</title>
		<link rel="alternate" type="text/html" href="https://www.organicdatascience.org/gpf/index.php?title=Develop_proposal_for_special_issue&amp;diff=11293"/>
				<updated>2015-03-13T16:30:37Z</updated>
		
		<summary type="html">&lt;p&gt;Suzanne: /* Motivation: The EarthCube Initiative and the GeoSoft Project */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;[[Category:Task]]&lt;br /&gt;
&lt;br /&gt;
== Background: Why a Special Issue on Geoscience Papers of the Future? ==&lt;br /&gt;
&lt;br /&gt;
[[Discuss_what_we_will_consider_a_GPF#The_Vision | Include here our discussion for the vision]]&lt;br /&gt;
&lt;br /&gt;
Motivated by the need to fully document research and make it accessible and reproducible. &lt;br /&gt;
&lt;br /&gt;
=== Motivation: The EarthCube Initiative and the GeoSoft Project ===&lt;br /&gt;
&lt;br /&gt;
[http://www.geosoft-earthcube.org/about Include here background about GeoSoft from the web site]&lt;br /&gt;
&lt;br /&gt;
OSTP memo.  EarthCube reports.&lt;br /&gt;
Other reports that talk about the need for new approaches to editing.&lt;br /&gt;
&lt;br /&gt;
It's possible that small or very large contributions are not well captured by current publishing paradigms.  Nanopublications.&lt;br /&gt;
&lt;br /&gt;
For example, nanopublications are a possible way to reflect advances in a research process that may not merit a full publication but are nevertheless useful to share with the community.&lt;br /&gt;
&lt;br /&gt;
Alternatively, a very large piece of research, or work with many parts, may be better suited to a GPF-style publication.&lt;br /&gt;
&lt;br /&gt;
=== What is a GPF ===&lt;br /&gt;
&lt;br /&gt;
[[Discuss_what_we_will_consider_a_GPF#What_is_a_Geoscience_Paper_of_the_Future.3F | Include here our discussion of what is a GPF]]&lt;br /&gt;
&lt;br /&gt;
The articles in this issue reflect the current best practice for generating a Geoscience Paper of the Future.&lt;br /&gt;
&lt;br /&gt;
Figures in the paper may be cleaned-up versions of images generated by software.  To the extent possible, authors have included clear delineations of provenance. The goal is to ensure that readers can regenerate the figures using documented workflows, data, and code.  &lt;br /&gt;
&lt;br /&gt;
An important note (Allen, Sandra) is that figures are frequently generated by code, scripts, etc., yet the actual figure is finalized by the user...&lt;br /&gt;
&lt;br /&gt;
Mimi is trying to say: is it really worth belaboring the point about how the prettified version of the figure is made? If it is: both of the visualization software packages I've used (Matlab and SigmaPlot) have actual code in the background that specifies how to set up the prettification, and this code can be found, copied out, and rerun to generate the exact same figure with all of the prettification in the same place. SigmaPlot uses Visual Basic (I think) in its macros. If it is an important point about explicit code, this should be doable. But I'm not sure it's strictly necessary to specify exactly where all the prettifications are to get the gist across.&lt;br /&gt;
&lt;br /&gt;
=== Related work ===&lt;br /&gt;
&lt;br /&gt;
[[Discuss_what_we_will_consider_a_GPF#New_Frameworks_to_Create_a_New_Generation_of_Scientific_Articles | Include here the related work we have discussed]]&lt;br /&gt;
&lt;br /&gt;
== Papers to be included ==&lt;br /&gt;
&lt;br /&gt;
For each submission, we describe:&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations'''&lt;br /&gt;
* '''Keywords of research area'''&lt;br /&gt;
* '''Tentative title'''&lt;br /&gt;
* '''Short abstract'''&lt;br /&gt;
* '''Relationship to other publications''' (is the article based on a previously published article? is it new content?)&lt;br /&gt;
* '''Pointer to the wiki page that documents the article'''&lt;br /&gt;
* '''Expected submission date'''&lt;br /&gt;
&lt;br /&gt;
=== [David 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:''' [[Cedric David]]&lt;br /&gt;
* '''Keywords of research area:'''&lt;br /&gt;
* '''Tentative title:'''&lt;br /&gt;
* '''Short abstract:'''&lt;br /&gt;
* '''Relationship to other publications:''' &lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Cedric_David | Page]]&lt;br /&gt;
* '''Expected submission date:'''&lt;br /&gt;
&lt;br /&gt;
=== [Demir 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:''' [[Ibrahim Demir]]&lt;br /&gt;
* '''Keywords of research area:'''&lt;br /&gt;
* '''Tentative title:'''&lt;br /&gt;
* '''Short abstract:'''&lt;br /&gt;
* '''Relationship to other publications:''' (is the article based on a previously published article? is it new content?)&lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Ibrahim_Demir | Page]]&lt;br /&gt;
* '''Expected submission date:'''&lt;br /&gt;
&lt;br /&gt;
=== [Fulweiler 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:''' [[Wally Fulweiler]]&lt;br /&gt;
* '''Keywords of research area:'''&lt;br /&gt;
* '''Tentative title:'''&lt;br /&gt;
* '''Short abstract:'''&lt;br /&gt;
* '''Relationship to other publications:''' (is the article based on a previously published article? is it new content?)&lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Wally_Fulweiler | Page]]&lt;br /&gt;
* '''Expected submission date:'''&lt;br /&gt;
&lt;br /&gt;
=== [Karlstrom and Lay 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:''' [[Leif Karlstrom]] and [[Lay Kuan Loh]]&lt;br /&gt;
* '''Keywords of research area:'''&lt;br /&gt;
* '''Tentative title:'''&lt;br /&gt;
* '''Short abstract:'''&lt;br /&gt;
* '''Relationship to other publications:''' (is the article based on a previously published article? is it new content?)&lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Leif_Karlstron | Page]]&lt;br /&gt;
* '''Expected submission date:'''&lt;br /&gt;
&lt;br /&gt;
=== [Lee 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:''' [[Kyo Lee]]&lt;br /&gt;
* '''Keywords of research area:'''&lt;br /&gt;
* '''Tentative title:'''&lt;br /&gt;
* '''Short abstract:'''&lt;br /&gt;
* '''Relationship to other publications:''' (is the article based on a previously published article? is it new content?)&lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Kyo_Lee | Page]]&lt;br /&gt;
* '''Expected submission date:'''&lt;br /&gt;
&lt;br /&gt;
=== [Miller 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:''' [[Kim Miller]]&lt;br /&gt;
* '''Keywords of research area:'''&lt;br /&gt;
* '''Tentative title:'''&lt;br /&gt;
* '''Short abstract:'''&lt;br /&gt;
* '''Relationship to other publications:''' (is the article based on a previously published article? is it new content?)&lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Kim_Miller | Page]]&lt;br /&gt;
* '''Expected submission date:'''&lt;br /&gt;
&lt;br /&gt;
=== [Mills 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:''' [[Heath Mills]]&lt;br /&gt;
* '''Keywords of research area:'''&lt;br /&gt;
* '''Tentative title:'''&lt;br /&gt;
* '''Short abstract:'''&lt;br /&gt;
* '''Relationship to other publications:''' (is the article based on a previously published article? is it new content?)&lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Heith_Mills | Page]]&lt;br /&gt;
* '''Expected submission date:'''&lt;br /&gt;
&lt;br /&gt;
=== [Oh 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:''' [[Ji-Hyun Oh]]&lt;br /&gt;
* '''Keywords of research area:'''&lt;br /&gt;
* '''Tentative title:'''&lt;br /&gt;
* '''Short abstract:'''&lt;br /&gt;
* '''Relationship to other publications:''' (is the article based on a previously published article? is it new content?)&lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Ji_Hyun | Page]]&lt;br /&gt;
* '''Expected submission date:'''&lt;br /&gt;
&lt;br /&gt;
=== [Pierce 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:''' [[Suzanne Pierce]]&lt;br /&gt;
* '''Keywords of research area:'''&lt;br /&gt;
* '''Tentative title:'''&lt;br /&gt;
* '''Short abstract:'''&lt;br /&gt;
* '''Relationship to other publications:''' (is the article based on a previously published article? is it new content?)&lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Suzanne_Pierce | Page]]&lt;br /&gt;
* '''Expected submission date:'''&lt;br /&gt;
&lt;br /&gt;
=== [Pope 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:''' [[Allen Pope]]&lt;br /&gt;
* '''Keywords of research area:'''&lt;br /&gt;
* '''Tentative title:'''&lt;br /&gt;
* '''Short abstract:'''&lt;br /&gt;
* '''Relationship to other publications:''' (is the article based on a previously published article? is it new content?)&lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Allen_Pope | Page]]&lt;br /&gt;
* '''Expected submission date:'''&lt;br /&gt;
&lt;br /&gt;
=== [Read and Winslow 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:''' [[Jordan Read]] and [[Luke Winslow]]&lt;br /&gt;
* '''Keywords of research area:'''&lt;br /&gt;
* '''Tentative title:'''&lt;br /&gt;
* '''Short abstract:'''&lt;br /&gt;
* '''Relationship to other publications:''' (is the article based on a previously published article? is it new content?)&lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Jordan_Read | Page]]&lt;br /&gt;
* '''Expected submission date:'''&lt;br /&gt;
&lt;br /&gt;
=== [Tzeng 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:''' [[Mimi Tzeng]]&lt;br /&gt;
* '''Keywords of research area:'''&lt;br /&gt;
* '''Tentative title:'''&lt;br /&gt;
* '''Short abstract:'''&lt;br /&gt;
* '''Relationship to other publications:''' (is the article based on a previously published article? is it new content?)&lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Mimi_Tzeng | Page]]&lt;br /&gt;
* '''Expected submission date:'''&lt;br /&gt;
&lt;br /&gt;
=== [Villamizar 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:''' [[Sandra Villamizar]]&lt;br /&gt;
* '''Keywords of research area:'''&lt;br /&gt;
* '''Tentative title:'''&lt;br /&gt;
* '''Short abstract:'''&lt;br /&gt;
* '''Relationship to other publications:''' (is the article based on a previously published article? is it new content?)&lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Sandra_Villamizar | Page]]&lt;br /&gt;
* '''Expected submission date:'''&lt;br /&gt;
&lt;br /&gt;
=== [Yu 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:''' [[Xuan Yu]]&lt;br /&gt;
* '''Keywords of research area:'''&lt;br /&gt;
* '''Tentative title:'''&lt;br /&gt;
* '''Short abstract:'''&lt;br /&gt;
* '''Relationship to other publications:''' (is the article based on a previously published article? is it new content?)&lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Xuan_Yu | Page]]&lt;br /&gt;
* '''Expected submission date:'''&lt;br /&gt;
&lt;br /&gt;
== Special Issue Editors ==&lt;br /&gt;
&lt;br /&gt;
* Co-editor: &lt;br /&gt;
* Co-editor:&lt;br /&gt;
* Co-editor: &lt;br /&gt;
&lt;br /&gt;
The editors will only accept submissions that follow the [[Develop_proposal_for_special_issue#Special_Issue_Review_Criteria | special issue review criteria]].&lt;br /&gt;
&lt;br /&gt;
The editors will select a set of reviewers to handle the submissions.  Reviewers will include computer scientists, library scientists, and geoscientists.&lt;br /&gt;
&lt;br /&gt;
== Special Issue Review Criteria ==&lt;br /&gt;
&lt;br /&gt;
The reviewers will be asked to provide feedback on the papers according to the following criteria:&lt;br /&gt;
&lt;br /&gt;
* Documentation of the datasets: descriptions of datasets, unique identifiers, repositories.&lt;br /&gt;
* Documentation of software: description of all software used (including pre-processing of data, visualization steps, etc.), unique identifiers, repositories.&lt;br /&gt;
* Documentation of the provenance of results: provenance for each figure or result, such as the workflow or the provenance record.&lt;br /&gt;
&lt;br /&gt;
== Tentative Timeline ==&lt;br /&gt;
&lt;br /&gt;
* Journal committed to special issue: April 15, 2015&lt;br /&gt;
* Submissions due to editors: June 30, 2015&lt;br /&gt;
* Reviews due: Sept 15, 2015&lt;br /&gt;
* Decisions out to authors: Sept 30, 2015&lt;br /&gt;
* Revisions due: October 31, 2015&lt;br /&gt;
* Final versions due: November 15, 2015&lt;br /&gt;
* Issue published: December 31, 2015&lt;br /&gt;
&lt;br /&gt;
&amp;lt;!-- Add any wiki Text above this Line --&amp;gt;&lt;br /&gt;
&amp;lt;!-- Do NOT Edit below this Line --&amp;gt;&lt;br /&gt;
{{#set:&lt;br /&gt;
	Owner=Yolanda_Gil|&lt;br /&gt;
	Participants=Xuan_Yu|&lt;br /&gt;
	Participants=Chris_Duffy|&lt;br /&gt;
	Participants=Scott_Peckham|&lt;br /&gt;
	Participants=Cedric_David|&lt;br /&gt;
	Participants=Ibrahim_Demir|&lt;br /&gt;
	Participants=Wally_Fulweiler|&lt;br /&gt;
	Participants=Leif_Karlstrom|&lt;br /&gt;
	Participants=Kyo_Lee|&lt;br /&gt;
	Participants=Kim_Miller|&lt;br /&gt;
	Participants=Heath_Mills|&lt;br /&gt;
	Participants=Ji-Hyun_Oh|&lt;br /&gt;
	Participants=Suzanne_Pierce|&lt;br /&gt;
	Participants=Allen_Pope|&lt;br /&gt;
	Participants=Jordan_Read|&lt;br /&gt;
	Participants=Mimi_Tzeng|&lt;br /&gt;
	Participants=Sandra_Villamizar|&lt;br /&gt;
	Progress=20|&lt;br /&gt;
	StartDate=2015-03-10|&lt;br /&gt;
	TargetDate=2015-03-16|&lt;br /&gt;
	Type=Low}}&lt;/div&gt;</summary>
		<author><name>Suzanne</name></author>	</entry>

	<entry>
		<id>https://www.organicdatascience.org/gpf/index.php?title=Develop_proposal_for_special_issue&amp;diff=11292</id>
		<title>Develop proposal for special issue</title>
		<link rel="alternate" type="text/html" href="https://www.organicdatascience.org/gpf/index.php?title=Develop_proposal_for_special_issue&amp;diff=11292"/>
				<updated>2015-03-13T16:29:49Z</updated>
		
		<summary type="html">&lt;p&gt;Suzanne: /* Motivation: The EarthCube Initiative and the GeoSoft Project */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;[[Category:Task]]&lt;br /&gt;
&lt;br /&gt;
== Background: Why a Special Issue on Geoscience Papers of the Future? ==&lt;br /&gt;
&lt;br /&gt;
[[Discuss_what_we_will_consider_a_GPF#The_Vision | Include here our discussion for the vision]]&lt;br /&gt;
&lt;br /&gt;
Motivated by the need to fully document research and make it accessible and reproducible. &lt;br /&gt;
&lt;br /&gt;
=== Motivation: The EarthCube Initiative and the GeoSoft Project ===&lt;br /&gt;
&lt;br /&gt;
[http://www.geosoft-earthcube.org/about Include here background about GeoSoft from the web site]&lt;br /&gt;
&lt;br /&gt;
OSTP memo.  EarthCube reports.&lt;br /&gt;
Other reports that talk about the need for new approaches to editing.&lt;br /&gt;
&lt;br /&gt;
It's possible that small or very large contributions are not well captured by current publishing paradigms.  Nanopublications.&lt;br /&gt;
&lt;br /&gt;
For example, nanopublications are a possible way to reflect advances in a research process that may not merit a full publication but are nevertheless useful to share with the community.&lt;br /&gt;
&lt;br /&gt;
=== What is a GPF ===&lt;br /&gt;
&lt;br /&gt;
[[Discuss_what_we_will_consider_a_GPF#What_is_a_Geoscience_Paper_of_the_Future.3F | Include here our discussion of what is a GPF]]&lt;br /&gt;
&lt;br /&gt;
The articles in this issue reflect the current best practice for generating a Geoscience Paper of the Future.&lt;br /&gt;
&lt;br /&gt;
Figures in the paper may be cleaned-up versions of images generated by software.  To the extent possible, authors have included clear delineations of provenance. The goal is to ensure that readers can regenerate the figures using documented workflows, data, and code.  &lt;br /&gt;
&lt;br /&gt;
An important note (Allen, Sandra) is that figures are frequently generated by code, scripts, etc., yet the actual figure is finalized by the user...&lt;br /&gt;
&lt;br /&gt;
Mimi is trying to say: is it really worth belaboring the point about how the prettified version of the figure is made? If it is: both of the visualization software packages I've used (Matlab and SigmaPlot) have actual code in the background that specifies how to set up the prettification, and this code can be found, copied out, and rerun to generate the exact same figure with all of the prettification in the same place. SigmaPlot uses Visual Basic (I think) in its macros. If it is an important point about explicit code, this should be doable. But I'm not sure it's strictly necessary to specify exactly where all the prettifications are to get the gist across.&lt;br /&gt;
&lt;br /&gt;
=== Related work ===&lt;br /&gt;
&lt;br /&gt;
[[Discuss_what_we_will_consider_a_GPF#New_Frameworks_to_Create_a_New_Generation_of_Scientific_Articles | Include here the related work we have discussed]]&lt;br /&gt;
&lt;br /&gt;
== Papers to be included ==&lt;br /&gt;
&lt;br /&gt;
For each submission, we describe:&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations'''&lt;br /&gt;
* '''Keywords of research area'''&lt;br /&gt;
* '''Tentative title'''&lt;br /&gt;
* '''Short abstract'''&lt;br /&gt;
* '''Relationship to other publications''' (is the article based on a previously published article? is it new content?)&lt;br /&gt;
* '''Pointer to the wiki page that documents the article'''&lt;br /&gt;
* '''Expected submission date'''&lt;br /&gt;
&lt;br /&gt;
=== [David 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:''' [[Cedric David]]&lt;br /&gt;
* '''Keywords of research area:'''&lt;br /&gt;
* '''Tentative title:'''&lt;br /&gt;
* '''Short abstract:'''&lt;br /&gt;
* '''Relationship to other publications:''' &lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Cedric_David | Page]]&lt;br /&gt;
* '''Expected submission date:'''&lt;br /&gt;
&lt;br /&gt;
=== [Demir 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:''' [[Ibrahim Demir]]&lt;br /&gt;
* '''Keywords of research area:'''&lt;br /&gt;
* '''Tentative title:'''&lt;br /&gt;
* '''Short abstract:'''&lt;br /&gt;
* '''Relationship to other publications:''' (is the article based on a previously published article? is it new content?)&lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Ibrahim_Demir | Page]]&lt;br /&gt;
* '''Expected submission date:'''&lt;br /&gt;
&lt;br /&gt;
=== [Fulweiler 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:''' [[Wally Fulweiler]]&lt;br /&gt;
* '''Keywords of research area:'''&lt;br /&gt;
* '''Tentative title:'''&lt;br /&gt;
* '''Short abstract:'''&lt;br /&gt;
* '''Relationship to other publications:''' (is the article based on a previously published article? is it new content?)&lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Wally_Fulweiler | Page]]&lt;br /&gt;
* '''Expected submission date:'''&lt;br /&gt;
&lt;br /&gt;
=== [Karlstrom and Lay 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:''' [[Leif Karlstrom]] and [[Lay Kuan Loh]]&lt;br /&gt;
* '''Keywords of research area:'''&lt;br /&gt;
* '''Tentative title:'''&lt;br /&gt;
* '''Short abstract:'''&lt;br /&gt;
* '''Relationship to other publications:''' (is the article based on a previously published article? is it new content?)&lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Leif_Karlstron | Page]]&lt;br /&gt;
* '''Expected submission date:'''&lt;br /&gt;
&lt;br /&gt;
=== [Lee 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:''' [[Kyo Lee]]&lt;br /&gt;
* '''Keywords of research area:'''&lt;br /&gt;
* '''Tentative title:'''&lt;br /&gt;
* '''Short abstract:'''&lt;br /&gt;
* '''Relationship to other publications:''' (is the article based on a previously published article? is it new content?)&lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Kyo_Lee | Page]]&lt;br /&gt;
* '''Expected submission date:'''&lt;br /&gt;
&lt;br /&gt;
=== [Miller 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:''' [[Kim Miller]]&lt;br /&gt;
* '''Keywords of research area:'''&lt;br /&gt;
* '''Tentative title:'''&lt;br /&gt;
* '''Short abstract:'''&lt;br /&gt;
* '''Relationship to other publications:''' (is the article based on a previously published article? is it new content?)&lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Kim_Miller | Page]]&lt;br /&gt;
* '''Expected submission date:'''&lt;br /&gt;
&lt;br /&gt;
=== [Mills 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:''' [[Heath Mills]]&lt;br /&gt;
* '''Keywords of research area:'''&lt;br /&gt;
* '''Tentative title:'''&lt;br /&gt;
* '''Short abstract:'''&lt;br /&gt;
* '''Relationship to other publications:''' (is the article based on a previously published article? is it new content?)&lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Heith_Mills | Page]]&lt;br /&gt;
* '''Expected submission date:'''&lt;br /&gt;
&lt;br /&gt;
=== [Oh 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:''' [[Ji-Hyun Oh]]&lt;br /&gt;
* '''Keywords of research area:'''&lt;br /&gt;
* '''Tentative title:'''&lt;br /&gt;
* '''Short abstract:'''&lt;br /&gt;
* '''Relationship to other publications:''' (is the article based on a previously published article? is it new content?)&lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Ji_Hyun | Page]]&lt;br /&gt;
* '''Expected submission date:'''&lt;br /&gt;
&lt;br /&gt;
=== [Pierce 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:''' [[Suzanne Pierce]]&lt;br /&gt;
* '''Keywords of research area:'''&lt;br /&gt;
* '''Tentative title:'''&lt;br /&gt;
* '''Short abstract:'''&lt;br /&gt;
* '''Relationship to other publications:''' (is the article based on a previously published article? is it new content?)&lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Suzanne_Pierce | Page]]&lt;br /&gt;
* '''Expected submission date:'''&lt;br /&gt;
&lt;br /&gt;
=== [Pope 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:''' [[Allen Pope]]&lt;br /&gt;
* '''Keywords of research area:'''&lt;br /&gt;
* '''Tentative title:'''&lt;br /&gt;
* '''Short abstract:'''&lt;br /&gt;
* '''Relationship to other publications:''' (is the article based on a previously published article? is it new content?)&lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Allen_Pope | Page]]&lt;br /&gt;
* '''Expected submission date:'''&lt;br /&gt;
&lt;br /&gt;
=== [Read and Winslow 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:''' [[Jordan Read]] and [[Luke Winslow]]&lt;br /&gt;
* '''Keywords of research area:'''&lt;br /&gt;
* '''Tentative title:'''&lt;br /&gt;
* '''Short abstract:'''&lt;br /&gt;
* '''Relationship to other publications:''' (is the article based on a previously published article? is it new content?)&lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Jordan_Read | Page]]&lt;br /&gt;
* '''Expected submission date:'''&lt;br /&gt;
&lt;br /&gt;
=== [Tzeng 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:''' [[Mimi Tzeng]]&lt;br /&gt;
* '''Keywords of research area:'''&lt;br /&gt;
* '''Tentative title:'''&lt;br /&gt;
* '''Short abstract:'''&lt;br /&gt;
* '''Relationship to other publications:''' (is the article based on a previously published article? is it new content?)&lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Mimi_Tzeng | Page]]&lt;br /&gt;
* '''Expected submission date:'''&lt;br /&gt;
&lt;br /&gt;
=== [Villamizar 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:''' [[Sandra Villamizar]]&lt;br /&gt;
* '''Keywords of research area:'''&lt;br /&gt;
* '''Tentative title:'''&lt;br /&gt;
* '''Short abstract:'''&lt;br /&gt;
* '''Relationship to other publications:''' (is the article based on a previously published article? is it new content?)&lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Sandra_Villamizar | Page]]&lt;br /&gt;
* '''Expected submission date:'''&lt;br /&gt;
&lt;br /&gt;
=== [Yu 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:''' [[Xuan Yu]]&lt;br /&gt;
* '''Keywords of research area:'''&lt;br /&gt;
* '''Tentative title:'''&lt;br /&gt;
* '''Short abstract:'''&lt;br /&gt;
* '''Relationship to other publications:''' (is the article based on a previously published article? is it new content?)&lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Xuan_Yu | Page]]&lt;br /&gt;
* '''Expected submission date:'''&lt;br /&gt;
&lt;br /&gt;
== Special Issue Editors ==&lt;br /&gt;
&lt;br /&gt;
* Co-editor: &lt;br /&gt;
* Co-editor:&lt;br /&gt;
* Co-editor: &lt;br /&gt;
&lt;br /&gt;
The editors will only accept submissions that follow the [[Develop_proposal_for_special_issue#Special_Issue_Review_Criteria | special issue review criteria]].&lt;br /&gt;
&lt;br /&gt;
The editors will select a set of reviewers to handle the submissions.  Reviewers will include computer scientists, library scientists, and geoscientists.&lt;br /&gt;
&lt;br /&gt;
== Special Issue Review Criteria ==&lt;br /&gt;
&lt;br /&gt;
The reviewers will be asked to provide feedback on the papers according to the following criteria:&lt;br /&gt;
&lt;br /&gt;
* Documentation of the datasets: descriptions of datasets, unique identifiers, repositories.&lt;br /&gt;
* Documentation of software: description of all software used (including pre-processing of data, visualization steps, etc.), unique identifiers, repositories.&lt;br /&gt;
* Documentation of the provenance of results: provenance for each figure or result, such as the workflow or the provenance record.&lt;br /&gt;
&lt;br /&gt;
== Tentative Timeline ==&lt;br /&gt;
&lt;br /&gt;
* Journal committed to special issue: April 15, 2015&lt;br /&gt;
* Submissions due to editors: June 30, 2015&lt;br /&gt;
* Reviews due: September 15, 2015&lt;br /&gt;
* Decisions out to authors: September 30, 2015&lt;br /&gt;
* Revisions due: October 31, 2015&lt;br /&gt;
* Final versions due: November 15, 2015&lt;br /&gt;
* Issue published: December 31, 2015&lt;br /&gt;
&lt;br /&gt;
&amp;lt;!-- Add any wiki Text above this Line --&amp;gt;&lt;br /&gt;
&amp;lt;!-- Do NOT Edit below this Line --&amp;gt;&lt;br /&gt;
{{#set:&lt;br /&gt;
	Owner=Yolanda_Gil|&lt;br /&gt;
	Participants=Xuan_Yu|&lt;br /&gt;
	Participants=Chris_Duffy|&lt;br /&gt;
	Participants=Scott_Peckham|&lt;br /&gt;
	Participants=Cedric_David|&lt;br /&gt;
	Participants=Ibrahim_Demir|&lt;br /&gt;
	Participants=Wally_Fulweiler|&lt;br /&gt;
	Participants=Leif_Karlstrom|&lt;br /&gt;
	Participants=Kyo_Lee|&lt;br /&gt;
	Participants=Kim_Miller|&lt;br /&gt;
	Participants=Heath_Mills|&lt;br /&gt;
	Participants=Ji-Hyun_Oh|&lt;br /&gt;
	Participants=Suzanne_Pierce|&lt;br /&gt;
	Participants=Allen_Pope|&lt;br /&gt;
	Participants=Jordan_Read|&lt;br /&gt;
	Participants=Mimi_Tzeng|&lt;br /&gt;
	Participants=Sandra_Villamizar|&lt;br /&gt;
	Progress=20|&lt;br /&gt;
	StartDate=2015-03-10|&lt;br /&gt;
	TargetDate=2015-03-16|&lt;br /&gt;
	Type=Low}}&lt;/div&gt;</summary>
		<author><name>Suzanne</name></author>	</entry>

	<entry>
		<id>https://www.organicdatascience.org/gpf/index.php?title=Develop_proposal_for_special_issue&amp;diff=11289</id>
		<title>Develop proposal for special issue</title>
		<link rel="alternate" type="text/html" href="https://www.organicdatascience.org/gpf/index.php?title=Develop_proposal_for_special_issue&amp;diff=11289"/>
				<updated>2015-03-13T16:28:48Z</updated>
		
		<summary type="html">&lt;p&gt;Suzanne: /* Motivation: The EarthCube Initiative and the GeoSoft Project */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;[[Category:Task]]&lt;br /&gt;
&lt;br /&gt;
== Background: Why a Special Issue on Geoscience Papers of the Future? ==&lt;br /&gt;
&lt;br /&gt;
[[Discuss_what_we_will_consider_a_GPF#The_Vision | Include here our discussion for the vision]]&lt;br /&gt;
&lt;br /&gt;
Motivated by the need to fully document research and make it accessible and reproducible. &lt;br /&gt;
&lt;br /&gt;
=== Motivation: The EarthCube Initiative and the GeoSoft Project ===&lt;br /&gt;
&lt;br /&gt;
[http://www.geosoft-earthcube.org/about Include here background about GeoSoft from the web site]&lt;br /&gt;
&lt;br /&gt;
OSTP memo.  EarthCube reports.&lt;br /&gt;
Other reports that talk about the need for new approaches to editing.&lt;br /&gt;
&lt;br /&gt;
It is possible that small or very large contributions are not well captured by current publishing paradigms.&lt;br /&gt;
&lt;br /&gt;
=== What is a GPF ===&lt;br /&gt;
&lt;br /&gt;
[[Discuss_what_we_will_consider_a_GPF#What_is_a_Geoscience_Paper_of_the_Future.3F | Include here our discussion of what is a GPF]]&lt;br /&gt;
&lt;br /&gt;
The articles in this issue reflect the current best practice for generating a Geoscience Paper of the Future.&lt;br /&gt;
&lt;br /&gt;
Figures in the paper may be clean versions of images generated by software.  To the extent possible, authors have included clear delineations of provenance. The goal is to ensure that readers can regenerate the figures using the documented workflows, data, and code.&lt;br /&gt;
&lt;br /&gt;
An important note (Mimi) is that figures are frequently generated by code, scripts, etc., yet the final figure is often finished manually by the user.&lt;br /&gt;
&lt;br /&gt;
What Mimi is trying to say: is it really worth belaboring the point about how the prettified version of the figure is made? If it is: both of the visualization packages I've used (Matlab and SigmaPlot) keep actual code in the background that specifies the prettification, and this code can be found, copied out, and rerun to regenerate the exact same figure with all of the prettification in the same place. SigmaPlot uses Visual Basic (I think) in its macros. If explicit code is the important point, this should be doable. But I'm not sure it's strictly necessary to specify exactly where all the prettifications are to get the gist across.&lt;br /&gt;
&lt;br /&gt;
=== Related work ===&lt;br /&gt;
&lt;br /&gt;
[[Discuss_what_we_will_consider_a_GPF#New_Frameworks_to_Create_a_New_Generation_of_Scientific_Articles | Include here the related work we have discussed]]&lt;br /&gt;
&lt;br /&gt;
== Papers to be included ==&lt;br /&gt;
&lt;br /&gt;
For each submission, we describe:&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations'''&lt;br /&gt;
* '''Keywords of research area'''&lt;br /&gt;
* '''Tentative title'''&lt;br /&gt;
* '''Short abstract'''&lt;br /&gt;
* '''Relationship to other publications''' (is the article based on a previously published article? is it new content?)&lt;br /&gt;
* '''Pointer to the wiki page that documents the article'''&lt;br /&gt;
* '''Expected submission date'''&lt;br /&gt;
&lt;br /&gt;
=== [David 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:''' [[Cedric David]]&lt;br /&gt;
* '''Keywords of research area:'''&lt;br /&gt;
* '''Tentative title:'''&lt;br /&gt;
* '''Short abstract:'''&lt;br /&gt;
* '''Relationship to other publications:''' &lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Cedric_David | Page]]&lt;br /&gt;
* '''Expected submission date:'''&lt;br /&gt;
&lt;br /&gt;
=== [Demir 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:''' [[Ibrahim Demir]]&lt;br /&gt;
* '''Keywords of research area:'''&lt;br /&gt;
* '''Tentative title:'''&lt;br /&gt;
* '''Short abstract:'''&lt;br /&gt;
* '''Relationship to other publications:''' (is the article based on a previously published article? is it new content?)&lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Ibrahim_Demir | Page]]&lt;br /&gt;
* '''Expected submission date:'''&lt;br /&gt;
&lt;br /&gt;
=== [Fulweiler 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:''' [[Wally Fulweiler]]&lt;br /&gt;
* '''Keywords of research area:'''&lt;br /&gt;
* '''Tentative title:'''&lt;br /&gt;
* '''Short abstract:'''&lt;br /&gt;
* '''Relationship to other publications:''' (is the article based on a previously published article? is it new content?)&lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Wally_Fulweiler | Page]]&lt;br /&gt;
* '''Expected submission date:'''&lt;br /&gt;
&lt;br /&gt;
=== [Karlstrom and Lay 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:''' [[Leif Karlstrom]] and [[Lay Kuan Loh]]&lt;br /&gt;
* '''Keywords of research area:'''&lt;br /&gt;
* '''Tentative title:'''&lt;br /&gt;
* '''Short abstract:'''&lt;br /&gt;
* '''Relationship to other publications:''' (is the article based on a previously published article? is it new content?)&lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Leif_Karlstron | Page]]&lt;br /&gt;
* '''Expected submission date:'''&lt;br /&gt;
&lt;br /&gt;
=== [Lee 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:''' [[Kyo Lee]]&lt;br /&gt;
* '''Keywords of research area:'''&lt;br /&gt;
* '''Tentative title:'''&lt;br /&gt;
* '''Short abstract:'''&lt;br /&gt;
* '''Relationship to other publications:''' (is the article based on a previously published article? is it new content?)&lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Kyo_Lee | Page]]&lt;br /&gt;
* '''Expected submission date:'''&lt;br /&gt;
&lt;br /&gt;
=== [Miller 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:''' [[Kim Miller]]&lt;br /&gt;
* '''Keywords of research area:'''&lt;br /&gt;
* '''Tentative title:'''&lt;br /&gt;
* '''Short abstract:'''&lt;br /&gt;
* '''Relationship to other publications:''' (is the article based on a previously published article? is it new content?)&lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Kim_Miller | Page]]&lt;br /&gt;
* '''Expected submission date:'''&lt;br /&gt;
&lt;br /&gt;
=== [Mills 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:''' [[Heath Mills]]&lt;br /&gt;
* '''Keywords of research area:'''&lt;br /&gt;
* '''Tentative title:'''&lt;br /&gt;
* '''Short abstract:'''&lt;br /&gt;
* '''Relationship to other publications:''' (is the article based on a previously published article? is it new content?)&lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Heith_Mills | Page]]&lt;br /&gt;
* '''Expected submission date:'''&lt;br /&gt;
&lt;br /&gt;
=== [Oh 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:''' [[Ji-Hyun Oh]]&lt;br /&gt;
* '''Keywords of research area:'''&lt;br /&gt;
* '''Tentative title:'''&lt;br /&gt;
* '''Short abstract:'''&lt;br /&gt;
* '''Relationship to other publications:''' (is the article based on a previously published article? is it new content?)&lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Ji_Hyun | Page]]&lt;br /&gt;
* '''Expected submission date:'''&lt;br /&gt;
&lt;br /&gt;
=== [Pierce 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:''' [[Suzanne Pierce]]&lt;br /&gt;
* '''Keywords of research area:'''&lt;br /&gt;
* '''Tentative title:'''&lt;br /&gt;
* '''Short abstract:'''&lt;br /&gt;
* '''Relationship to other publications:''' (is the article based on a previously published article? is it new content?)&lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Suzanne_Pierce | Page]]&lt;br /&gt;
* '''Expected submission date:'''&lt;br /&gt;
&lt;br /&gt;
=== [Pope 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:''' [[Allen Pope]]&lt;br /&gt;
* '''Keywords of research area:'''&lt;br /&gt;
* '''Tentative title:'''&lt;br /&gt;
* '''Short abstract:'''&lt;br /&gt;
* '''Relationship to other publications:''' (is the article based on a previously published article? is it new content?)&lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Allen_Pope | Page]]&lt;br /&gt;
* '''Expected submission date:'''&lt;br /&gt;
&lt;br /&gt;
=== [Read and Winslow 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:''' [[Jordan Read]] and [[Luke Winslow]]&lt;br /&gt;
* '''Keywords of research area:'''&lt;br /&gt;
* '''Tentative title:'''&lt;br /&gt;
* '''Short abstract:'''&lt;br /&gt;
* '''Relationship to other publications:''' (is the article based on a previously published article? is it new content?)&lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Jordan_Read | Page]]&lt;br /&gt;
* '''Expected submission date:'''&lt;br /&gt;
&lt;br /&gt;
=== [Tzeng 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:''' [[Mimi Tzeng]]&lt;br /&gt;
* '''Keywords of research area:'''&lt;br /&gt;
* '''Tentative title:'''&lt;br /&gt;
* '''Short abstract:'''&lt;br /&gt;
* '''Relationship to other publications:''' (is the article based on a previously published article? is it new content?)&lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Mimi_Tzeng | Page]]&lt;br /&gt;
* '''Expected submission date:'''&lt;br /&gt;
&lt;br /&gt;
=== [Villamizar 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:''' [[Sandra Villamizar]]&lt;br /&gt;
* '''Keywords of research area:'''&lt;br /&gt;
* '''Tentative title:'''&lt;br /&gt;
* '''Short abstract:'''&lt;br /&gt;
* '''Relationship to other publications:''' (is the article based on a previously published article? is it new content?)&lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Sandra_Villamizar | Page]]&lt;br /&gt;
* '''Expected submission date:'''&lt;br /&gt;
&lt;br /&gt;
=== [Yu 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:''' [[Xuan Yu]]&lt;br /&gt;
* '''Keywords of research area:'''&lt;br /&gt;
* '''Tentative title:'''&lt;br /&gt;
* '''Short abstract:'''&lt;br /&gt;
* '''Relationship to other publications:''' (is the article based on a previously published article? is it new content?)&lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Xuan_Yu | Page]]&lt;br /&gt;
* '''Expected submission date:'''&lt;br /&gt;
&lt;br /&gt;
== Special Issue Editors ==&lt;br /&gt;
&lt;br /&gt;
* Co-editor: &lt;br /&gt;
* Co-editor:&lt;br /&gt;
* Co-editor: &lt;br /&gt;
&lt;br /&gt;
The editors will only accept submissions that follow the [[Develop_proposal_for_special_issue#Special_Issue_Review_Criteria | special issue review criteria]].&lt;br /&gt;
&lt;br /&gt;
The editors will select a set of reviewers to handle the submissions.  Reviewers will include computer scientists, library scientists, and geoscientists.&lt;br /&gt;
&lt;br /&gt;
== Special Issue Review Criteria ==&lt;br /&gt;
&lt;br /&gt;
The reviewers will be asked to provide feedback on the papers according to the following criteria:&lt;br /&gt;
&lt;br /&gt;
* Documentation of the datasets: descriptions of datasets, unique identifiers, repositories.&lt;br /&gt;
* Documentation of software: description of all software used (including pre-processing of data, visualization steps, etc.), unique identifiers, repositories.&lt;br /&gt;
* Documentation of the provenance of results: provenance for each figure or result, such as the workflow or the provenance record.&lt;br /&gt;
&lt;br /&gt;
== Tentative Timeline ==&lt;br /&gt;
&lt;br /&gt;
* Journal committed to special issue: April 15, 2015&lt;br /&gt;
* Submissions due to editors: June 30, 2015&lt;br /&gt;
* Reviews due: September 15, 2015&lt;br /&gt;
* Decisions out to authors: September 30, 2015&lt;br /&gt;
* Revisions due: October 31, 2015&lt;br /&gt;
* Final versions due: November 15, 2015&lt;br /&gt;
* Issue published: December 31, 2015&lt;br /&gt;
&lt;br /&gt;
&amp;lt;!-- Add any wiki Text above this Line --&amp;gt;&lt;br /&gt;
&amp;lt;!-- Do NOT Edit below this Line --&amp;gt;&lt;br /&gt;
{{#set:&lt;br /&gt;
	Owner=Yolanda_Gil|&lt;br /&gt;
	Participants=Xuan_Yu|&lt;br /&gt;
	Participants=Chris_Duffy|&lt;br /&gt;
	Participants=Scott_Peckham|&lt;br /&gt;
	Participants=Cedric_David|&lt;br /&gt;
	Participants=Ibrahim_Demir|&lt;br /&gt;
	Participants=Wally_Fulweiler|&lt;br /&gt;
	Participants=Leif_Karlstrom|&lt;br /&gt;
	Participants=Kyo_Lee|&lt;br /&gt;
	Participants=Kim_Miller|&lt;br /&gt;
	Participants=Heath_Mills|&lt;br /&gt;
	Participants=Ji-Hyun_Oh|&lt;br /&gt;
	Participants=Suzanne_Pierce|&lt;br /&gt;
	Participants=Allen_Pope|&lt;br /&gt;
	Participants=Jordan_Read|&lt;br /&gt;
	Participants=Mimi_Tzeng|&lt;br /&gt;
	Participants=Sandra_Villamizar|&lt;br /&gt;
	Progress=20|&lt;br /&gt;
	StartDate=2015-03-10|&lt;br /&gt;
	TargetDate=2015-03-16|&lt;br /&gt;
	Type=Low}}&lt;/div&gt;</summary>
		<author><name>Suzanne</name></author>	</entry>

	<entry>
		<id>https://www.organicdatascience.org/gpf/index.php?title=Develop_proposal_for_special_issue&amp;diff=11284</id>
		<title>Develop proposal for special issue</title>
		<link rel="alternate" type="text/html" href="https://www.organicdatascience.org/gpf/index.php?title=Develop_proposal_for_special_issue&amp;diff=11284"/>
				<updated>2015-03-13T16:22:10Z</updated>
		
		<summary type="html">&lt;p&gt;Suzanne: /* What is a GPF */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;[[Category:Task]]&lt;br /&gt;
&lt;br /&gt;
== Background: Why a Special Issue on Geoscience Papers of the Future? ==&lt;br /&gt;
&lt;br /&gt;
[[Discuss_what_we_will_consider_a_GPF#The_Vision | Include here our discussion for the vision]]&lt;br /&gt;
&lt;br /&gt;
Motivated by the need to fully document research and make it accessible and reproducible. &lt;br /&gt;
&lt;br /&gt;
=== Motivation: The EarthCube Initiative and the GeoSoft Project ===&lt;br /&gt;
&lt;br /&gt;
[http://www.geosoft-earthcube.org/about Include here background about GeoSoft from the web site]&lt;br /&gt;
&lt;br /&gt;
=== What is a GPF ===&lt;br /&gt;
&lt;br /&gt;
[[Discuss_what_we_will_consider_a_GPF#What_is_a_Geoscience_Paper_of_the_Future.3F | Include here our discussion of what is a GPF]]&lt;br /&gt;
&lt;br /&gt;
The articles in this issue reflect the current best practice for generating a Geoscience Paper of the Future.&lt;br /&gt;
&lt;br /&gt;
Figures in the paper may be clean versions of images generated by software.  To the extent possible, authors have included clear delineations of provenance. The goal is to ensure that readers can regenerate the figures using the documented workflows, data, and code.  An important note (Mimi) is that figures are frequently generated by code, scripts, etc., yet the final figure is often finished manually by the user.&lt;br /&gt;
&lt;br /&gt;
=== Related work ===&lt;br /&gt;
&lt;br /&gt;
[[Discuss_what_we_will_consider_a_GPF#New_Frameworks_to_Create_a_New_Generation_of_Scientific_Articles | Include here the related work we have discussed]]&lt;br /&gt;
&lt;br /&gt;
== Papers to be included ==&lt;br /&gt;
&lt;br /&gt;
For each submission, we describe:&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations'''&lt;br /&gt;
* '''Keywords of research area'''&lt;br /&gt;
* '''Tentative title'''&lt;br /&gt;
* '''Short abstract'''&lt;br /&gt;
* '''Relationship to other publications''' (is the article based on a previously published article? is it new content?)&lt;br /&gt;
* '''Pointer to the wiki page that documents the article'''&lt;br /&gt;
* '''Expected submission date'''&lt;br /&gt;
&lt;br /&gt;
=== [David 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:''' [[Cedric David]]&lt;br /&gt;
* '''Keywords of research area:'''&lt;br /&gt;
* '''Tentative title:'''&lt;br /&gt;
* '''Short abstract:'''&lt;br /&gt;
* '''Relationship to other publications:''' &lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Cedric_David | Page]]&lt;br /&gt;
* '''Expected submission date:'''&lt;br /&gt;
&lt;br /&gt;
=== [Demir 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:''' [[Ibrahim Demir]]&lt;br /&gt;
* '''Keywords of research area:'''&lt;br /&gt;
* '''Tentative title:'''&lt;br /&gt;
* '''Short abstract:'''&lt;br /&gt;
* '''Relationship to other publications:''' (is the article based on a previously published article? is it new content?)&lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Ibrahim_Demir | Page]]&lt;br /&gt;
* '''Expected submission date:'''&lt;br /&gt;
&lt;br /&gt;
=== [Fulweiler 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:''' [[Wally Fulweiler]]&lt;br /&gt;
* '''Keywords of research area:'''&lt;br /&gt;
* '''Tentative title:'''&lt;br /&gt;
* '''Short abstract:'''&lt;br /&gt;
* '''Relationship to other publications:''' (is the article based on a previously published article? is it new content?)&lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Wally_Fulweiler | Page]]&lt;br /&gt;
* '''Expected submission date:'''&lt;br /&gt;
&lt;br /&gt;
=== [Karlstrom and Lay 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:''' [[Leif Karlstrom]] and [[Lay Kuan Loh]]&lt;br /&gt;
* '''Keywords of research area:'''&lt;br /&gt;
* '''Tentative title:'''&lt;br /&gt;
* '''Short abstract:'''&lt;br /&gt;
* '''Relationship to other publications:''' (is the article based on a previously published article? is it new content?)&lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Leif_Karlstron | Page]]&lt;br /&gt;
* '''Expected submission date:'''&lt;br /&gt;
&lt;br /&gt;
=== [Lee 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:''' [[Kyo Lee]]&lt;br /&gt;
* '''Keywords of research area:'''&lt;br /&gt;
* '''Tentative title:'''&lt;br /&gt;
* '''Short abstract:'''&lt;br /&gt;
* '''Relationship to other publications:''' (is the article based on a previously published article? is it new content?)&lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Kyo_Lee | Page]]&lt;br /&gt;
* '''Expected submission date:'''&lt;br /&gt;
&lt;br /&gt;
=== [Miller 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:''' [[Kim Miller]]&lt;br /&gt;
* '''Keywords of research area:'''&lt;br /&gt;
* '''Tentative title:'''&lt;br /&gt;
* '''Short abstract:'''&lt;br /&gt;
* '''Relationship to other publications:''' (is the article based on a previously published article? is it new content?)&lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Kim_Miller | Page]]&lt;br /&gt;
* '''Expected submission date:'''&lt;br /&gt;
&lt;br /&gt;
=== [Mills 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:''' [[Heath Mills]]&lt;br /&gt;
* '''Keywords of research area:'''&lt;br /&gt;
* '''Tentative title:'''&lt;br /&gt;
* '''Short abstract:'''&lt;br /&gt;
* '''Relationship to other publications:''' (is the article based on a previously published article? is it new content?)&lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Heith_Mills | Page]]&lt;br /&gt;
* '''Expected submission date:'''&lt;br /&gt;
&lt;br /&gt;
=== [Oh 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:''' [[Ji-Hyun Oh]]&lt;br /&gt;
* '''Keywords of research area:'''&lt;br /&gt;
* '''Tentative title:'''&lt;br /&gt;
* '''Short abstract:'''&lt;br /&gt;
* '''Relationship to other publications:''' (is the article based on a previously published article? is it new content?)&lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Ji_Hyun | Page]]&lt;br /&gt;
* '''Expected submission date:'''&lt;br /&gt;
&lt;br /&gt;
=== [Pierce 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:''' [[Suzanne Pierce]]&lt;br /&gt;
* '''Keywords of research area:'''&lt;br /&gt;
* '''Tentative title:'''&lt;br /&gt;
* '''Short abstract:'''&lt;br /&gt;
* '''Relationship to other publications:''' (is the article based on a previously published article? is it new content?)&lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Suzanne_Pierce | Page]]&lt;br /&gt;
* '''Expected submission date:'''&lt;br /&gt;
&lt;br /&gt;
=== [Pope 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:''' [[Allen Pope]]&lt;br /&gt;
* '''Keywords of research area:'''&lt;br /&gt;
* '''Tentative title:'''&lt;br /&gt;
* '''Short abstract:'''&lt;br /&gt;
* '''Relationship to other publications:''' (is the article based on a previously published article? is it new content?)&lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Allen_Pope | Page]]&lt;br /&gt;
* '''Expected submission date:'''&lt;br /&gt;
&lt;br /&gt;
=== [Read and Winslow 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:''' [[Jordan Read]] and [[Luke Winslow]]&lt;br /&gt;
* '''Keywords of research area:'''&lt;br /&gt;
* '''Tentative title:'''&lt;br /&gt;
* '''Short abstract:'''&lt;br /&gt;
* '''Relationship to other publications:''' (is the article based on a previously published article? is it new content?)&lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Jordan_Read | Page]]&lt;br /&gt;
* '''Expected submission date:'''&lt;br /&gt;
&lt;br /&gt;
=== [Tzeng 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:''' [[Mimi Tzeng]]&lt;br /&gt;
* '''Keywords of research area:'''&lt;br /&gt;
* '''Tentative title:'''&lt;br /&gt;
* '''Short abstract:'''&lt;br /&gt;
* '''Relationship to other publications:''' (is the article based on a previously published article? is it new content?)&lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Mimi_Tzeng | Page]]&lt;br /&gt;
* '''Expected submission date:'''&lt;br /&gt;
&lt;br /&gt;
=== [Villamizar 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:''' [[Sandra Villamizar]]&lt;br /&gt;
* '''Keywords of research area:'''&lt;br /&gt;
* '''Tentative title:'''&lt;br /&gt;
* '''Short abstract:'''&lt;br /&gt;
* '''Relationship to other publications:''' (is the article based on a previously published article? is it new content?)&lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Sandra_Villamizar | Page]]&lt;br /&gt;
* '''Expected submission date:'''&lt;br /&gt;
&lt;br /&gt;
=== [Yu 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:''' [[Xuan Yu]]&lt;br /&gt;
* '''Keywords of research area:'''&lt;br /&gt;
* '''Tentative title:'''&lt;br /&gt;
* '''Short abstract:'''&lt;br /&gt;
* '''Relationship to other publications:''' (is the article based on a previously published article? is it new content?)&lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Xuan_Yu | Page]]&lt;br /&gt;
* '''Expected submission date:'''&lt;br /&gt;
&lt;br /&gt;
== Special Issue Editors ==&lt;br /&gt;
&lt;br /&gt;
* Co-editor: &lt;br /&gt;
* Co-editor:&lt;br /&gt;
* Co-editor: &lt;br /&gt;
&lt;br /&gt;
The editors will only accept submissions that follow the [[Develop_proposal_for_special_issue#Special_Issue_Review_Criteria | special issue review criteria]].&lt;br /&gt;
&lt;br /&gt;
The editors will select a set of reviewers to handle the submissions.  Reviewers will include computer scientists, library scientists, and geoscientists.&lt;br /&gt;
&lt;br /&gt;
== Special Issue Review Criteria ==&lt;br /&gt;
&lt;br /&gt;
The reviewers will be asked to provide feedback on the papers according to the following criteria:&lt;br /&gt;
&lt;br /&gt;
* Documentation of the datasets: descriptions of datasets, unique identifiers, repositories.&lt;br /&gt;
* Documentation of software: description of all software used (including pre-processing of data, visualization steps, etc.), unique identifiers, repositories.&lt;br /&gt;
* Documentation of the provenance of results: provenance for each figure or result, such as the workflow or the provenance record.&lt;br /&gt;
&lt;br /&gt;
== Tentative Timeline ==&lt;br /&gt;
&lt;br /&gt;
* Journal committed to special issue: April 15, 2015&lt;br /&gt;
* Submissions due to editors: June 30, 2015&lt;br /&gt;
* Reviews due: September 15, 2015&lt;br /&gt;
* Decisions out to authors: September 30, 2015&lt;br /&gt;
* Revisions due: October 31, 2015&lt;br /&gt;
* Final versions due: November 15, 2015&lt;br /&gt;
* Issue published: December 31, 2015&lt;br /&gt;
&lt;br /&gt;
&amp;lt;!-- Add any wiki Text above this Line --&amp;gt;&lt;br /&gt;
&amp;lt;!-- Do NOT Edit below this Line --&amp;gt;&lt;br /&gt;
{{#set:&lt;br /&gt;
	Owner=Yolanda_Gil|&lt;br /&gt;
	Participants=Xuan_Yu|&lt;br /&gt;
	Participants=Chris_Duffy|&lt;br /&gt;
	Participants=Scott_Peckham|&lt;br /&gt;
	Participants=Cedric_David|&lt;br /&gt;
	Participants=Ibrahim_Demir|&lt;br /&gt;
	Participants=Wally_Fulweiler|&lt;br /&gt;
	Participants=Leif_Karlstrom|&lt;br /&gt;
	Participants=Kyo_Lee|&lt;br /&gt;
	Participants=Kim_Miller|&lt;br /&gt;
	Participants=Heath_Mills|&lt;br /&gt;
	Participants=Ji-Hyun_Oh|&lt;br /&gt;
	Participants=Suzanne_Pierce|&lt;br /&gt;
	Participants=Allen_Pope|&lt;br /&gt;
	Participants=Jordan_Read|&lt;br /&gt;
	Participants=Mimi_Tzeng|&lt;br /&gt;
	Participants=Sandra_Villamizar|&lt;br /&gt;
	Progress=20|&lt;br /&gt;
	StartDate=2015-03-10|&lt;br /&gt;
	TargetDate=2015-03-16|&lt;br /&gt;
	Type=Low}}&lt;/div&gt;</summary>
		<author><name>Suzanne</name></author>	</entry>

	<entry>
		<id>https://www.organicdatascience.org/gpf/index.php?title=Develop_proposal_for_special_issue&amp;diff=11283</id>
		<title>Develop proposal for special issue</title>
		<link rel="alternate" type="text/html" href="https://www.organicdatascience.org/gpf/index.php?title=Develop_proposal_for_special_issue&amp;diff=11283"/>
				<updated>2015-03-13T16:20:27Z</updated>
		
		<summary type="html">&lt;p&gt;Suzanne: /* What is a GPF */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;[[Category:Task]]&lt;br /&gt;
&lt;br /&gt;
== Background: Why a Special Issue on Geoscience Papers of the Future? ==&lt;br /&gt;
&lt;br /&gt;
[[Discuss_what_we_will_consider_a_GPF#The_Vision | Include here our discussion for the vision]]&lt;br /&gt;
&lt;br /&gt;
Motivated by the need to fully document research and make it accessible and reproducible. &lt;br /&gt;
&lt;br /&gt;
=== Motivation: The EarthCube Initiative and the GeoSoft Project ===&lt;br /&gt;
&lt;br /&gt;
[http://www.geosoft-earthcube.org/about Include here background about GeoSoft from the web site]&lt;br /&gt;
&lt;br /&gt;
=== What is a GPF ===&lt;br /&gt;
&lt;br /&gt;
[[Discuss_what_we_will_consider_a_GPF#What_is_a_Geoscience_Paper_of_the_Future.3F | Include here our discussion of what is a GPF]]&lt;br /&gt;
&lt;br /&gt;
The articles in this issue reflect the current best practice for generating a Geoscience Paper of the Future.&lt;br /&gt;
&lt;br /&gt;
Figures in the paper may be clean versions of images generated by software.  To the extent possible, authors have included clear delineations of provenance.&lt;br /&gt;
&lt;br /&gt;
=== Related work ===&lt;br /&gt;
&lt;br /&gt;
[[Discuss_what_we_will_consider_a_GPF#New_Frameworks_to_Create_a_New_Generation_of_Scientific_Articles | Include here the related work we have discussed]]&lt;br /&gt;
&lt;br /&gt;
== Papers to be included ==&lt;br /&gt;
&lt;br /&gt;
For each submission, we describe:&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations'''&lt;br /&gt;
* '''Keywords of research area'''&lt;br /&gt;
* '''Tentative title'''&lt;br /&gt;
* '''Short abstract'''&lt;br /&gt;
* '''Relationship to other publications''' (is the article based on a previously published article? is it new content?)&lt;br /&gt;
* '''Pointer to the wiki page that documents the article'''&lt;br /&gt;
* '''Expected submission date'''&lt;br /&gt;
&lt;br /&gt;
=== [David 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:''' [[Cedric David]]&lt;br /&gt;
* '''Keywords of research area:'''&lt;br /&gt;
* '''Tentative title:'''&lt;br /&gt;
* '''Short abstract:'''&lt;br /&gt;
* '''Relationship to other publications:''' &lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Cedric_David | Page]]&lt;br /&gt;
* '''Expected submission date:'''&lt;br /&gt;
&lt;br /&gt;
=== [Demir 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:''' [[Ibrahim Demir]]&lt;br /&gt;
* '''Keywords of research area:'''&lt;br /&gt;
* '''Tentative title:'''&lt;br /&gt;
* '''Short abstract:'''&lt;br /&gt;
* '''Relationship to other publications:''' (is the article based on a previously published article? is it new content?)&lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Ibrahim_Demir | Page]]&lt;br /&gt;
* '''Expected submission date:'''&lt;br /&gt;
&lt;br /&gt;
=== [Fulweiler 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:''' [[Wally Fulweiler]]&lt;br /&gt;
* '''Keywords of research area:'''&lt;br /&gt;
* '''Tentative title:'''&lt;br /&gt;
* '''Short abstract:'''&lt;br /&gt;
* '''Relationship to other publications:''' (is the article based on a previously published article? is it new content?)&lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Wally_Fulweiler | Page]]&lt;br /&gt;
* '''Expected submission date:'''&lt;br /&gt;
&lt;br /&gt;
=== [Karlstrom and Lay 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:''' [[Leif Karlstrom]] and [[Lay Kuan Loh]]&lt;br /&gt;
* '''Keywords of research area:'''&lt;br /&gt;
* '''Tentative title:'''&lt;br /&gt;
* '''Short abstract:'''&lt;br /&gt;
* '''Relationship to other publications:''' (is the article based on a previously published article? is it new content?)&lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Leif_Karlstron | Page]]&lt;br /&gt;
* '''Expected submission date:'''&lt;br /&gt;
&lt;br /&gt;
=== [Lee 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:''' [[Kyo Lee]]&lt;br /&gt;
* '''Keywords of research area:'''&lt;br /&gt;
* '''Tentative title:'''&lt;br /&gt;
* '''Short abstract:'''&lt;br /&gt;
* '''Relationship to other publications:''' (is the article based on a previously published article? is it new content?)&lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Kyo_Lee | Page]]&lt;br /&gt;
* '''Expected submission date:'''&lt;br /&gt;
&lt;br /&gt;
=== [Miller 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:''' [[Kim Miller]]&lt;br /&gt;
* '''Keywords of research area:'''&lt;br /&gt;
* '''Tentative title:'''&lt;br /&gt;
* '''Short abstract:'''&lt;br /&gt;
* '''Relationship to other publications:''' (is the article based on a previously published article? is it new content?)&lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Kim_Miller | Page]]&lt;br /&gt;
* '''Expected submission date:'''&lt;br /&gt;
&lt;br /&gt;
=== [Mills 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:''' [[Heath Mills]]&lt;br /&gt;
* '''Keywords of research area:'''&lt;br /&gt;
* '''Tentative title:'''&lt;br /&gt;
* '''Short abstract:'''&lt;br /&gt;
* '''Relationship to other publications:''' (is the article based on a previously published article? is it new content?)&lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Heith_Mills | Page]]&lt;br /&gt;
* '''Expected submission date:'''&lt;br /&gt;
&lt;br /&gt;
=== [Oh 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:''' [[Ji-Hyun Oh]]&lt;br /&gt;
* '''Keywords of research area:'''&lt;br /&gt;
* '''Tentative title:'''&lt;br /&gt;
* '''Short abstract:'''&lt;br /&gt;
* '''Relationship to other publications:''' (is the article based on a previously published article? is it new content?)&lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Ji_Hyun | Page]]&lt;br /&gt;
* '''Expected submission date:'''&lt;br /&gt;
&lt;br /&gt;
=== [Pierce 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:''' [[Suzanne Pierce]]&lt;br /&gt;
* '''Keywords of research area:'''&lt;br /&gt;
* '''Tentative title:'''&lt;br /&gt;
* '''Short abstract:'''&lt;br /&gt;
* '''Relationship to other publications:''' (is the article based on a previously published article? is it new content?)&lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Suzanne_Pierce | Page]]&lt;br /&gt;
* '''Expected submission date:'''&lt;br /&gt;
&lt;br /&gt;
=== [Pope 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:''' [[Allen Pope]]&lt;br /&gt;
* '''Keywords of research area:'''&lt;br /&gt;
* '''Tentative title:'''&lt;br /&gt;
* '''Short abstract:'''&lt;br /&gt;
* '''Relationship to other publications:''' (is the article based on a previously published article? is it new content?)&lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Allen_Pope | Page]]&lt;br /&gt;
* '''Expected submission date:'''&lt;br /&gt;
&lt;br /&gt;
=== [Read and Winslow 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:''' [[Jordan Read]] and [[Luke Winslow]]&lt;br /&gt;
* '''Keywords of research area:'''&lt;br /&gt;
* '''Tentative title:'''&lt;br /&gt;
* '''Short abstract:'''&lt;br /&gt;
* '''Relationship to other publications:''' (is the article based on a previously published article? is it new content?)&lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Jordan_Read | Page]]&lt;br /&gt;
* '''Expected submission date:'''&lt;br /&gt;
&lt;br /&gt;
=== [Tzeng 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:''' [[Mimi Tzeng]]&lt;br /&gt;
* '''Keywords of research area:'''&lt;br /&gt;
* '''Tentative title:'''&lt;br /&gt;
* '''Short abstract:'''&lt;br /&gt;
* '''Relationship to other publications:''' (is the article based on a previously published article? is it new content?)&lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Mimi_Tzeng | Page]]&lt;br /&gt;
* '''Expected submission date:'''&lt;br /&gt;
&lt;br /&gt;
=== [Villamizar 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:''' [[Sandra Villamizar]]&lt;br /&gt;
* '''Keywords of research area:'''&lt;br /&gt;
* '''Tentative title:'''&lt;br /&gt;
* '''Short abstract:'''&lt;br /&gt;
* '''Relationship to other publications:''' (is the article based on a previously published article? is it new content?)&lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Sandra_Villamizar | Page]]&lt;br /&gt;
* '''Expected submission date:'''&lt;br /&gt;
&lt;br /&gt;
=== [Yu 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:''' [[Xuan Yu]]&lt;br /&gt;
* '''Keywords of research area:'''&lt;br /&gt;
* '''Tentative title:'''&lt;br /&gt;
* '''Short abstract:'''&lt;br /&gt;
* '''Relationship to other publications:''' (is the article based on a previously published article? is it new content?)&lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Xuan_Yu | Page]]&lt;br /&gt;
* '''Expected submission date:'''&lt;br /&gt;
&lt;br /&gt;
== Special Issue Editors ==&lt;br /&gt;
&lt;br /&gt;
* Co-editor: &lt;br /&gt;
* Co-editor:&lt;br /&gt;
* Co-editor: &lt;br /&gt;
&lt;br /&gt;
The editors will only accept submissions that follow the [[Develop_proposal_for_special_issue#Special_Issue_Review_Criteria | special issue review criteria]].&lt;br /&gt;
&lt;br /&gt;
The editors will select a set of reviewers to handle the submissions.  Reviewers will include computer scientists, library scientists, and geoscientists.&lt;br /&gt;
&lt;br /&gt;
== Special Issue Review Criteria ==&lt;br /&gt;
&lt;br /&gt;
The reviewers will be asked to provide feedback on the papers according to the following criteria:&lt;br /&gt;
&lt;br /&gt;
* Documentation of the datasets: descriptions of datasets, unique identifiers, repositories.&lt;br /&gt;
* Documentation of software: description of all software used (including pre-processing of data, visualization steps, etc.), unique identifiers, repositories.&lt;br /&gt;
* Documentation of the provenance of results: provenance for each figure or result, such as the workflow or the provenance record.&lt;br /&gt;
&lt;br /&gt;
== Tentative Timeline ==&lt;br /&gt;
&lt;br /&gt;
* Journal committed to special issue: April 15, 2015&lt;br /&gt;
* Submissions due to editors: June 30, 2015&lt;br /&gt;
* Reviews due: Sept 15, 2015&lt;br /&gt;
* Decisions out to authors: Sept 30, 2015&lt;br /&gt;
* Revisions due: October 31, 2015&lt;br /&gt;
* Final versions due: November 15, 2015&lt;br /&gt;
* Issue published: December 31, 2015&lt;br /&gt;
&lt;br /&gt;
&amp;lt;!-- Add any wiki Text above this Line --&amp;gt;&lt;br /&gt;
&amp;lt;!-- Do NOT Edit below this Line --&amp;gt;&lt;br /&gt;
{{#set:&lt;br /&gt;
	Owner=Yolanda_Gil|&lt;br /&gt;
	Participants=Xuan_Yu|&lt;br /&gt;
	Participants=Chris_Duffy|&lt;br /&gt;
	Participants=Scott_Peckham|&lt;br /&gt;
	Participants=Cedric_David|&lt;br /&gt;
	Participants=Ibrahim_Demir|&lt;br /&gt;
	Participants=Wally_Fulweiler|&lt;br /&gt;
	Participants=Leif_Karlstrom|&lt;br /&gt;
	Participants=Kyo_Lee|&lt;br /&gt;
	Participants=Kim_Miller|&lt;br /&gt;
	Participants=Heath_Mills|&lt;br /&gt;
	Participants=Ji-Hyun_Oh|&lt;br /&gt;
	Participants=Suzanne_Pierce|&lt;br /&gt;
	Participants=Allen_Pope|&lt;br /&gt;
	Participants=Jordan_Read|&lt;br /&gt;
	Participants=Mimi_Tzeng|&lt;br /&gt;
	Participants=Sandra_Villamizar|&lt;br /&gt;
	Progress=20|&lt;br /&gt;
	StartDate=2015-03-10|&lt;br /&gt;
	TargetDate=2015-03-16|&lt;br /&gt;
	Type=Low}}&lt;/div&gt;</summary>
		<author><name>Suzanne</name></author>	</entry>

	<entry>
		<id>https://www.organicdatascience.org/gpf/index.php?title=Develop_proposal_for_special_issue&amp;diff=11279</id>
		<title>Develop proposal for special issue</title>
		<link rel="alternate" type="text/html" href="https://www.organicdatascience.org/gpf/index.php?title=Develop_proposal_for_special_issue&amp;diff=11279"/>
				<updated>2015-03-13T16:16:18Z</updated>
		
		<summary type="html">&lt;p&gt;Suzanne: /* What is a GPF */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;[[Category:Task]]&lt;br /&gt;
&lt;br /&gt;
== Background: Why a Special Issue on Geoscience Papers of the Future? ==&lt;br /&gt;
&lt;br /&gt;
[[Discuss_what_we_will_consider_a_GPF#The_Vision | Include here our discussion for the vision]]&lt;br /&gt;
&lt;br /&gt;
Motivated by the need to fully document research and make it accessible and reproducible. &lt;br /&gt;
&lt;br /&gt;
=== Motivation: The EarthCube Initiative and the GeoSoft Project ===&lt;br /&gt;
&lt;br /&gt;
[http://www.geosoft-earthcube.org/about Include here background about GeoSoft from the web site]&lt;br /&gt;
&lt;br /&gt;
=== What is a GPF ===&lt;br /&gt;
&lt;br /&gt;
[[Discuss_what_we_will_consider_a_GPF#What_is_a_Geoscience_Paper_of_the_Future.3F | Include here our discussion of what is a GPF]]&lt;br /&gt;
&lt;br /&gt;
The articles in this issue reflect the current best practice for generating a Geoscience Paper of the Future.&lt;br /&gt;
&lt;br /&gt;
=== Related work ===&lt;br /&gt;
&lt;br /&gt;
[[Discuss_what_we_will_consider_a_GPF#New_Frameworks_to_Create_a_New_Generation_of_Scientific_Articles | Include here the related work we have discussed]]&lt;br /&gt;
&lt;br /&gt;
== Papers to be included ==&lt;br /&gt;
&lt;br /&gt;
For each submission, we describe:&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations'''&lt;br /&gt;
* '''Keywords of research area'''&lt;br /&gt;
* '''Tentative title'''&lt;br /&gt;
* '''Short abstract'''&lt;br /&gt;
* '''Relationship to other publications''' (is the article based on a previously published article? is it new content?)&lt;br /&gt;
* '''Pointer to the wiki page that documents the article'''&lt;br /&gt;
* '''Expected submission date'''&lt;br /&gt;
&lt;br /&gt;
=== [David 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:''' [[Cedric David]]&lt;br /&gt;
* '''Keywords of research area:'''&lt;br /&gt;
* '''Tentative title:'''&lt;br /&gt;
* '''Short abstract:'''&lt;br /&gt;
* '''Relationship to other publications:''' &lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Cedric_David | Page]]&lt;br /&gt;
* '''Expected submission date:'''&lt;br /&gt;
&lt;br /&gt;
=== [Demir 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:''' [[Ibrahim Demir]]&lt;br /&gt;
* '''Keywords of research area:'''&lt;br /&gt;
* '''Tentative title:'''&lt;br /&gt;
* '''Short abstract:'''&lt;br /&gt;
* '''Relationship to other publications:''' (is the article based on a previously published article? is it new content?)&lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Ibrahim_Demir | Page]]&lt;br /&gt;
* '''Expected submission date:'''&lt;br /&gt;
&lt;br /&gt;
=== [Fulweiler 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:''' [[Wally Fulweiler]]&lt;br /&gt;
* '''Keywords of research area:'''&lt;br /&gt;
* '''Tentative title:'''&lt;br /&gt;
* '''Short abstract:'''&lt;br /&gt;
* '''Relationship to other publications:''' (is the article based on a previously published article? is it new content?)&lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Wally_Fulweiler | Page]]&lt;br /&gt;
* '''Expected submission date:'''&lt;br /&gt;
&lt;br /&gt;
=== [Karlstrom and Lay 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:''' [[Leif Karlstrom]] and [[Lay Kuan Loh]]&lt;br /&gt;
* '''Keywords of research area:'''&lt;br /&gt;
* '''Tentative title:'''&lt;br /&gt;
* '''Short abstract:'''&lt;br /&gt;
* '''Relationship to other publications:''' (is the article based on a previously published article? is it new content?)&lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Leif_Karlstron | Page]]&lt;br /&gt;
* '''Expected submission date:'''&lt;br /&gt;
&lt;br /&gt;
=== [Lee 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:''' [[Kyo Lee]]&lt;br /&gt;
* '''Keywords of research area:'''&lt;br /&gt;
* '''Tentative title:'''&lt;br /&gt;
* '''Short abstract:'''&lt;br /&gt;
* '''Relationship to other publications:''' (is the article based on a previously published article? is it new content?)&lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Kyo_Lee | Page]]&lt;br /&gt;
* '''Expected submission date:'''&lt;br /&gt;
&lt;br /&gt;
=== [Miller 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:''' [[Kim Miller]]&lt;br /&gt;
* '''Keywords of research area:'''&lt;br /&gt;
* '''Tentative title:'''&lt;br /&gt;
* '''Short abstract:'''&lt;br /&gt;
* '''Relationship to other publications:''' (is the article based on a previously published article? is it new content?)&lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Kim_Miller | Page]]&lt;br /&gt;
* '''Expected submission date:'''&lt;br /&gt;
&lt;br /&gt;
=== [Mills 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:''' [[Heath Mills]]&lt;br /&gt;
* '''Keywords of research area:'''&lt;br /&gt;
* '''Tentative title:'''&lt;br /&gt;
* '''Short abstract:'''&lt;br /&gt;
* '''Relationship to other publications:''' (is the article based on a previously published article? is it new content?)&lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Heith_Mills | Page]]&lt;br /&gt;
* '''Expected submission date:'''&lt;br /&gt;
&lt;br /&gt;
=== [Oh 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:''' [[Ji-Hyun Oh]]&lt;br /&gt;
* '''Keywords of research area:'''&lt;br /&gt;
* '''Tentative title:'''&lt;br /&gt;
* '''Short abstract:'''&lt;br /&gt;
* '''Relationship to other publications:''' (is the article based on a previously published article? is it new content?)&lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Ji_Hyun | Page]]&lt;br /&gt;
* '''Expected submission date:'''&lt;br /&gt;
&lt;br /&gt;
=== [Pierce 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:''' [[Suzanne Pierce]]&lt;br /&gt;
* '''Keywords of research area:'''&lt;br /&gt;
* '''Tentative title:'''&lt;br /&gt;
* '''Short abstract:'''&lt;br /&gt;
* '''Relationship to other publications:''' (is the article based on a previously published article? is it new content?)&lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Suzanne_Pierce | Page]]&lt;br /&gt;
* '''Expected submission date:'''&lt;br /&gt;
&lt;br /&gt;
=== [Pope 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:''' [[Allen Pope]]&lt;br /&gt;
* '''Keywords of research area:'''&lt;br /&gt;
* '''Tentative title:'''&lt;br /&gt;
* '''Short abstract:'''&lt;br /&gt;
* '''Relationship to other publications:''' (is the article based on a previously published article? is it new content?)&lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Allen_Pope | Page]]&lt;br /&gt;
* '''Expected submission date:'''&lt;br /&gt;
&lt;br /&gt;
=== [Read and Winslow 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:''' [[Jordan Read]] and [[Luke Winslow]]&lt;br /&gt;
* '''Keywords of research area:'''&lt;br /&gt;
* '''Tentative title:'''&lt;br /&gt;
* '''Short abstract:'''&lt;br /&gt;
* '''Relationship to other publications:''' (is the article based on a previously published article? is it new content?)&lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Jordan_Read | Page]]&lt;br /&gt;
* '''Expected submission date:'''&lt;br /&gt;
&lt;br /&gt;
=== [Tzeng 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:''' [[Mimi Tzeng]]&lt;br /&gt;
* '''Keywords of research area:'''&lt;br /&gt;
* '''Tentative title:'''&lt;br /&gt;
* '''Short abstract:'''&lt;br /&gt;
* '''Relationship to other publications:''' (is the article based on a previously published article? is it new content?)&lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Mimi_Tzeng | Page]]&lt;br /&gt;
* '''Expected submission date:'''&lt;br /&gt;
&lt;br /&gt;
=== [Villamizar 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:''' [[Sandra Villamizar]]&lt;br /&gt;
* '''Keywords of research area:'''&lt;br /&gt;
* '''Tentative title:'''&lt;br /&gt;
* '''Short abstract:'''&lt;br /&gt;
* '''Relationship to other publications:''' (is the article based on a previously published article? is it new content?)&lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Sandra_Villamizar | Page]]&lt;br /&gt;
* '''Expected submission date:'''&lt;br /&gt;
&lt;br /&gt;
=== [Yu 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:''' [[Xuan Yu]]&lt;br /&gt;
* '''Keywords of research area:'''&lt;br /&gt;
* '''Tentative title:'''&lt;br /&gt;
* '''Short abstract:'''&lt;br /&gt;
* '''Relationship to other publications:''' (is the article based on a previously published article? is it new content?)&lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Xuan_Yu | Page]]&lt;br /&gt;
* '''Expected submission date:'''&lt;br /&gt;
&lt;br /&gt;
== Special Issue Editors ==&lt;br /&gt;
&lt;br /&gt;
* Co-editor: &lt;br /&gt;
* Co-editor:&lt;br /&gt;
* Co-editor: &lt;br /&gt;
&lt;br /&gt;
The editors will only accept submissions that follow the [[Develop_proposal_for_special_issue#Special_Issue_Review_Criteria | special issue review criteria]].&lt;br /&gt;
&lt;br /&gt;
The editors will select a set of reviewers to handle the submissions.  Reviewers will include computer scientists, library scientists, and geoscientists.&lt;br /&gt;
&lt;br /&gt;
== Special Issue Review Criteria ==&lt;br /&gt;
&lt;br /&gt;
The reviewers will be asked to provide feedback on the papers according to the following criteria:&lt;br /&gt;
&lt;br /&gt;
* Documentation of the datasets: descriptions of datasets, unique identifiers, repositories.&lt;br /&gt;
* Documentation of software: description of all software used (including pre-processing of data, visualization steps, etc.), unique identifiers, repositories.&lt;br /&gt;
* Documentation of the provenance of results: provenance for each figure or result, such as the workflow or the provenance record.&lt;br /&gt;
&lt;br /&gt;
== Tentative Timeline ==&lt;br /&gt;
&lt;br /&gt;
* Journal committed to special issue: April 15, 2015&lt;br /&gt;
* Submissions due to editors: June 30, 2015&lt;br /&gt;
* Reviews due: Sept 15, 2015&lt;br /&gt;
* Decisions out to authors: Sept 30, 2015&lt;br /&gt;
* Revisions due: October 31, 2015&lt;br /&gt;
* Final versions due: November 15, 2015&lt;br /&gt;
* Issue published: December 31, 2015&lt;br /&gt;
&lt;br /&gt;
&amp;lt;!-- Add any wiki Text above this Line --&amp;gt;&lt;br /&gt;
&amp;lt;!-- Do NOT Edit below this Line --&amp;gt;&lt;br /&gt;
{{#set:&lt;br /&gt;
	Owner=Yolanda_Gil|&lt;br /&gt;
	Participants=Xuan_Yu|&lt;br /&gt;
	Participants=Chris_Duffy|&lt;br /&gt;
	Participants=Scott_Peckham|&lt;br /&gt;
	Participants=Cedric_David|&lt;br /&gt;
	Participants=Ibrahim_Demir|&lt;br /&gt;
	Participants=Wally_Fulweiler|&lt;br /&gt;
	Participants=Leif_Karlstrom|&lt;br /&gt;
	Participants=Kyo_Lee|&lt;br /&gt;
	Participants=Kim_Miller|&lt;br /&gt;
	Participants=Heath_Mills|&lt;br /&gt;
	Participants=Ji-Hyun_Oh|&lt;br /&gt;
	Participants=Suzanne_Pierce|&lt;br /&gt;
	Participants=Allen_Pope|&lt;br /&gt;
	Participants=Jordan_Read|&lt;br /&gt;
	Participants=Mimi_Tzeng|&lt;br /&gt;
	Participants=Sandra_Villamizar|&lt;br /&gt;
	Progress=20|&lt;br /&gt;
	StartDate=2015-03-10|&lt;br /&gt;
	TargetDate=2015-03-16|&lt;br /&gt;
	Type=Low}}&lt;/div&gt;</summary>
		<author><name>Suzanne</name></author>	</entry>

	<entry>
		<id>https://www.organicdatascience.org/gpf/index.php?title=Develop_proposal_for_special_issue&amp;diff=11277</id>
		<title>Develop proposal for special issue</title>
		<link rel="alternate" type="text/html" href="https://www.organicdatascience.org/gpf/index.php?title=Develop_proposal_for_special_issue&amp;diff=11277"/>
				<updated>2015-03-13T16:13:55Z</updated>
		
		<summary type="html">&lt;p&gt;Suzanne: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;[[Category:Task]]&lt;br /&gt;
&lt;br /&gt;
== Background: Why a Special Issue on Geoscience Papers of the Future? ==&lt;br /&gt;
&lt;br /&gt;
[[Discuss_what_we_will_consider_a_GPF#The_Vision | Include here our discussion for the vision]]&lt;br /&gt;
&lt;br /&gt;
Motivated by the need to fully document research and make it accessible and reproducible. &lt;br /&gt;
&lt;br /&gt;
=== Motivation: The EarthCube Initiative and the GeoSoft Project ===&lt;br /&gt;
&lt;br /&gt;
[http://www.geosoft-earthcube.org/about Include here background about GeoSoft from the web site]&lt;br /&gt;
&lt;br /&gt;
=== What is a GPF ===&lt;br /&gt;
&lt;br /&gt;
[[Discuss_what_we_will_consider_a_GPF#What_is_a_Geoscience_Paper_of_the_Future.3F | Include here our discussion of what is a GPF]]&lt;br /&gt;
&lt;br /&gt;
=== Related work ===&lt;br /&gt;
&lt;br /&gt;
[[Discuss_what_we_will_consider_a_GPF#New_Frameworks_to_Create_a_New_Generation_of_Scientific_Articles | Include here the related work we have discussed]]&lt;br /&gt;
&lt;br /&gt;
== Papers to be included ==&lt;br /&gt;
&lt;br /&gt;
For each submission, we describe:&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations'''&lt;br /&gt;
* '''Keywords of research area'''&lt;br /&gt;
* '''Tentative title'''&lt;br /&gt;
* '''Short abstract'''&lt;br /&gt;
* '''Relationship to other publications''' (is the article based on a previously published article? is it new content?)&lt;br /&gt;
* '''Pointer to the wiki page that documents the article'''&lt;br /&gt;
* '''Expected submission date'''&lt;br /&gt;
&lt;br /&gt;
=== [David 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:''' [[Cedric David]]&lt;br /&gt;
* '''Keywords of research area:'''&lt;br /&gt;
* '''Tentative title:'''&lt;br /&gt;
* '''Short abstract:'''&lt;br /&gt;
* '''Relationship to other publications:''' &lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Cedric_David | Page]]&lt;br /&gt;
* '''Expected submission date:'''&lt;br /&gt;
&lt;br /&gt;
=== [Demir 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:''' [[Ibrahim Demir]]&lt;br /&gt;
* '''Keywords of research area:'''&lt;br /&gt;
* '''Tentative title:'''&lt;br /&gt;
* '''Short abstract:'''&lt;br /&gt;
* '''Relationship to other publications:''' (is the article based on a previously published article? is it new content?)&lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Ibrahim_Demir | Page]]&lt;br /&gt;
* '''Expected submission date:'''&lt;br /&gt;
&lt;br /&gt;
=== [Fulweiler 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:''' [[Wally Fulweiler]]&lt;br /&gt;
* '''Keywords of research area:'''&lt;br /&gt;
* '''Tentative title:'''&lt;br /&gt;
* '''Short abstract:'''&lt;br /&gt;
* '''Relationship to other publications:''' (is the article based on a previously published article? is it new content?)&lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Wally_Fulweiler | Page]]&lt;br /&gt;
* '''Expected submission date:'''&lt;br /&gt;
&lt;br /&gt;
=== [Karlstrom and Lay 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:''' [[Leif Karlstrom]] and [[Lay Kuan Loh]]&lt;br /&gt;
* '''Keywords of research area:'''&lt;br /&gt;
* '''Tentative title:'''&lt;br /&gt;
* '''Short abstract:'''&lt;br /&gt;
* '''Relationship to other publications:''' (is the article based on a previously published article? is it new content?)&lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Leif_Karlstron | Page]]&lt;br /&gt;
* '''Expected submission date:'''&lt;br /&gt;
&lt;br /&gt;
=== [Lee 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:''' [[Kyo Lee]]&lt;br /&gt;
* '''Keywords of research area:'''&lt;br /&gt;
* '''Tentative title:'''&lt;br /&gt;
* '''Short abstract:'''&lt;br /&gt;
* '''Relationship to other publications:''' (is the article based on a previously published article? is it new content?)&lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Kyo_Lee | Page]]&lt;br /&gt;
* '''Expected submission date:'''&lt;br /&gt;
&lt;br /&gt;
=== [Miller 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:''' [[Kim Miller]]&lt;br /&gt;
* '''Keywords of research area:'''&lt;br /&gt;
* '''Tentative title:'''&lt;br /&gt;
* '''Short abstract:'''&lt;br /&gt;
* '''Relationship to other publications:''' (is the article based on a previously published article? is it new content?)&lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Kim_Miller | Page]]&lt;br /&gt;
* '''Expected submission date:'''&lt;br /&gt;
&lt;br /&gt;
=== [Mills 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:''' [[Heath Mills]]&lt;br /&gt;
* '''Keywords of research area:'''&lt;br /&gt;
* '''Tentative title:'''&lt;br /&gt;
* '''Short abstract:'''&lt;br /&gt;
* '''Relationship to other publications:''' (is the article based on a previously published article? is it new content?)&lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Heith_Mills | Page]]&lt;br /&gt;
* '''Expected submission date:'''&lt;br /&gt;
&lt;br /&gt;
=== [Oh 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:''' [[Ji-Hyun Oh]]&lt;br /&gt;
* '''Keywords of research area:'''&lt;br /&gt;
* '''Tentative title:'''&lt;br /&gt;
* '''Short abstract:'''&lt;br /&gt;
* '''Relationship to other publications:''' (is the article based on a previously published article? is it new content?)&lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Ji_Hyun | Page]]&lt;br /&gt;
* '''Expected submission date:'''&lt;br /&gt;
&lt;br /&gt;
=== [Pierce 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:''' [[Suzanne Pierce]]&lt;br /&gt;
* '''Keywords of research area:'''&lt;br /&gt;
* '''Tentative title:'''&lt;br /&gt;
* '''Short abstract:'''&lt;br /&gt;
* '''Relationship to other publications:''' (is the article based on a previously published article? is it new content?)&lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Suzanne_Pierce | Page]]&lt;br /&gt;
* '''Expected submission date:'''&lt;br /&gt;
&lt;br /&gt;
=== [Pope 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:''' [[Allen Pope]]&lt;br /&gt;
* '''Keywords of research area:'''&lt;br /&gt;
* '''Tentative title:'''&lt;br /&gt;
* '''Short abstract:'''&lt;br /&gt;
* '''Relationship to other publications:''' (is the article based on a previously published article? is it new content?)&lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Allen_Pope | Page]]&lt;br /&gt;
* '''Expected submission date:'''&lt;br /&gt;
&lt;br /&gt;
=== [Read and Winslow 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:''' [[Jordan Read]] and [[Luke Winslow]]&lt;br /&gt;
* '''Keywords of research area:'''&lt;br /&gt;
* '''Tentative title:'''&lt;br /&gt;
* '''Short abstract:'''&lt;br /&gt;
* '''Relationship to other publications:''' (is the article based on a previously published article? is it new content?)&lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Jordan_Read | Page]]&lt;br /&gt;
* '''Expected submission date:'''&lt;br /&gt;
&lt;br /&gt;
=== [Tzeng 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:''' [[Mimi Tzeng]]&lt;br /&gt;
* '''Keywords of research area:'''&lt;br /&gt;
* '''Tentative title:'''&lt;br /&gt;
* '''Short abstract:'''&lt;br /&gt;
* '''Relationship to other publications:''' (is the article based on a previously published article? is it new content?)&lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Mimi_Tzeng | Page]]&lt;br /&gt;
* '''Expected submission date:'''&lt;br /&gt;
&lt;br /&gt;
=== [Villamizar 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:''' [[Sandra Villamizar]]&lt;br /&gt;
* '''Keywords of research area:'''&lt;br /&gt;
* '''Tentative title:'''&lt;br /&gt;
* '''Short abstract:'''&lt;br /&gt;
* '''Relationship to other publications:''' (is the article based on a previously published article? is it new content?)&lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Sandra_Villamizar | Page]]&lt;br /&gt;
* '''Expected submission date:'''&lt;br /&gt;
&lt;br /&gt;
=== [Yu 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:''' [[Xuan Yu]]&lt;br /&gt;
* '''Keywords of research area:'''&lt;br /&gt;
* '''Tentative title:'''&lt;br /&gt;
* '''Short abstract:'''&lt;br /&gt;
* '''Relationship to other publications:''' (is the article based on a previously published article? is it new content?)&lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Xuan_Yu | Page]]&lt;br /&gt;
* '''Expected submission date:'''&lt;br /&gt;
&lt;br /&gt;
== Special Issue Editors ==&lt;br /&gt;
&lt;br /&gt;
* Co-editor: &lt;br /&gt;
* Co-editor:&lt;br /&gt;
* Co-editor: &lt;br /&gt;
&lt;br /&gt;
The editors will only accept submissions that follow the [[Develop_proposal_for_special_issue#Special_Issue_Review_Criteria | special issue review criteria]].&lt;br /&gt;
&lt;br /&gt;
The editors will select a set of reviewers to handle the submissions.  Reviewers will include computer scientists, library scientists, and geoscientists.&lt;br /&gt;
&lt;br /&gt;
== Special Issue Review Criteria ==&lt;br /&gt;
&lt;br /&gt;
The reviewers will be asked to provide feedback on the papers according to the following criteria:&lt;br /&gt;
&lt;br /&gt;
* Documentation of the datasets: descriptions of datasets, unique identifiers, repositories.&lt;br /&gt;
* Documentation of software: description of all software used (including pre-processing of data, visualization steps, etc.), unique identifiers, repositories.&lt;br /&gt;
* Documentation of the provenance of results: provenance for each figure or result, such as the workflow or the provenance record.&lt;br /&gt;
&lt;br /&gt;
== Tentative Timeline ==&lt;br /&gt;
&lt;br /&gt;
* Journal committed to special issue: April 15, 2015&lt;br /&gt;
* Submissions due to editors: June 30, 2015&lt;br /&gt;
* Reviews due: Sept 15, 2015&lt;br /&gt;
* Decisions out to authors: Sept 30, 2015&lt;br /&gt;
* Revisions due: October 31, 2015&lt;br /&gt;
* Final versions due November 15, 2015&lt;br /&gt;
* Issue published December 31, 2015&lt;br /&gt;
&lt;br /&gt;
&amp;lt;!-- Add any wiki Text above this Line --&amp;gt;&lt;br /&gt;
&amp;lt;!-- Do NOT Edit below this Line --&amp;gt;&lt;br /&gt;
{{#set:&lt;br /&gt;
	Owner=Yolanda_Gil|&lt;br /&gt;
	Participants=Xuan_Yu|&lt;br /&gt;
	Participants=Chris_Duffy|&lt;br /&gt;
	Participants=Scott_Peckham|&lt;br /&gt;
	Participants=Cedric_David|&lt;br /&gt;
	Participants=Ibrahim_Demir|&lt;br /&gt;
	Participants=Wally_Fulweiler|&lt;br /&gt;
	Participants=Leif_Karlstrom|&lt;br /&gt;
	Participants=Kyo_Lee|&lt;br /&gt;
	Participants=Kim_Miller|&lt;br /&gt;
	Participants=Heath_Mills|&lt;br /&gt;
	Participants=Ji-Hyun_Oh|&lt;br /&gt;
	Participants=Suzanne_Pierce|&lt;br /&gt;
	Participants=Allen_Pope|&lt;br /&gt;
	Participants=Jordan_Read|&lt;br /&gt;
	Participants=Mimi_Tzeng|&lt;br /&gt;
	Participants=Sandra_Villamizar|&lt;br /&gt;
	Progress=20|&lt;br /&gt;
	StartDate=2015-03-10|&lt;br /&gt;
	TargetDate=2015-03-16|&lt;br /&gt;
	Type=Low}}&lt;/div&gt;</summary>
		<author><name>Suzanne</name></author>	</entry>

	<entry>
		<id>https://www.organicdatascience.org/gpf/index.php?title=Develop_proposal_for_special_issue&amp;diff=11276</id>
		<title>Develop proposal for special issue</title>
		<link rel="alternate" type="text/html" href="https://www.organicdatascience.org/gpf/index.php?title=Develop_proposal_for_special_issue&amp;diff=11276"/>
				<updated>2015-03-13T16:11:00Z</updated>
		
		<summary type="html">&lt;p&gt;Suzanne: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;[[Category:Task]]&lt;br /&gt;
&lt;br /&gt;
== Background: Why a Special Issue on Geoscience Papers of the Future? ==&lt;br /&gt;
&lt;br /&gt;
[[Discuss_what_we_will_consider_a_GPF#The_Vision | Include here our discussion for the vision]]&lt;br /&gt;
Motivated by the need to fully document research and make it accessible and reproducible.&lt;br /&gt;
&lt;br /&gt;
=== Motivation: The EarthCube Initiative and the GeoSoft Project ===&lt;br /&gt;
&lt;br /&gt;
[http://www.geosoft-earthcube.org/about Include here background about GeoSoft from the web site]&lt;br /&gt;
&lt;br /&gt;
=== What is a GPF ===&lt;br /&gt;
&lt;br /&gt;
[[Discuss_what_we_will_consider_a_GPF#What_is_a_Geoscience_Paper_of_the_Future.3F | Include here our discussion of what is a GPF]]&lt;br /&gt;
&lt;br /&gt;
=== Related work ===&lt;br /&gt;
&lt;br /&gt;
[[Discuss_what_we_will_consider_a_GPF#New_Frameworks_to_Create_a_New_Generation_of_Scientific_Articles | Include here the related work we have discussed]]&lt;br /&gt;
&lt;br /&gt;
== Papers to be included ==&lt;br /&gt;
&lt;br /&gt;
For each submission, we describe:&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations'''&lt;br /&gt;
* '''Keywords of research area'''&lt;br /&gt;
* '''Tentative title'''&lt;br /&gt;
* '''Short abstract'''&lt;br /&gt;
* '''Relationship to other publications''' (is the article based on a previously published article? is it new content?)&lt;br /&gt;
* '''Pointer to the wiki page that documents the article'''&lt;br /&gt;
* '''Expected submission date'''&lt;br /&gt;
&lt;br /&gt;
=== [David 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:''' [[Cedric David]]&lt;br /&gt;
* '''Keywords of research area:'''&lt;br /&gt;
* '''Tentative title:'''&lt;br /&gt;
* '''Short abstract:'''&lt;br /&gt;
* '''Relationship to other publications:''' &lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Cedric_David | Page]]&lt;br /&gt;
* '''Expected submission date:'''&lt;br /&gt;
&lt;br /&gt;
=== [Demir 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:''' [[Ibrahim Demir]]&lt;br /&gt;
* '''Keywords of research area:'''&lt;br /&gt;
* '''Tentative title:'''&lt;br /&gt;
* '''Short abstract:'''&lt;br /&gt;
* '''Relationship to other publications:''' (is the article based on a previously published article? is it new content?)&lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Ibrahim_Demir | Page]]&lt;br /&gt;
* '''Expected submission date:'''&lt;br /&gt;
&lt;br /&gt;
=== [Fulweiler 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:''' [[Wally Fulweiler]]&lt;br /&gt;
* '''Keywords of research area:'''&lt;br /&gt;
* '''Tentative title:'''&lt;br /&gt;
* '''Short abstract:'''&lt;br /&gt;
* '''Relationship to other publications:''' (is the article based on a previously published article? is it new content?)&lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Wally_Fulweiler | Page]]&lt;br /&gt;
* '''Expected submission date:'''&lt;br /&gt;
&lt;br /&gt;
=== [Karlstrom and Lay 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:''' [[Leif Karlstrom]] and [[Lay Kuan Loh]]&lt;br /&gt;
* '''Keywords of research area:'''&lt;br /&gt;
* '''Tentative title:'''&lt;br /&gt;
* '''Short abstract:'''&lt;br /&gt;
* '''Relationship to other publications:''' (is the article based on a previously published article? is it new content?)&lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Leif_Karlstron | Page]]&lt;br /&gt;
* '''Expected submission date:'''&lt;br /&gt;
&lt;br /&gt;
=== [Lee 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:''' [[Kyo Lee]]&lt;br /&gt;
* '''Keywords of research area:'''&lt;br /&gt;
* '''Tentative title:'''&lt;br /&gt;
* '''Short abstract:'''&lt;br /&gt;
* '''Relationship to other publications:''' (is the article based on a previously published article? is it new content?)&lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Kyo_Lee | Page]]&lt;br /&gt;
* '''Expected submission date:'''&lt;br /&gt;
&lt;br /&gt;
=== [Miller 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:''' [[Kim Miller]]&lt;br /&gt;
* '''Keywords of research area:'''&lt;br /&gt;
* '''Tentative title:'''&lt;br /&gt;
* '''Short abstract:'''&lt;br /&gt;
* '''Relationship to other publications:''' (is the article based on a previously published article? is it new content?)&lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Kim_Miller | Page]]&lt;br /&gt;
* '''Expected submission date:'''&lt;br /&gt;
&lt;br /&gt;
=== [Mills 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:''' [[Heath Mills]]&lt;br /&gt;
* '''Keywords of research area:'''&lt;br /&gt;
* '''Tentative title:'''&lt;br /&gt;
* '''Short abstract:'''&lt;br /&gt;
* '''Relationship to other publications:''' (is the article based on a previously published article? is it new content?)&lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Heith_Mills | Page]]&lt;br /&gt;
* '''Expected submission date:'''&lt;br /&gt;
&lt;br /&gt;
=== [Oh 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:''' [[Ji-Hyun Oh]]&lt;br /&gt;
* '''Keywords of research area:'''&lt;br /&gt;
* '''Tentative title:'''&lt;br /&gt;
* '''Short abstract:'''&lt;br /&gt;
* '''Relationship to other publications:''' (is the article based on a previously published article? is it new content?)&lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Ji_Hyun | Page]]&lt;br /&gt;
* '''Expected submission date:'''&lt;br /&gt;
&lt;br /&gt;
=== [Pierce 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:''' [[Suzanne Pierce]]&lt;br /&gt;
* '''Keywords of research area:'''&lt;br /&gt;
* '''Tentative title:'''&lt;br /&gt;
* '''Short abstract:'''&lt;br /&gt;
* '''Relationship to other publications:''' (is the article based on a previously published article? is it new content?)&lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Suzanne_Pierce | Page]]&lt;br /&gt;
* '''Expected submission date:'''&lt;br /&gt;
&lt;br /&gt;
=== [Pope 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:''' [[Allen Pope]]&lt;br /&gt;
* '''Keywords of research area:'''&lt;br /&gt;
* '''Tentative title:'''&lt;br /&gt;
* '''Short abstract:'''&lt;br /&gt;
* '''Relationship to other publications:''' (is the article based on a previously published article? is it new content?)&lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Allen_Pope | Page]]&lt;br /&gt;
* '''Expected submission date:'''&lt;br /&gt;
&lt;br /&gt;
=== [Read and Winslow 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:''' [[Jordan Read]] and [[Luke Winslow]]&lt;br /&gt;
* '''Keywords of research area:'''&lt;br /&gt;
* '''Tentative title:'''&lt;br /&gt;
* '''Short abstract:'''&lt;br /&gt;
* '''Relationship to other publications:''' (is the article based on a previously published article? is it new content?)&lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Jordan_Read | Page]]&lt;br /&gt;
* '''Expected submission date:'''&lt;br /&gt;
&lt;br /&gt;
=== [Tzeng 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:''' [[Mimi Tzeng]]&lt;br /&gt;
* '''Keywords of research area:'''&lt;br /&gt;
* '''Tentative title:'''&lt;br /&gt;
* '''Short abstract:'''&lt;br /&gt;
* '''Relationship to other publications:''' (is the article based on a previously published article? is it new content?)&lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Mimi_Tzeng | Page]]&lt;br /&gt;
* '''Expected submission date:'''&lt;br /&gt;
&lt;br /&gt;
=== [Villamizar 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:''' [[Sandra Villamizar]]&lt;br /&gt;
* '''Keywords of research area:'''&lt;br /&gt;
* '''Tentative title:'''&lt;br /&gt;
* '''Short abstract:'''&lt;br /&gt;
* '''Relationship to other publications:''' (is the article based on a previously published article? is it new content?)&lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Sandra_Villamizar | Page]]&lt;br /&gt;
* '''Expected submission date:'''&lt;br /&gt;
&lt;br /&gt;
=== [Yu 2015] ===&lt;br /&gt;
&lt;br /&gt;
* '''Authors and affiliations:''' [[Xuan Yu]]&lt;br /&gt;
* '''Keywords of research area:'''&lt;br /&gt;
* '''Tentative title:'''&lt;br /&gt;
* '''Short abstract:'''&lt;br /&gt;
* '''Relationship to other publications:''' (is the article based on a previously published article? is it new content?)&lt;br /&gt;
* '''Pointer to the wiki page that documents the article:''' [[Document_GPF_activities_by_Xuan_Yu | Page]]&lt;br /&gt;
* '''Expected submission date:'''&lt;br /&gt;
&lt;br /&gt;
== Special Issue Editors ==&lt;br /&gt;
&lt;br /&gt;
* Co-editor: &lt;br /&gt;
* Co-editor:&lt;br /&gt;
* Co-editor: &lt;br /&gt;
&lt;br /&gt;
The editors will only accept submissions that follow the [[Develop_proposal_for_special_issue#Special_Issue_Review_Criteria | special issue review criteria]].&lt;br /&gt;
&lt;br /&gt;
The editors will select a set of reviewers to handle the submissions.  Reviewers will include computer scientists, library scientists, and geoscientists.&lt;br /&gt;
&lt;br /&gt;
== Special Issue Review Criteria ==&lt;br /&gt;
&lt;br /&gt;
The reviewers will be asked to provide feedback on the papers according to the following criteria:&lt;br /&gt;
&lt;br /&gt;
* Documentation of the datasets: descriptions of datasets, unique identifiers, repositories.&lt;br /&gt;
* Documentation of software: description of all software used (including pre-processing of data, visualization steps, etc), unique identifiers, repositories.&lt;br /&gt;
* Documentation of the provenance of results: provenance for each figure or result, such as the workflow or the provenance record.&lt;br /&gt;
&lt;br /&gt;
== Tentative Timeline ==&lt;br /&gt;
&lt;br /&gt;
* Journal committed to special issue: April 15, 2015&lt;br /&gt;
* Submissions due to editors: June 30, 2015&lt;br /&gt;
* Reviews due: Sept 15, 2015&lt;br /&gt;
* Decisions out to authors: Sept 30, 2015&lt;br /&gt;
* Revisions due: October 31, 2015&lt;br /&gt;
* Final versions due November 15, 2015&lt;br /&gt;
* Issue published December 31, 2015&lt;br /&gt;
&lt;br /&gt;
&amp;lt;!-- Add any wiki Text above this Line --&amp;gt;&lt;br /&gt;
&amp;lt;!-- Do NOT Edit below this Line --&amp;gt;&lt;br /&gt;
{{#set:&lt;br /&gt;
	Owner=Yolanda_Gil|&lt;br /&gt;
	Participants=Xuan_Yu|&lt;br /&gt;
	Participants=Chris_Duffy|&lt;br /&gt;
	Participants=Scott_Peckham|&lt;br /&gt;
	Participants=Cedric_David|&lt;br /&gt;
	Participants=Ibrahim_Demir|&lt;br /&gt;
	Participants=Wally_Fulweiler|&lt;br /&gt;
	Participants=Leif_Karlstrom|&lt;br /&gt;
	Participants=Kyo_Lee|&lt;br /&gt;
	Participants=Kim_Miller|&lt;br /&gt;
	Participants=Heath_Mills|&lt;br /&gt;
	Participants=Ji-Hyun_Oh|&lt;br /&gt;
	Participants=Suzanne_Pierce|&lt;br /&gt;
	Participants=Allen_Pope|&lt;br /&gt;
	Participants=Jordan_Read|&lt;br /&gt;
	Participants=Mimi_Tzeng|&lt;br /&gt;
	Participants=Sandra_Villamizar|&lt;br /&gt;
	Progress=20|&lt;br /&gt;
	StartDate=2015-03-10|&lt;br /&gt;
	TargetDate=2015-03-16|&lt;br /&gt;
	Type=Low}}&lt;/div&gt;</summary>
		<author><name>Suzanne</name></author>	</entry>

	<entry>
		<id>https://www.organicdatascience.org/gpf/index.php?title=Write_about_making_data_accessible&amp;diff=11275</id>
		<title>Write about making data accessible</title>
		<link rel="alternate" type="text/html" href="https://www.organicdatascience.org/gpf/index.php?title=Write_about_making_data_accessible&amp;diff=11275"/>
				<updated>2015-03-13T16:08:50Z</updated>
		
		<summary type="html">&lt;p&gt;Suzanne: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;[[Category:Task]]&lt;br /&gt;
&amp;lt;!-- Add any wiki Text above this Line --&amp;gt;&lt;br /&gt;
&amp;lt;!-- Do NOT Edit below this Line --&amp;gt;&lt;br /&gt;
{{#set:&lt;br /&gt;
	Owner=Yolanda_Gil|&lt;br /&gt;
	SubTask=Write_about_large_datasets|&lt;br /&gt;
	SubTask=Write_about_using_data_from_public_repositories|&lt;br /&gt;
	SubTask=Write_about_using_data_from_colleagues|&lt;br /&gt;
	SubTask=Write_about_data_preparation|&lt;br /&gt;
	Type=Medium}}&lt;br /&gt;
'''Comments and General Discussion, Observations in the Group:'''&lt;br /&gt;
&lt;br /&gt;
[[Data notation and DOIs]] Conversation at F2F meeting: In the conversation during the face-to-face meeting we looked at examples of each author's wiki posts. Ibrahim had posted data for his article and included the QR code with the entry. We discussed whether or not a QR code was appropriate to include in the actual journal articles to improve accessibility. The group determined that a hyperlink to the DOIs is an easy and accessible way to access data in the papers, whereas a QR code is more useful in the context of presentations (e.g. technical posters).&lt;br /&gt;
So, the group felt that including the data link via DOIs is the best practice in journal articles.&lt;/div&gt;</summary>
		<author><name>Suzanne</name></author>	</entry>

	<entry>
		<id>https://www.organicdatascience.org/gpf/index.php?title=Document_GPF_activities_by_Suzanne_Pierce&amp;diff=11273</id>
		<title>Document GPF activities by Suzanne Pierce</title>
		<link rel="alternate" type="text/html" href="https://www.organicdatascience.org/gpf/index.php?title=Document_GPF_activities_by_Suzanne_Pierce&amp;diff=11273"/>
				<updated>2015-03-13T16:05:49Z</updated>
		
		<summary type="html">&lt;p&gt;Suzanne: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;[[Category:Task]]&lt;br /&gt;
&amp;lt;br/&amp;gt;&amp;lt;b&amp;gt;Details on how to do this task:&amp;lt;/b&amp;gt; [[Document GPF activities]]&amp;lt;br/&amp;gt;&amp;lt;br/&amp;gt;&lt;br /&gt;
&amp;lt;!-- Add any wiki Text above this Line --&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Comments and General Discussion, Observations in the Group:&lt;br /&gt;
&lt;br /&gt;
Data notation and DOIs:&lt;br /&gt;
In the conversation during the face-to-face meeting we looked at examples of each author's wiki posts. Ibrahim had posted data for his article and included the QR code with the entry. We discussed whether or not a QR code was appropriate to include in the actual journal articles to improve accessibility. The group determined that a hyperlink to the DOIs is an easy and accessible way to access data in the papers, whereas a QR code is more useful in the context of presentations (e.g. technical posters).&lt;br /&gt;
&lt;br /&gt;
So, the group felt that including the data link via DOIs is the best practice in journal articles.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;!-- Do NOT Edit below this Line --&amp;gt;&lt;br /&gt;
{{#set:&lt;br /&gt;
	Expertise=Open_science|&lt;br /&gt;
	Expertise=Geosciences|&lt;br /&gt;
	Owner=Suzanne_Pierce|&lt;br /&gt;
	StartDate=2015-02-01|&lt;br /&gt;
	SubTask=Select_target_article_by_Suzanne_Pierce|&lt;br /&gt;
	SubTask=Ensure_software_is_usable_by_Suzanne_Pierce|&lt;br /&gt;
	SubTask=Make_data_accessible_by_Suzanne_Pierce|&lt;br /&gt;
	SubTask=Document_provenance_of_results_by_Suzanne_Pierce|&lt;br /&gt;
	SubTask=Suzanne_Pierce_should_make_software_executable_by_others|&lt;br /&gt;
	SubTask=Make_software_accessible_by_Suzanne_Pierce|&lt;br /&gt;
	SubTask=Document_domain_characteristics_by_Suzanne_Pierce|&lt;br /&gt;
	SubTask=Document_quality_of_software_and_data_by_Suzanne_Pierce|&lt;br /&gt;
	SubTask=Prepare_the_article_for_publication_by_Suzanne_Pierce|&lt;br /&gt;
	TargetDate=2015-05-29|&lt;br /&gt;
	Type=Medium}}&lt;/div&gt;</summary>
		<author><name>Suzanne</name></author>	</entry>

	<entry>
		<id>https://www.organicdatascience.org/gpf/index.php?title=Ensure_software_is_usable_by_Suzanne_Pierce&amp;diff=11002</id>
		<title>Ensure software is usable by Suzanne Pierce</title>
		<link rel="alternate" type="text/html" href="https://www.organicdatascience.org/gpf/index.php?title=Ensure_software_is_usable_by_Suzanne_Pierce&amp;diff=11002"/>
				<updated>2015-03-10T18:22:35Z</updated>
		
		<summary type="html">&lt;p&gt;Suzanne: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;[[Category:Task]]&lt;br /&gt;
&amp;lt;br/&amp;gt;&amp;lt;b&amp;gt;Details on how to do this task:&amp;lt;/b&amp;gt; [[Ensure software is usable]]&amp;lt;br/&amp;gt;&amp;lt;br/&amp;gt;&lt;br /&gt;
&amp;lt;!-- Add any wiki Text above this Line --&amp;gt;&lt;br /&gt;
&amp;lt;!-- Do NOT Edit below this Line --&amp;gt;&lt;br /&gt;
{{#set:&lt;br /&gt;
	Expertise=Open_science|&lt;br /&gt;
	Expertise=Geosciences|&lt;br /&gt;
	Owner=Suzanne_Pierce|&lt;br /&gt;
	Progress=100|&lt;br /&gt;
	StartDate=2015-02-07|&lt;br /&gt;
	TargetDate=2015-02-23|&lt;br /&gt;
	Type=Low}}&lt;br /&gt;
&lt;br /&gt;
'''MCSDSS workflows, data, and required packages or resources for a build'''&lt;br /&gt;
&lt;br /&gt;
An initial User Manual and Set-up Guide is available as a Word document.&lt;br /&gt;
&lt;br /&gt;
Selected case example datasets and gathered them together (see next section)&lt;br /&gt;
&lt;br /&gt;
== Technology Goals ==&lt;br /&gt;
&lt;br /&gt;
The MCSDSS is a prototype system intended to demonstrate the potential capabilities of a single-page application (SPA) running atop a web- and cloud-based architecture built on open-source technologies.  The application is implemented on current web standards while supporting human interface design that targets both traditional mouse/keyboard interactions and modern touch/gesture interactions. &lt;br /&gt;
The technology stack for the Heatseeker was selected with the goal of creating a robust and dynamic modular codebase that can be adjusted to fit many use cases and scale to support usage loads that range from simple data display to complex scientific simulation-based modelling and analytics.  The application integrates current frameworks for highly performant agile development with unit testing, statistical analysis, data visualization, mapping technologies, geographic data manipulation, and cloud infrastructure while retaining support for traditional HTML5/CSS3 web standards. &lt;br /&gt;
&lt;br /&gt;
== Server Stack ==&lt;br /&gt;
&lt;br /&gt;
The MCSDSS capitalizes on the recent evolution of the ECMAScript 6 standard (the JavaScript language) and its comprehensive adoption across all web-enabled devices as the de facto language of the internet. The current approach is based on the traditional web stack (Apache Server, MySQL DB, Linux OS) technologies. The entirety of the MCSDSS application prototype runs on the Amazon Web Services (AWS) cloud on a single Elastic Compute Cloud (EC2) medium server instance running an Amazon Linux AMI, leveraging the AWS Route 53 DNS service with Elastic Block Store (EBS) volumes for dynamic real-time data persistence and Simple Storage Service (S3) for read-only and archived data persistence.&lt;br /&gt;
&lt;br /&gt;
[[Server Side Technology Stack]]&lt;br /&gt;
•	Linux OS (AWS version, though any will work, e.g. http://www.ubuntu.com/ )&lt;br /&gt;
•	Apache Web Server (http://httpd.apache.org/ )&lt;br /&gt;
•	MySQL Database (http://www.mysql.com/ )&lt;br /&gt;
&lt;br /&gt;
[[Workflow]]&lt;br /&gt;
•	Ruby (https://www.ruby-lang.org/en/documentation/installation/ )&lt;br /&gt;
•	Node JS and Node Package Manager (http://nodejs.org/ )&lt;br /&gt;
•	Compass (http://compass-style.org/ )&lt;br /&gt;
•	Yeoman (http://yeoman.io/ )&lt;br /&gt;
•	Bower (http://bower.io/ )&lt;br /&gt;
•	Grunt (http://gruntjs.com/ )&lt;br /&gt;
•	Bitbucket (https://bitbucket.org/ )&lt;br /&gt;
•	Git (http://git-scm.com/ )&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== Client Stack ==&lt;br /&gt;
&lt;br /&gt;
The client-side framework implements Google’s AngularJS, one of the predominant modern web libraries, which enables modular code development in a decoupled Model-View-Controller-Service structure.  The decoupled structure allows for robust unit testing (via the Karma test runner coupled with the Jasmine behavior-driven development framework for testing JavaScript code), end-to-end testing (using the Protractor test framework), automated application builds, and separation of responsibilities within the application design and within the development team roles. Additionally, the AngularJS framework provides support for 1) templating systems (such as Jade, Handlebars, MustacheJS, Underscore, etc.), 2) routing and state machines, 3) data binding, 4) AJAX requests, animations, and transitions (both JavaScript- and CSS-based), and 5) modern web-development workflows based on Yeoman. &lt;br /&gt;
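The decoupled Model-View-Controller-Service structure described above can be sketched in plain, framework-free JavaScript. This is purely illustrative: the `RechargeService` and `MapController` names (and the sample data) are hypothetical, not taken from the MCSDSS codebase.

```javascript
// Illustrative sketch of a service/controller split in plain JavaScript,
// mirroring the decoupled structure described above. Names are hypothetical.
function RechargeService() {
  // Service layer: owns the data and the data-access logic.
  this.zones = [];
}
RechargeService.prototype.addZone = function (name, areaKm2) {
  this.zones.push({ name: name, areaKm2: areaKm2 });
};
RechargeService.prototype.totalArea = function () {
  return this.zones.reduce(function (sum, z) { return sum + z.areaKm2; }, 0);
};

function MapController(service) {
  // Controller layer: depends only on the service's interface, so it can
  // be unit-tested against a stub service (as Karma/Jasmine tests would).
  this.service = service;
}
MapController.prototype.summary = function () {
  return this.service.zones.length + ' zones, ' + this.service.totalArea() + ' km^2';
};

var svc = new RechargeService();
svc.addZone('Barton Springs', 233);
var ctrl = new MapController(svc);
console.log(ctrl.summary()); // "1 zones, 233 km^2"
```

Because the controller never touches the DOM or the data source directly, each layer can be replaced or tested in isolation, which is the practical payoff of the decoupled structure.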
The Yeoman workflow adds fluidity to the development process, for example through integration with the Node Package Manager (for modular third-party Node-based libraries), Bower (for modular third-party JavaScript libraries), and GruntJS (for automated error and code checking, unit testing, code minification, code obfuscation, and other automated tasks), and by providing various scaffolding systems for erecting complex application structures in very short periods of time. The online repository and project management services of Bitbucket (from Atlassian) and a Git-based code repository were used for codebase management (version control) and issue tracking.&lt;br /&gt;
Several third-party libraries are used to provide extended capabilities. Specifically, D3 for data visualization and DOM manipulation, JQuery for DOM manipulation, SkrollrJS for parallax display effects (a visual display technique pioneered by Disney and in vogue on the modern web), WaypointsJS for linking to embedded content, GSAP for additional animations and transitions, KineticJS for additional touch and gesture support, LeafletJS for map technology integration, Bootstrap for responsive web design and user interface components, SASS &amp;amp; Compass for CSS styling and animations, HTML5 Boilerplate for consistent UI development, and ModernizrJS for feature detection and adaptive client capabilities.&lt;br /&gt;
&lt;br /&gt;
[[Client Side Technology Stack]]&lt;br /&gt;
•	AngularJS (https://angularjs.org/ )&lt;br /&gt;
•	KarmaJS (http://karma-runner.github.io/0.12/index.html )&lt;br /&gt;
•	JasmineJS (http://jasmine.github.io/ )&lt;br /&gt;
•	ProtractorJS (http://angular.github.io/protractor/#/ )&lt;br /&gt;
•	ModernizrJS (http://modernizr.com/ )&lt;br /&gt;
•	HTML5 Boilerplate (http://html5boilerplate.com/ )&lt;br /&gt;
•	Bootstrap (http://getbootstrap.com/ )&lt;br /&gt;
•	SASS (http://sass-lang.com/ )&lt;br /&gt;
•	Compass (http://compass-style.org/ )&lt;br /&gt;
•	Jade (http://jade-lang.com/ )&lt;br /&gt;
•	Handlebars (http://handlebarsjs.com/ )&lt;br /&gt;
•	MustacheJS (http://mustache.github.io/ )&lt;br /&gt;
•	JQuery (http://jquery.com/ )&lt;br /&gt;
•	D3 (http://d3js.org/ )&lt;br /&gt;
•	LeafletJS (http://leafletjs.com/ )&lt;br /&gt;
•	GSAP (https://greensock.com/gsap )&lt;br /&gt;
•	SkrollrJS (http://prinzhorn.github.io/skrollr/ )&lt;br /&gt;
•	WaypointsJS (http://imakewebthings.com/waypoints/ )&lt;br /&gt;
•	KineticJS (http://kineticjs.com/ )&lt;/div&gt;</summary>
		<author><name>Suzanne</name></author>	</entry>

	<entry>
		<id>https://www.organicdatascience.org/gpf/index.php?title=Document_provenance_of_results_by_Suzanne_Pierce&amp;diff=11001</id>
		<title>Document provenance of results by Suzanne Pierce</title>
		<link rel="alternate" type="text/html" href="https://www.organicdatascience.org/gpf/index.php?title=Document_provenance_of_results_by_Suzanne_Pierce&amp;diff=11001"/>
				<updated>2015-03-10T18:19:42Z</updated>
		
		<summary type="html">&lt;p&gt;Suzanne: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;[[Category:Task]]&lt;br /&gt;
&amp;lt;br/&amp;gt;&amp;lt;b&amp;gt;Details on how to do this task:&amp;lt;/b&amp;gt; [[Document the provenance of the results]]&amp;lt;br/&amp;gt;&amp;lt;br/&amp;gt;&lt;br /&gt;
&amp;lt;!-- Add any wiki Text above this Line --&amp;gt;&lt;br /&gt;
&amp;lt;!-- Do NOT Edit below this Line --&amp;gt;&lt;br /&gt;
{{#set:&lt;br /&gt;
	Expertise=Open_science|&lt;br /&gt;
	Expertise=Geosciences|&lt;br /&gt;
	Owner=Suzanne_Pierce|&lt;br /&gt;
	Progress=0|&lt;br /&gt;
	StartDate=2015-03-07|&lt;br /&gt;
	TargetDate=2015-03-20|&lt;br /&gt;
	Type=Low}}&lt;br /&gt;
&lt;br /&gt;
'''Selecting Data Sets for GPF and Tutorial:'''&lt;br /&gt;
&lt;br /&gt;
'''Observations on making data public'''&lt;br /&gt;
1) challenge to select and assure that dataset is absolutely correct&lt;br /&gt;
2) takes time to document dataset, source and context to be sure it is a standalone object.&lt;br /&gt;
3) Coordination among team members is needed to be sure that all the pieces are uploaded&lt;br /&gt;
4) There has to be a clear plan and choices for what is necessary&lt;br /&gt;
5) bundling and project options on figshare should be useful (but I'm still trying to understand how they work)&lt;br /&gt;
6) the free level of access on figshare is very good, but the file size limitations are an issue for many file sets (e.g. a set of candidate solutions over the 1 GB limit) - re-structuring uploads and figshare units to accommodate. May need to subscribe to figshare at the appropriate level (= long-term cost to researcher to keep data up and available)&lt;br /&gt;
7) Already see benefits because we know exactly where a dataset is (one that we have to search for all the time) - now it has a persistent place (easy to find)&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
'''Tutorial Files, links, workflow'''&lt;br /&gt;
&lt;br /&gt;
OSGeo4w [http://trac.osgeo.org/osgeo4w/]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
'''Publishing/Sharing datasets'''&lt;br /&gt;
&lt;br /&gt;
Citation:&lt;br /&gt;
Pierce, Suzanne (2015): Gridded Shapefile for the Recharge Zones of the Barton Springs segment of the Edwards Aquifer. figshare.&lt;br /&gt;
http://dx.doi.org/10.6084/m9.figshare.1330145 Retrieved 16:56, Mar 10, 2015 (GMT)&lt;/div&gt;</summary>
		<author><name>Suzanne</name></author>	</entry>

	<entry>
		<id>https://www.organicdatascience.org/gpf/index.php?title=Document_provenance_of_results_by_Suzanne_Pierce&amp;diff=11000</id>
		<title>Document provenance of results by Suzanne Pierce</title>
		<link rel="alternate" type="text/html" href="https://www.organicdatascience.org/gpf/index.php?title=Document_provenance_of_results_by_Suzanne_Pierce&amp;diff=11000"/>
				<updated>2015-03-10T18:18:51Z</updated>
		
		<summary type="html">&lt;p&gt;Suzanne: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;[[Category:Task]]&lt;br /&gt;
&amp;lt;br/&amp;gt;&amp;lt;b&amp;gt;Details on how to do this task:&amp;lt;/b&amp;gt; [[Document the provenance of the results]]&amp;lt;br/&amp;gt;&amp;lt;br/&amp;gt;&lt;br /&gt;
&amp;lt;!-- Add any wiki Text above this Line --&amp;gt;&lt;br /&gt;
&amp;lt;!-- Do NOT Edit below this Line --&amp;gt;&lt;br /&gt;
{{#set:&lt;br /&gt;
	Expertise=Open_science|&lt;br /&gt;
	Expertise=Geosciences|&lt;br /&gt;
	Owner=Suzanne_Pierce|&lt;br /&gt;
	Progress=0|&lt;br /&gt;
	StartDate=2015-03-07|&lt;br /&gt;
	TargetDate=2015-03-20|&lt;br /&gt;
	Type=Low}}&lt;br /&gt;
&lt;br /&gt;
'''Selecting Data Sets for GPF and Tutorial:'''&lt;br /&gt;
&lt;br /&gt;
'''Observations on making data public'''&lt;br /&gt;
1) challenge to select and assure that dataset is absolutely correct&lt;br /&gt;
2) takes time to document dataset, source and context to be sure it is a standalone object.&lt;br /&gt;
3) Coordination among team members is needed to be sure that all the pieces are uploaded&lt;br /&gt;
4) There has to be a clear plan and choices for what is necessary&lt;br /&gt;
5) bundling and project options on figshare should be useful (but I'm still trying to understand how they work)&lt;br /&gt;
6) the free level of access on figshare is very good, but the file size limitations are an issue for many file sets (e.g. a set of candidate solutions over the 1 GB limit) - re-structuring uploads and figshare units to accommodate. May need to subscribe to figshare at the appropriate level (= long-term cost to researcher to keep data up and available)&lt;br /&gt;
7) Already see benefits because we know exactly where a dataset is (one that we have to search for all the time) - now it has a persistent place (easy to find)&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
'''Tutorial Files, links, workflow'''&lt;br /&gt;
&lt;br /&gt;
[http://trac.osgeo.org/osgeo4w/]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
'''Publishing/Sharing datasets'''&lt;br /&gt;
&lt;br /&gt;
Citation:&lt;br /&gt;
Pierce, Suzanne (2015): Gridded Shapefile for the Recharge Zones of the Barton Springs segment of the Edwards Aquifer. figshare.&lt;br /&gt;
http://dx.doi.org/10.6084/m9.figshare.1330145 Retrieved 16:56, Mar 10, 2015 (GMT)&lt;/div&gt;</summary>
		<author><name>Suzanne</name></author>	</entry>

	<entry>
		<id>https://www.organicdatascience.org/gpf/index.php?title=Document_provenance_of_results_by_Suzanne_Pierce&amp;diff=10999</id>
		<title>Document provenance of results by Suzanne Pierce</title>
		<link rel="alternate" type="text/html" href="https://www.organicdatascience.org/gpf/index.php?title=Document_provenance_of_results_by_Suzanne_Pierce&amp;diff=10999"/>
				<updated>2015-03-10T18:17:37Z</updated>
		
		<summary type="html">&lt;p&gt;Suzanne: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;[[Category:Task]]&lt;br /&gt;
&amp;lt;br/&amp;gt;&amp;lt;b&amp;gt;Details on how to do this task:&amp;lt;/b&amp;gt; [[Document the provenance of the results]]&amp;lt;br/&amp;gt;&amp;lt;br/&amp;gt;&lt;br /&gt;
&amp;lt;!-- Add any wiki Text above this Line --&amp;gt;&lt;br /&gt;
&amp;lt;!-- Do NOT Edit below this Line --&amp;gt;&lt;br /&gt;
{{#set:&lt;br /&gt;
	Expertise=Open_science|&lt;br /&gt;
	Expertise=Geosciences|&lt;br /&gt;
	Owner=Suzanne_Pierce|&lt;br /&gt;
	Progress=0|&lt;br /&gt;
	StartDate=2015-03-07|&lt;br /&gt;
	TargetDate=2015-03-20|&lt;br /&gt;
	Type=Low}}&lt;br /&gt;
&lt;br /&gt;
'''Selecting Data Sets for GPF and Tutorial:'''&lt;br /&gt;
&lt;br /&gt;
'''Observations on making data public'''&lt;br /&gt;
1) It is a challenge to select a dataset and assure that it is absolutely correct&lt;br /&gt;
2) It takes time to document the dataset, its source, and its context to be sure it is a standalone object&lt;br /&gt;
3) Coordination among team members is needed to be sure that all the pieces are uploaded&lt;br /&gt;
4) There has to be a clear plan and clear choices about what is necessary&lt;br /&gt;
5) The bundling and project options on figshare should be useful (though I'm still trying to understand how they work)&lt;br /&gt;
6) The free access level on figshare is very good, but the file size limitations are an issue for many file sets (e.g. a set of candidate solutions exceeds the 1GB limit) - re-structuring uploads and figshare units to accommodate. We may need to subscribe to figshare at an appropriate level (a long-term cost to the researcher to keep data up and available)&lt;br /&gt;
7) We already see benefits: a dataset we previously had to search for every time now has a persistent, easy-to-find location&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
'''Tutorial Files, links, workflow'''&lt;br /&gt;
&lt;br /&gt;
http://trac.osgeo.org/osgeo4w/&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
'''Publishing/Sharing datasets'''&lt;br /&gt;
&lt;br /&gt;
Citation:&lt;br /&gt;
Pierce, Suzanne (2015): Gridded Shapefile for the Recharge Zones of the Barton Springs segment of the Edwards Aquifer. figshare.&lt;br /&gt;
http://dx.doi.org/10.6084/m9.figshare.1330145 Retrieved 16:56, Mar 10, 2015 (GMT)&lt;/div&gt;</summary>
		<author><name>Suzanne</name></author>	</entry>

	<entry>
		<id>https://www.organicdatascience.org/gpf/index.php?title=Document_provenance_of_results_by_Suzanne_Pierce&amp;diff=10998</id>
		<title>Document provenance of results by Suzanne Pierce</title>
		<link rel="alternate" type="text/html" href="https://www.organicdatascience.org/gpf/index.php?title=Document_provenance_of_results_by_Suzanne_Pierce&amp;diff=10998"/>
				<updated>2015-03-10T17:22:23Z</updated>
		
		<summary type="html">&lt;p&gt;Suzanne: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;[[Category:Task]]&lt;br /&gt;
&amp;lt;br/&amp;gt;&amp;lt;b&amp;gt;Details on how to do this task:&amp;lt;/b&amp;gt; [[Document the provenance of the results]]&amp;lt;br/&amp;gt;&amp;lt;br/&amp;gt;&lt;br /&gt;
&amp;lt;!-- Add any wiki Text above this Line --&amp;gt;&lt;br /&gt;
&amp;lt;!-- Do NOT Edit below this Line --&amp;gt;&lt;br /&gt;
{{#set:&lt;br /&gt;
	Expertise=Open_science|&lt;br /&gt;
	Expertise=Geosciences|&lt;br /&gt;
	Owner=Suzanne_Pierce|&lt;br /&gt;
	Progress=0|&lt;br /&gt;
	StartDate=2015-03-07|&lt;br /&gt;
	TargetDate=2015-03-20|&lt;br /&gt;
	Type=Low}}&lt;br /&gt;
&lt;br /&gt;
'''Selecting Data Sets for GPF and Tutorial:'''&lt;br /&gt;
&lt;br /&gt;
Observations on making data public&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
'''Publishing/Sharing datasets'''&lt;br /&gt;
&lt;br /&gt;
Citation:&lt;br /&gt;
Pierce, Suzanne (2015): Gridded Shapefile for the Recharge Zones of the Barton Springs segment of the Edwards Aquifer. figshare.&lt;br /&gt;
http://dx.doi.org/10.6084/m9.figshare.1330145 Retrieved 16:56, Mar 10, 2015 (GMT)&lt;/div&gt;</summary>
		<author><name>Suzanne</name></author>	</entry>

	<entry>
		<id>https://www.organicdatascience.org/gpf/index.php?title=Document_provenance_of_results_by_Suzanne_Pierce&amp;diff=10997</id>
		<title>Document provenance of results by Suzanne Pierce</title>
		<link rel="alternate" type="text/html" href="https://www.organicdatascience.org/gpf/index.php?title=Document_provenance_of_results_by_Suzanne_Pierce&amp;diff=10997"/>
				<updated>2015-03-10T17:20:58Z</updated>
		
		<summary type="html">&lt;p&gt;Suzanne: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;[[Category:Task]]&lt;br /&gt;
&amp;lt;br/&amp;gt;&amp;lt;b&amp;gt;Details on how to do this task:&amp;lt;/b&amp;gt; [[Document the provenance of the results]]&amp;lt;br/&amp;gt;&amp;lt;br/&amp;gt;&lt;br /&gt;
&amp;lt;!-- Add any wiki Text above this Line --&amp;gt;&lt;br /&gt;
&amp;lt;!-- Do NOT Edit below this Line --&amp;gt;&lt;br /&gt;
{{#set:&lt;br /&gt;
	Expertise=Open_science|&lt;br /&gt;
	Expertise=Geosciences|&lt;br /&gt;
	Owner=Suzanne_Pierce|&lt;br /&gt;
	Progress=0|&lt;br /&gt;
	StartDate=2015-03-07|&lt;br /&gt;
	TargetDate=2015-03-20|&lt;br /&gt;
	Type=Low}}&lt;br /&gt;
&lt;br /&gt;
Selecting Data Sets for GPF and Tutorial&lt;br /&gt;
&lt;br /&gt;
Publishing/Sharing datasets&lt;br /&gt;
&lt;br /&gt;
Citation:&lt;br /&gt;
Pierce, Suzanne (2015): Gridded Shapefile for the Recharge Zones of the Barton Springs segment of the Edwards Aquifer. figshare.&lt;br /&gt;
http://dx.doi.org/10.6084/m9.figshare.1330145 Retrieved 16:56, Mar 10, 2015 (GMT)&lt;/div&gt;</summary>
		<author><name>Suzanne</name></author>	</entry>

	<entry>
		<id>https://www.organicdatascience.org/gpf/index.php?title=Make_data_accessible_by_Suzanne_Pierce&amp;diff=10939</id>
		<title>Make data accessible by Suzanne Pierce</title>
		<link rel="alternate" type="text/html" href="https://www.organicdatascience.org/gpf/index.php?title=Make_data_accessible_by_Suzanne_Pierce&amp;diff=10939"/>
				<updated>2015-03-06T22:16:08Z</updated>
		
		<summary type="html">&lt;p&gt;Suzanne: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;[[Category:Task]]&lt;br /&gt;
&amp;lt;br/&amp;gt;&amp;lt;b&amp;gt;Details on how to do this task:&amp;lt;/b&amp;gt; [[Make data accessible]]&amp;lt;br/&amp;gt;&amp;lt;br/&amp;gt;&lt;br /&gt;
&amp;lt;!-- Add any wiki Text above this Line --&amp;gt;&lt;br /&gt;
&amp;lt;!-- Do NOT Edit below this Line --&amp;gt;&lt;br /&gt;
{{#set:&lt;br /&gt;
	Expertise=Open_science|&lt;br /&gt;
	Expertise=Geosciences|&lt;br /&gt;
	Owner=Suzanne_Pierce|&lt;br /&gt;
	Progress=33|&lt;br /&gt;
	StartDate=2015-02-21|&lt;br /&gt;
	TargetDate=2015-03-09|&lt;br /&gt;
	Type=Low}}&lt;br /&gt;
&lt;br /&gt;
Made good progress this week, but realized that loading and finalizing the data for access will take some time.&lt;br /&gt;
I've selected the datasets that I will put up on figshare.&lt;br /&gt;
I've also been gathering scripts and working with my colleague to get things ready for loading.&lt;br /&gt;
&lt;br /&gt;
We (John Gentle and I) decided to merge several scripts into one file for upload.&lt;br /&gt;
We are meeting again on Feb. 9th to upload things on figshare.&lt;br /&gt;
&lt;br /&gt;
I am preparing an image that shows the workflow from raw data through parsing the datasets with the scripts to be able to show how to modify the data for use in the software application.&lt;/div&gt;</summary>
		<author><name>Suzanne</name></author>	</entry>

	<entry>
		<id>https://www.organicdatascience.org/gpf/index.php?title=Make_data_accessible_by_Suzanne_Pierce&amp;diff=10938</id>
		<title>Make data accessible by Suzanne Pierce</title>
		<link rel="alternate" type="text/html" href="https://www.organicdatascience.org/gpf/index.php?title=Make_data_accessible_by_Suzanne_Pierce&amp;diff=10938"/>
				<updated>2015-03-06T22:12:18Z</updated>
		
		<summary type="html">&lt;p&gt;Suzanne: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;[[Category:Task]]&lt;br /&gt;
&amp;lt;br/&amp;gt;&amp;lt;b&amp;gt;Details on how to do this task:&amp;lt;/b&amp;gt; [[Make data accessible]]&amp;lt;br/&amp;gt;&amp;lt;br/&amp;gt;&lt;br /&gt;
&amp;lt;!-- Add any wiki Text above this Line --&amp;gt;&lt;br /&gt;
&amp;lt;!-- Do NOT Edit below this Line --&amp;gt;&lt;br /&gt;
{{#set:&lt;br /&gt;
	Expertise=Open_science|&lt;br /&gt;
	Expertise=Geosciences|&lt;br /&gt;
	Owner=Suzanne_Pierce|&lt;br /&gt;
	Progress=33|&lt;br /&gt;
	StartDate=2015-02-21|&lt;br /&gt;
	TargetDate=2015-03-09|&lt;br /&gt;
	Type=Low}}&lt;br /&gt;
&lt;br /&gt;
Made good progress this week, but realized that loading and finalizing the data for access will take some time.&lt;br /&gt;
I've selected the datasets that I will put up on figshare.&lt;br /&gt;
I've also been gathering scripts and working with my colleague to get things ready for loading.&lt;br /&gt;
&lt;br /&gt;
We (John Gentle and I) decided to merge several scripts into one file for upload.&lt;br /&gt;
We are meeting again on Feb. 9th to upload things on figshare.&lt;/div&gt;</summary>
		<author><name>Suzanne</name></author>	</entry>

	</feed>