Develop proposal for special issue
Background: Why a Special Issue on Geoscience Papers of the Future?
Include here our discussion of the vision.
The background should be 1-2 pages.
Motivated by the need to fully document research and make it accessible and reproducible.
Motivation: The EarthCube Initiative and the GeoSoft Project
Include here background about GeoSoft from the web site
OSTP memo. EarthCube reports. Other reports that discuss the need for new approaches to editing.
It is possible that very small or very large contributions are not well captured by current publishing paradigms.
For example, nanopublications are a possible way to reflect advances in a research process that may not merit a full publication but are nonetheless useful to share with the community. A challenge here is that there is a stigma against publishing units of work that are very small.
Alternatively, a very large piece of research, or work with many parts, may be better suited to a GPF-style publication.
Perhaps the concept of a 'paper' would be better reflected by the concept of a 'wrapper': a collection of materials and resources. The purpose is to ensure that publications are representative of the work, effort, and results achieved in the research process.
What is a GPF?
Include here our discussion of what is a GPF
The challenges of creating GPFs
The articles in this issue reflect the current best practice for generating a Geoscience Paper of the Future.
'''Figure discussions''' Do we want to regenerate exactly the same figure automatically? Figures in a paper may be cleaned-up versions of images generated by software. To the extent possible, authors have included clear delineations of provenance, so that readers can regenerate the figures using the documented workflows, data, and codes. An important note (Allen, Sandra) is that figures are frequently generated by code or scripts, yet the final figure is often polished by hand. Mimi's point: is it really worth belaboring how the prettified version of the figure is made? If it is, note that both of the visualization packages I've used (Matlab and SigmaPlot) keep actual code in the background that specifies the prettification; this code can be found, copied out, and rerun to generate the exact same figure with all of the prettification in the same place. SigmaPlot uses Visual Basic (I think) in its macros. If explicit code is the important point, this should be doable, but it may not be strictly necessary to specify exactly where all the prettifications are to get the gist across.
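As an illustration of the fully scripted approach, here is a minimal sketch (not from any of the papers in this issue; the filenames, data, and labels are hypothetical) in which every cosmetic choice lives in rerunnable code:

<syntaxhighlight lang="python">
# Minimal sketch: a figure generated entirely from code, so a reader can
# rerun this script against the published dataset and reproduce the figure
# exactly, "prettification" included. Filenames and labels are hypothetical.
import numpy as np
import matplotlib.pyplot as plt

data = np.loadtxt("published_dataset.csv", delimiter=",", skiprows=1)
x, y = data[:, 0], data[:, 1]

fig, ax = plt.subplots(figsize=(4, 3))
ax.plot(x, y, color="steelblue", linewidth=1.5, label="observed")
# All cosmetic choices are recorded in code rather than applied by hand.
ax.set_xlabel("Time (days)")
ax.set_ylabel("Measured value")
ax.legend(frameon=False)
fig.tight_layout()
fig.savefig("figure1.png", dpi=300)
</syntaxhighlight>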
How much of one's experimental history should be included? (Ibrahim) The experimental process often ends up nowhere. Should we document all of the failed experiments? Should we get one DOI for the results of the successful experiment and another for the failed trials?
Documenting: Timing and Intermediate Processes
When should we document and what are the bounds on what we document?
For example, should we document and include data and workflows for 'failed' experiments? Or should we assign datasets DOIs before we know the results from using them?
The group thinks that good practice may be to document and share data once there is a clear understanding of which outcomes are worth reporting. For example, successful experiments should have clean, clearly documented, shared data, whereas one strategy for 'failed' experiments could be to bundle the intermediate datasets under a single DOI along with a more general discussion of the process and methods.
Related work
Include here the related work we have discussed
Papers to be included
Would it be worthwhile to group the papers into broader categories rather than giving specifics about every single paper?
For each submission, we describe:
- Authors and affiliations
- Keywords of research area
- Tentative title
- Short abstract
- Relationship to other publications (is the article based on a previously published article? is it new content?)
- Pointer to the wiki page that documents the article
- Expected submission date
[David 2015]
- Authors and affiliations: Cedric David
- Keywords of research area:
- Tentative title:
- Short abstract:
- Challenge: Ensure that updates to an existing model are able to reproduce a series of simulations published previously (see the sketch after this entry).
- Relationship to other publications:
- Pointer to the wiki page that documents the article: Page
- Expected submission date:
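A minimal sketch of what such a reproducibility check might look like (the file names, formats, and tolerances here are hypothetical, not taken from the actual [David 2015] study):

<syntaxhighlight lang="python">
# Minimal sketch of a regression check against previously published output.
# The idea is to archive the published simulation results and compare the
# updated model's rerun against them. Filenames and tolerances are hypothetical.
import numpy as np

published = np.load("published_simulation_output.npy")  # archived results
current = np.load("updated_model_output.npy")           # rerun with new code

# Bitwise equality is usually too strict across compilers and platforms,
# so compare within a stated tolerance and report the largest deviation.
if np.allclose(published, current, rtol=1e-6, atol=1e-9):
    print("Updated model reproduces the published simulations.")
else:
    print("Max absolute difference:", np.max(np.abs(published - current)))
</syntaxhighlight>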
[Demir 2015]
- Authors and affiliations: Ibrahim Demir
- Keywords of research area:
- Tentative title:
- Short abstract:
- Relationship to other publications: (is the article based on a previously published article? is it new content?)
- Pointer to the wiki page that documents the article: Page
- Expected submission date:
[Fulweiler 2015]
- Authors and affiliations: Wally Fulweiler
- Keywords of research area:
- Tentative title:
- Short abstract:
- Relationship to other publications: (is the article based on a previously published article? is it new content?)
- Pointer to the wiki page that documents the article: Page
- Expected submission date:
[Karlstrom and Lay 2015]
- Authors and affiliations: Leif Karlstrom and Lay Kuan Loh
- Keywords of research area:
- Tentative title:
- Short abstract:
- Relationship to other publications: (is the article based on a previously published article? is it new content?)
- Pointer to the wiki page that documents the article: Page
- Expected submission date:
[Lee 2015]
- Authors and affiliations: Kyo Lee
- Keywords of research area:
- Tentative title:
- Short abstract:
- Relationship to other publications: (is the article based on a previously published article? is it new content?)
- Pointer to the wiki page that documents the article: Page
- Expected submission date:
[Miller 2015]
- Authors and affiliations: Kim Miller
- Keywords of research area:
- Tentative title:
- Short abstract:
- Relationship to other publications: (is the article based on a previously published article? is it new content?)
- Pointer to the wiki page that documents the article: Page
- Expected submission date:
[Mills 2015]
- Authors and affiliations: Heath Mills
- Keywords of research area:
- Tentative title:
- Short abstract:
- Relationship to other publications: (is the article based on a previously published article? is it new content?)
- Pointer to the wiki page that documents the article: Page
- Expected submission date:
[Oh 2015]
- Authors and affiliations: Ji-Hyun Oh
- Keywords of research area:
- Tentative title:
- Short abstract:
- Relationship to other publications: (is the article based on a previously published article? is it new content?)
- Pointer to the wiki page that documents the article: Page
- Expected submission date:
[Pierce 2015]
- Authors and affiliations: Suzanne Pierce
- Keywords of research area:
- Tentative title:
- Short abstract:
- Challenge: Fully document a new software application and framework using example case study data and tutorials.
- Relationship to other publications: (is the article based on a previously published article? is it new content?)
- Pointer to the wiki page that documents the article: Page
- Expected submission date:
[Pope 2015]
- Authors and affiliations: Allen Pope
- Keywords of research area: Glaciology, Remote Sensing, Polar Science
- Tentative title: Data and Code for Estimating and Evaluating Supraglacial Lake Depth With Landsat 8 and other Multispectral Sensors
- Short abstract:
- Challenge: Reproducibility, Dark Code
- Relationship to other publications: Documenting and explaining the data and code behind the analysis and results presented in another paper.
- Pointer to the wiki page that documents the article: Page
- Expected submission date:
[Read and Winslow 2015]
- Authors and affiliations: Jordan Read and Luke Winslow
- Keywords of research area:
- Tentative title:
- Short abstract:
- Relationship to other publications: (is the article based on a previously published article? is it new content?)
- Pointer to the wiki page that documents the article: Page
- Expected submission date:
[Tzeng 2015]
- Authors and affiliations: Mimi Tzeng
- Keywords of research area:
- Tentative title:
- Short abstract:
- Relationship to other publications: (is the article based on a previously published article? is it new content?)
- Pointer to the wiki page that documents the article: Page
- Expected submission date:
General challenge: My paper will be about the processing of data in a larger dataset from which peer-reviewed papers have been written. The processing I did was not specific to any particular paper. I can point to an example paper that used some of the data I processed from this dataset; however, all of the figures in that paper are composites that also include other data from elsewhere that I had nothing to do with (and it would not be feasible to obtain that other data within our timeframe).
[Villamizar 2015]
- Authors and affiliations: Sandra Villamizar
- Keywords of research area:
- Tentative title:
- Short abstract:
- Challenge: Reproduction of a process; developing strategies for data analysis and visualization; documenting new software/applications.
- Relationship to other publications: (is the article based on a previously published article? is it new content?)
- Pointer to the wiki page that documents the article: Page
- Expected submission date:
[Yu 2015]
- Authors and affiliations: Xuan Yu
- Keywords of research area:
- Tentative title:
- Short abstract:
- Challenge:
- Relationship to other publications: (is the article based on a previously published article? is it new content?)
- Pointer to the wiki page that documents the article: Page
- Expected submission date:
Special Issue Editors
- Co-editor:
- Co-editor:
- Co-editor:
The editors will only accept submissions that follow the special issue review criteria.
The editors will select a set of reviewers to handle the submissions. Reviewers will include computer scientists, library scientists, and geoscientists.
Special Issue Review Criteria
The reviewers will be asked to provide feedback on the papers according to the following criteria:
- Documentation of the datasets: descriptions of the datasets, unique identifiers, and repositories.
- Documentation of the software: descriptions of all software used (including data pre-processing, visualization steps, etc.), unique identifiers, and repositories.
- Documentation of the provenance of results: provenance for each figure or result, such as the workflow or the provenance record (a sketch of what such a record might contain follows below).
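To make the third criterion concrete, here is a minimal sketch of what a structured provenance record for a single figure might contain. The field names and identifiers are illustrative, not a schema prescribed by the special issue:

<syntaxhighlight lang="python">
# Illustrative only: one way an author might record the provenance of a
# single figure as structured metadata. Field names, DOIs, and URLs are
# hypothetical placeholders, not a required schema.
figure1_provenance = {
    "result": "Figure 1",
    "datasets": [
        {"description": "input time series", "doi": "10.xxxx/example-doi"},
    ],
    "software": [
        {"name": "analysis-scripts", "version": "1.0",
         "repository": "https://example.org/repo"},
    ],
    "workflow": "run preprocess.py, then plot_figure1.py",
}
</syntaxhighlight>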
Tentative Timeline
- Journal committed to special issue: April 15, 2015
- Submissions due to editors: June 30, 2015
- Reviews due: September 15, 2015
- Decisions out to authors: September 30, 2015
- Revisions due: October 31, 2015
- Final versions due: November 15, 2015
- Issue published: December 31, 2015