This repository has been archived by the owner on Jul 6, 2023. It is now read-only.

Commit

Added dataset shield
Nicholas Geneva committed Dec 2, 2020
1 parent e506bea commit a1e9556
Showing 2 changed files with 2 additions and 2 deletions.
README.md (2 changes: 1 addition & 1 deletion)
@@ -4,7 +4,7 @@ Multi-fidelity Generative Deep Learning Turbulent Flows [[FoDS]()][[ArXiv](https
[Nicholas Geneva](http:https://nicholasgeneva.com/), [Nicholas Zabaras](https://cics.nd.edu)

---
- [![Documentation Status](https://readthedocs.org/projects/deep-turbulence/badge/?version=latest)](https://deep-turbulence.readthedocs.io/en/latest/?badge=latest) ![liscense](https://img.shields.io/github/license/zabaras/deep-turbulence)
+ [![Documentation Status](https://readthedocs.org/projects/deep-turbulence/badge/?version=latest)](https://deep-turbulence.readthedocs.io/en/latest/?badge=latest) ![dataset](https://zenodo.org/badge/DOI/10.5281/zenodo.4298896.svg) ![liscense](https://img.shields.io/github/license/zabaras/deep-turbulence)

A novel multi-fidelity deep generative model is introduced for the surrogate modeling of high-fidelity turbulent flow fields given the solution of a computationally inexpensive but inaccurate low-fidelity solver.

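Side note on the added badge: as committed, the dataset shield is a bare image. Zenodo DOI badges are conventionally wrapped in a link to the DOI resolver so the shield is clickable, e.g. [![dataset](https://zenodo.org/badge/DOI/10.5281/zenodo.4298896.svg)](https://doi.org/10.5281/zenodo.4298896). This is a suggestion, not part of this commit.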
tmglow/args.py (2 changes: 1 addition & 1 deletion)
@@ -103,7 +103,7 @@ class Parser(argparse.ArgumentParser):
:param lr: ADAM optimizer learning rate
:type lr: float
- :note: Use `python main.py --help` for more information. Only serval key of arguments are listed here.
+ :note: Use `python main.py --help` for more information. Only several key of arguments are listed here.
"""
def __init__(self):
super(Parser, self).__init__(description='Read')
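For context, the class touched here subclasses argparse.ArgumentParser, and the docstring points users to `python main.py --help`. Below is a minimal sketch of how such a parser might look and be used; the `--lr` option, its default value, and the `parse` helper are assumptions for illustration, not the repository's actual argument list:

import argparse

class Parser(argparse.ArgumentParser):
    def __init__(self):
        super(Parser, self).__init__(description='Read')
        # Hypothetical option for illustration; the real parser
        # defines many more arguments (see `python main.py --help`).
        self.add_argument('--lr', type=float, default=1e-3,
                          help='ADAM optimizer learning rate')

    def parse(self):
        # Parse command-line arguments (sketch only).
        return self.parse_args()

args = Parser().parse()
print(args.lr)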
