[docs] add Dataset.assign_coords example (#6336) #6558

Merged: 5 commits merged into pydata:main on May 11, 2022

Conversation

@gregbehm (Contributor) commented on May 2, 2022

@max-sixty (Collaborator) left a comment:

Thanks @gregbehm ! One small comment

(also a self-serving shout-out for https://github.com/max-sixty/pytest-accept if you want to replace the values without copying & pasting them...)

The inline comment below is anchored on this excerpt of the new docstring example (truncated at the review anchor):

```python
>>> np.random.seed(0)
>>> ds = xr.Dataset(
...     data_vars=dict(
...         temperature=(["x", "y", "time"], 15 + 8 * np.random.randn(2, 2, 3)),
```
@max-sixty (Collaborator):

We've generally tried not to use np.random, both because of the seed issue and because it can be difficult for people to track the numbers through the example. What do you think about replacing it with np.arange(12).reshape(2, 2, 3), or something similar?
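
For reference, here is a minimal sketch of what the suggested deterministic variant could look like; the coordinate name and values below are illustrative assumptions, not the exact text that was later merged:

```python
import numpy as np
import xarray as xr

# Deterministic data instead of np.random, per the suggestion above:
# np.arange(12).reshape(2, 2, 3) gives easy-to-follow values 0..11.
ds = xr.Dataset(
    data_vars=dict(
        temperature=(["x", "y", "time"], np.arange(12).reshape(2, 2, 3)),
    ),
)

# The docstring this PR extends is about attaching coordinates;
# "lon" and its values here are made up purely for illustration.
ds = ds.assign_coords(lon=("x", [-99.83, -99.32]))
```

Because the data are a plain range, readers can trace each value through the printed Dataset, which is the point of avoiding np.random.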

@gregbehm (Contributor, author):

Thanks for the suggestions @max-sixty. I borrowed my example from the top-level xarray.Dataset examples, but your idea makes good sense. I'm quite new to xarray and wanted to keep my example similar to others already in the docs. Happy to submit another change if you think that will improve things.

Also thanks for pointing out https://github.com/max-sixty/pytest-accept. I'll take a look!

@max-sixty (Collaborator):

Yes, we've been gradually going through the examples and changing them, but doc PRs are rarer than code PRs (and so very much appreciated!), and we still have quite a few examples left that use random.

If you're up for making the change that would be great. If you don't think you'll get to it, this is still a good improvement. Thanks @gregbehm !

@gregbehm (Contributor, author):

@max-sixty, I think working through the docs and adding examples will be a great way for me to learn xarray, so I'll put the change on my to-do list and try to get to it soon.

One more noob question: how do we connect this PR with the issue that prompted it, #6336, so that it can get closed?

@max-sixty (Collaborator):

That would be great!

It used to be that saying "closes #6336" worked, but I've seen that not work at times. Often someone will just manually close it after merging...

Another collaborator:

> It used to be that saying "closes #6336" worked

This does work, but only if you use exactly that spelling in the markdown. Unfortunately, GitHub displays URLs to issues / PRs the same way, hence the confusion.

If it did work, "closes" becomes a kind of help link (not sure what the official name is), and on the right you'll see a linked issue (in the "Development" section). In general, I think you just have to make sure the issue is linked in that section for it to work.

Greg Behm added 3 commits on May 3, 2022 at 21:32:
* remove np.random uses

* fix one copy-paste artifact 'description:  Temperature data'
@dcherian added the "plan to merge (Final call for comments)" label on May 4, 2022.
@dcherian merged commit f6d0f84 into pydata:main on May 11, 2022.
@dcherian (Contributor):

Thanks @gregbehm . I see this is your first contribution. Welcome to xarray!

@max-sixty (Collaborator):

Thanks @gregbehm !

@gregbehm (Contributor, author):

Thanks for the help @max-sixty and @dcherian. I'm hoping to contribute more, starting with the docs.

@gregbehm deleted the issue-6336-1 branch on May 13, 2022.
@dcherian added a commit to dcherian/xarray that referenced this pull request on May 20, 2022:
* main: (24 commits)
  Fix overflow issue in decode_cf_datetime for dtypes <= np.uint32 (pydata#6598)
  Enable flox in GroupBy and resample (pydata#5734)
  Add setuptools as dependency in ASV benchmark CI (pydata#6609)
  change polyval dim ordering (pydata#6601)
  re-add timedelta support for polyval (pydata#6599)
  Minor Dataset.map docstr clarification (pydata#6595)
  New inline_array kwarg for open_dataset (pydata#6566)
  Fix polyval overloads (pydata#6593)
  Restore old MultiIndex dropping behaviour (pydata#6592)
  [docs] add Dataset.assign_coords example (pydata#6336) (pydata#6558)
  Fix zarr append dtype checks (pydata#6476)
  Add missing space in exception message (pydata#6590)
  Doc Link to accessors list in extending-xarray.rst (pydata#6587)
  Fix Dataset/DataArray.isel with drop=True and scalar DataArray indexes (pydata#6579)
  Add some warnings about rechunking to the docs (pydata#6569)
  [pre-commit.ci] pre-commit autoupdate (pydata#6584)
  terminology.rst: fix link to Unidata's "netcdf_dataset_components" (pydata#6583)
  Allow string formatting of scalar DataArrays (pydata#5981)
  Fix mypy issues & reenable in tests (pydata#6581)
  polyval: Use Horner's algorithm + support chunked inputs (pydata#6548)
  ...
@dcherian added a commit to headtr1ck/xarray that referenced this pull request on May 20, 2022:
commit 398f1b6
Author: dcherian <[email protected]>
Date:   Fri May 20 08:47:56 2022 -0600

    Backward compatibility dask

commit bde40e4
Merge: 0783df3 4cae8d0
Author: dcherian <[email protected]>
Date:   Fri May 20 07:54:48 2022 -0600

    Merge branch 'main' into dask-datetime-to-numeric

    * main:
      concatenate docs style (pydata#6621)
      Typing for open_dataset/array/mfdataset and to_netcdf/zarr (pydata#6612)
      {full,zeros,ones}_like typing (pydata#6611)

commit 0783df3
Merge: 5cff4f1 8de7061
Author: dcherian <[email protected]>
Date:   Sun May 15 21:03:50 2022 -0600

    Merge branch 'main' into dask-datetime-to-numeric

    * main: (24 commits)

commit 5cff4f1
Merge: dfe200d 6144c61
Author: Maximilian Roos <[email protected]>
Date:   Sun May 1 15:16:33 2022 -0700

    Merge branch 'main' into dask-datetime-to-numeric

commit dfe200d
Author: dcherian <[email protected]>
Date:   Sun May 1 11:04:03 2022 -0600

    Minor cleanup

commit 35ed378
Author: dcherian <[email protected]>
Date:   Sun May 1 10:57:36 2022 -0600

    Support dask arrays in datetime_to_numeric
Labels: plan to merge (Final call for comments)
Projects: None yet
Development: Successfully merging this pull request may close these issues: "Documentation talks about Dataset, example is for DataArray" (#6336)
4 participants