ndcube-1.4.2/CHANGELOG.rst

Ndcube v1.4.2 (2020-11-19)
==========================

Bug Fixes
---------

- Fix bug whereby the common axis was not updated appropriately when slicing an
  NDCubeSequence. (#310)

Ndcube v1.4.1 (2020-11-03)
==========================

Features
--------

- Add new properties to NDCubeSequence, ``array_axis_physical_types`` and
  ``cube_like_array_axis_physical_types``, and deprecate the
  ``world_axis_physical_types`` properties of NDCube and NDCubeSequence. (#309)

Ndcube v1.4.0 (2020-11-02)
==========================

Features
--------

- Add new method, `~ndcube.NDCube.axis_world_coord_values`, to return world
  coords for all pixels for all axes in the WCS as quantity objects. (#283)

Bug Fixes
---------

- Fix NDCube plotting bug when dependent axes are not the first axes. (#278)
- Change the name of ``NDCube.axis_world_coord_values`` to
  ``NDCube.axis_world_coords_values`` to be consistent with
  ``NDCube.axis_world_coords``. (#293)
- Move the ImageAnimatorWCS class into ndcube from sunpy, as it is no longer
  supported from sunpy 2.1 onwards. (#306)
- Fix bug in setting y-axis limits for 1D animations when inf or nan is
  present in the data. (#308)

Ndcube v1.3.2 (2020-04-20)
==========================

Bug Fixes
---------

- Generalize int type checking so it is independent of the bit-type of the
  OS. (#269)

Ndcube v1.3.1 (2020-04-17)
==========================

Bug Fixes
---------

- Fix ``NDCollection.aligned_dimensions`` so it doesn't crash when components
  of the collection are NDCubeSequences. (#264)

Trivial/Internal Changes
------------------------

- Simplify and speed up the implementation of NDCubeSequence slicing. (#251)

Ndcube v1.3.0 (2020-03-27)
==========================

Features
--------

- Add new NDCollection class for linking and manipulating partially or
  non-aligned NDCubes or NDCubeSequences. (#238)

Bug Fixes
---------

- Fixed the files included in and excluded from the tarball. (#212)
- Fix crashing bug when an NDCube axis after the first is sliced with a
  numpy.int64. (#223)
- Raise an error if an NDCube is sliced with an Ellipsis. (#224)
- Change the behavior of NDCubeSequence slicing. Previously, a slice item of
  interval length 1 would cause an NDCube object to be returned. Now an
  NDCubeSequence made up of 1 NDCube is returned. This is consistent with how
  interval-length-1 slice items slice arrays. (#241)

Ndcube v1.2.0 (2019-09-10)
==========================

Features
--------

- Changed all instances of "missing_axis" to "missing_axes". (#157)
- Added a feature to get the pixel_edges from
  `ndcube.NDCube.axis_world_coords`. (#174)

Bug Fixes
---------

- `ndcube.NDCube.world_axis_physical_types` now sets the axis label to the WCS
  CTYPE if no corresponding IVOA name can be found. (#164)
- Fixed the bug of using ``pixel_edges`` instead of ``pixel_values`` in
  plotting. (#176)
- Fix 2D plotting from crashing when both data and WCS are 2D. (#182)
- Fix the ability to pass a custom Axes to `ndcube.NDCube.plot` for a 2D
  cube. (#204)

Trivial/Internal Changes
------------------------

- Include a more helpful error when an invalid item type is used to slice an
  `~ndcube.NDCube`. (#158)

1.1
===

API-Breaking Changes
--------------------

- The `~ndcube.NDCubeBase.crop_by_extra_coord` API has been broken and
  replaced. Old version:
  ``crop_by_extra_coord(min_coord_value, interval_width, coord_name)``.
  New version:
  ``crop_by_extra_coord(coord_name, min_coord_value, max_coord_value)``. [#142]

New Features
------------

- Created a new `~ndcube.NDCubeBase` which has all the functionality of
  `~ndcube.NDCube` except the plotting. The old ``NDCubeBase``, which outlined
  the ``NDCube`` API, was renamed ``NDCubeABC``. `~ndcube.NDCube` has all the
  same functionality as before, except it now simply inherits from
  `~ndcube.NDCubeBase` and `~ndcube.mixins.plotting.NDCubePlotMixin`. [#101]
- Moved NDCubeSequence plotting to a new mixin class, NDCubeSequencePlotMixin,
  making the plotting an optional extra. All the non-plotting functionality
  now lives in the NDCubeSequenceBase class. [#98]
- Created a new `~ndcube.NDCubeBase.explode_along_axis` method that breaks an
  NDCube out into an NDCubeSequence along a chosen axis. It is equivalent to
  `~ndcube.NDCubeSequenceBase.explode_along_axis`. [#118]
- The NDCubeSequence plot mixin can now animate a cube as a 1-D line if a
  single axis number is supplied to the plot_axis_indices kwarg.

API Changes
-----------

- Replaced the API of what was previously ``utils.wcs.get_dependent_axes``
  with two new functions, ``utils.wcs.get_dependent_data_axes`` and
  ``utils.wcs.get_dependent_wcs_axes``. This was inspired by a new
  implementation in ``glue-viz`` which is intended to be merged into
  ``astropy`` in the future. This API change helped fix the
  ``NDCube.world_axis_physical_type`` bug listed below. [#80]
- Give users more control in plotting, both for NDCubePlotMixin and
  NDCubeSequencePlotMixin. In most cases the axes coordinates, axes units, and
  data unit can be supplied manually, or via supplying the name of an extra
  coordinate if it is wanted to describe an axis. In the case of NDCube, the
  old API is currently still supported but will be removed in future
  versions. [#98 #103]

Bug Fixes
---------

- Allowed `~ndcube.NDCubeBase.axis_world_coords` to accept negative axis
  indices as arguments. [#106]
- Fixed a bug in ``NDCube.crop_by_coords`` in the case where the real world
  coordinate system was rotated relative to the pixel grid. [#113]
- `~ndcube.NDCubeBase.world_axis_physical_types` is now not case-sensitive to
  the CTYPE values in the WCS. [#109]
- `~ndcube.NDCubeBase.plot` now generates a 1-D line animation when image_axis
  is an integer.

1.0.1
=====

New Features
------------

- Added installation instructions to docs. [#77]

Bug Fixes
---------

- Fixed bugs in ``NDCubeSequence`` slicing and ``NDCubeSequence.dimensions``
  in cases where sub-cubes contain scalar ``.data``. [#79]
- Fixed ``NDCube.world_axis_physical_types`` in cases where there is a
  ``missing`` WCS axis. [#80]
- Fixed bugs in converting between negative data and WCS axis numbers. [#91]
- Fix the function name called within the NDCubeSequence.plot animation update
  plot. [#95]
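
A minimal sketch of the NDCubeSequence slicing behaviour described in the
v1.3.0 entry above (#241); the array shapes and CTYPE values below are
arbitrary illustrations, not taken from the changelog::

    import numpy as np
    import astropy.wcs
    from ndcube import NDCube, NDCubeSequence

    # Build a trivial 2D WCS and two small cubes.
    wcs = astropy.wcs.WCS(naxis=2)
    wcs.wcs.ctype = ["HPLT-TAN", "HPLN-TAN"]

    cubes = [NDCube(np.zeros((3, 4)), wcs) for _ in range(2)]
    seq = NDCubeSequence(cubes)

    # An interval-length-1 slice returns a sequence of one cube (post-#241),
    # while an integer index returns a single NDCube.
    print(type(seq[0:1]))  # NDCubeSequence
    print(type(seq[0]))    # NDCube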
ndcube-1.4.2/LICENSE.rst

Copyright (c) 2017-2018, SunPy Developers
All rights reserved.

Redistribution and use in source and binary forms, with or without
modification, are permitted provided that the following conditions are met:

* Redistributions of source code must retain the above copyright notice, this
  list of conditions and the following disclaimer.

* Redistributions in binary form must reproduce the above copyright notice,
  this list of conditions and the following disclaimer in the documentation
  and/or other materials provided with the distribution.

THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS" AND
ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED
WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE
DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT HOLDER OR CONTRIBUTORS BE LIABLE
FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL
DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR
SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER
CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY,
OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.

ndcube-1.4.2/PKG-INFO

Metadata-Version: 2.1
Name: ndcube
Version: 1.4.2
Summary: A package for multi-dimensional contiguous and non-contiguous coordinate-aware arrays.
Home-page: http://docs.sunpy.org/projects/ndcube/
Author: The SunPy Community
Author-email: sunpy@googlegroups.com
License: BSD 2-Clause
Platform: UNKNOWN
Provides-Extra: tests
Provides-Extra: docs
Provides-Extra: dev
Provides-Extra: all

ndcube-1.4.2/README.rst

ndcube
======

.. image:: http://img.shields.io/badge/powered%20by-SunPy-orange.svg?style=flat
    :target: http://www.sunpy.org
    :alt: Powered by SunPy Badge

ndcube is an open-source SunPy affiliated package for manipulating, inspecting
and visualizing multi-dimensional contiguous and non-contiguous
coordinate-aware data arrays. It combines data, uncertainties, units, metadata,
masking, and coordinate transformations into classes with unified slicing and
generic coordinate transformations and plotting/animation capabilities. It is
designed to handle data of any number of dimensions and axis types (e.g.
spatial, temporal, spectral, etc.) whose relationship between the array
elements and the real world can be described by World Coordinate System (WCS)
translations. See the `ndcube docs`_ for more details.

Installation
------------

ndcube requires Python 3.5+, SunPy 0.8+, astropy and matplotlib.

Stable Version
##############

There are two options for installing the stable version of ndcube.
The first is via the anaconda distribution using the conda-forge channel::

    $ conda install --channel conda-forge ndcube

For more information on installing the anaconda distribution, see the
`anaconda website`_.

To update ndcube do::

    $ conda update ndcube

The second option for installing the stable version of ndcube is via pip::

    $ pip install ndcube

Then to update ndcube do::

    $ pip install ndcube --upgrade

Development Version
###################

The stable version of ndcube will be relatively reliable. However, if you
value getting the latest updates immediately over reliability, or want to
contribute to the development of ndcube, you will need to install the bleeding
edge version via github.

The recommended way to set up your system is to first fork the
`ndcube github repository`_ to your github account and then clone your forked
repo to your local machine. This setup has the advantage of being able to push
any changes you make in the local version to your github account. From there
you can issue pull requests to have your changes merged into the main repo and
thus shared with other users. You can also set up a remote between your local
version and the main repo so that you can stay updated with the latest changes
to ndcube. Let's step through how to do this.

Once you've forked the main `ndcube github repository`_ to your github
account, create a conda environment on your local machine to hold the ndcube
bleeding edge version and activate that environment. Type the following into
a terminal::

    $ conda config --append channels conda-forge
    $ conda create -n ndcube-dev python sunpy hypothesis pytest-mock
    $ source activate ndcube-dev

Next clone the ndcube repo from your github account to a new directory. Let's
call it ``ndcube-git``::

    $ git clone https://github.com/your-github-name/ndcube.git ndcube-git

To install, change into the new directory and run the install script::

    $ cd ndcube-git
    $ pip install -e .

Finally, add a remote to the main repo so you can pull the latest version::

    $ git remote add upstream https://github.com/sunpy/ndcube.git

Then, to ensure you stay up-to-date with the latest version of ndcube,
regularly do::

    $ git pull upstream master

Push any changes you make to your github account by doing::

    $ git push origin branch-name

where ``branch-name`` is the name of the branch you're working on. Then from
your github account you can request your changes to be merged into the main
repo. For more information on git version control, github, and issuing pull
requests, see `SunPy's version control guide`_.

Getting Help
------------

As a SunPy-affiliated package, ndcube relies on the SunPy support
infrastructure. To pose questions to ndcube and SunPy developers and to get
announcements regarding ndcube and SunPy in general, sign up to the

- `SunPy Mailing List`_

To get quicker feedback and chat directly to ndcube and SunPy developers,
check out the

- `SunPy Matrix Channel`_.

Contributing
------------

If you would like to get involved, start by joining the `SunPy mailing list`_
and check out the `Developer’s Guide`_ section of the SunPy docs. Stop by our
chat room `#sunpy:matrix.org`_ if you have any questions. Help is always
welcome, so let us know what you'd like to work on, or check out the
`issues page`_ for the list of known outstanding items.

For more information on contributing to ndcube or the SunPy organization,
please read the SunPy `contributing guide`_.

**Imposter syndrome disclaimer**: We want your help. No, really.
There may be a little voice inside your head that is telling you that you're
not ready to be an open source contributor; that your skills aren't nearly
good enough to contribute. What could you possibly offer a project like this
one?

We assure you - the little voice in your head is wrong. If you can write code
at all, you can contribute code to open source. Contributing to open source
projects is a fantastic way to advance one's coding skills. Writing perfect
code isn't the measure of a good developer (that would disqualify all of us!);
it's trying to create something, making mistakes, and learning from those
mistakes. That's how we all improve, and we are happy to help others learn.

Being an open source contributor doesn't just mean writing code, either. You
can help out by writing documentation, tests, or even giving feedback about
the project (and yes - that includes giving feedback about the contribution
process). Some of these contributions may be the most valuable to the project
as a whole, because you're coming to the project with fresh eyes, so you can
see the errors and assumptions that seasoned contributors have glossed over.

Note: This disclaimer was originally written by Adrienne Lowe for a PyCon
talk, and was adapted by ndcube based on its use in the README file for the
MetPy project.

Code of Conduct
---------------

When you are interacting with the SunPy community you are asked to follow our
`Code of Conduct`_.

License
-------

This project is Copyright (c) SunPy Developers and licensed under the terms of
the BSD 2-Clause license. See the licenses folder for more information.

.. _ndcube docs: http://docs.sunpy.org/projects/ndcube/
.. _installation guide: http://docs.sunpy.org/en/stable/guide/installation/index.html
.. _SunPy Matrix Channel: https://riot.im/app/#/room/#sunpy:matrix.org
.. _`#sunpy:matrix.org`: https://riot.im/app/#/room/#sunpy:matrix.org
.. _SunPy mailing list: https://groups.google.com/forum/#!forum/sunpy
.. _Developer’s Guide: http://docs.sunpy.org/en/latest/dev_guide/index.html
.. _issues page: https://github.com/sunpy/ndcube/issues
.. _contributing guide: http://docs.sunpy.org/en/stable/dev_guide/newcomers.html#newcomers
.. _Code of Conduct: http://docs.sunpy.org/en/stable/coc.html
.. _anaconda website: https://docs.anaconda.com/anaconda/install.html
.. _`ndcube github repository`: https://github.com/sunpy/ndcube
.. _`SunPy's version control guide`: http://docs.sunpy.org/en/stable/dev_guide/version_control.html

ndcube-1.4.2/RELEASE.rst

The SunPy project is very happy to announce a new release of "ndcube".

ndcube is a package built for handling, inspecting and visualizing a wide
variety of data, of any number of dimensions, along with coordinate
information described by WCS. The `~ndcube.NDCube` class provides
functionality for slicing the data, WCS and other metadata simultaneously,
plotting and animating, and associating extra coordinate information along any
axis. In addition, the ndcube package provides the `~ndcube.NDCubeSequence`
class for representing sequences of `~ndcube.NDCube` objects where the
coordinate information may or may not align, and for accessing these sequences
in a way consistent with a singular cube.
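
For illustration, a minimal sketch of this behaviour (the array shape and WCS
CTYPE values below are arbitrary, not taken from the release notes)::

    import numpy as np
    import astropy.wcs
    from ndcube import NDCube

    # A 3D cube; all WCS keyword values here are illustrative only.
    wcs = astropy.wcs.WCS(naxis=3)
    wcs.wcs.ctype = ["WAVE", "HPLT-TAN", "HPLN-TAN"]

    cube = NDCube(np.arange(120.).reshape(4, 5, 6), wcs)

    # Slicing operates on the data, WCS and other metadata together.
    smaller = cube[0, 1:3]
    print(smaller.dimensions)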
This release of ndcube contains 95 commits in 12 merged pull requests closing
5 issues from 5 people, 2 of whom are first-time contributors to ndcube.

The people who have contributed to the code for this release are:

    Daniel Ryan
    David Stansby  *
    Nabil Freij
    P. L. Lim  *
    Stuart Mumford

Where a * indicates their first contribution to ndcube.

For more information about ndcube, see the
`documentation <http://docs.sunpy.org/projects/ndcube/>`_.

ndcube can be installed from pip or conda using the following commands::

    $ conda install -c conda-forge ndcube
    $ pip install ndcube

ndcube-1.4.2/ah_bootstrap.py

"""
This bootstrap module contains code for ensuring that the astropy_helpers
package will be importable by the time the setup.py script runs.  It also
includes some workarounds to ensure that a recent-enough version of setuptools
is being used for the installation.

This module should be the first thing imported in the setup.py of
distributions that make use of the utilities in astropy_helpers.  If the
distribution ships with its own copy of astropy_helpers, this module will
first attempt to import from the shipped copy.  However, it will also check
PyPI to see if there are any bug-fix releases on top of the current version
that may be useful to get past platform-specific bugs that have been fixed.
When running setup.py, use the ``--offline`` command-line option to disable
the auto-upgrade checks.

When this module is imported or otherwise executed it automatically calls a
main function that attempts to read the project's setup.cfg file, which it
checks for a configuration section called ``[ah_bootstrap]``.  The presence of
that section, and the options therein, determine the next step taken:  If it
contains an option called ``auto_use`` with a value of ``True``, it will
automatically call the main function of this module called
`use_astropy_helpers` (see that function's docstring for full details).
Otherwise no further action is taken, and by default the system-installed
version of astropy-helpers will be used (however,
``ah_bootstrap.use_astropy_helpers`` may be called manually from within the
setup.py script).

This behavior can also be controlled using the ``--auto-use`` and
``--no-auto-use`` command-line flags. For clarity, an alias for
``--no-auto-use`` is ``--use-system-astropy-helpers``, and we recommend using
the latter if needed.

Additional options in the ``[ah_bootstrap]`` section of setup.cfg have the
same names as the arguments to `use_astropy_helpers`, and can be used to
configure the bootstrap script when ``auto_use = True``.

See https://github.com/astropy/astropy-helpers for more details, and for the
latest version of this module.
"""

import contextlib
import errno
import io
import locale
import os
import re
import subprocess as sp
import sys

from distutils import log
from distutils.debug import DEBUG

from configparser import ConfigParser, RawConfigParser

import pkg_resources

from setuptools import Distribution
from setuptools.package_index import PackageIndex

# This is the minimum Python version required for astropy-helpers
__minimum_python_version__ = (3, 5)

# TODO: Maybe enable checking for a specific version of astropy_helpers?
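# For reference, a minimal ``[ah_bootstrap]`` section in a project's setup.cfg
# that triggers the automatic bootstrapping described in the module docstring
# above.  This is a sketch: the option values shown are illustrative, and the
# option names mirror CFG_OPTIONS below.
#
#     [ah_bootstrap]
#     auto_use = True
#     path = astropy_helpers
#     auto_upgrade = True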
DIST_NAME = 'astropy-helpers'
PACKAGE_NAME = 'astropy_helpers'
UPPER_VERSION_EXCLUSIVE = None

# Defaults for other options
DOWNLOAD_IF_NEEDED = True
INDEX_URL = 'https://pypi.python.org/simple'
USE_GIT = True
OFFLINE = False
AUTO_UPGRADE = True

# A list of all the configuration options and their required types
CFG_OPTIONS = [
    ('auto_use', bool), ('path', str), ('download_if_needed', bool),
    ('index_url', str), ('use_git', bool), ('offline', bool),
    ('auto_upgrade', bool)
]

# Start off by parsing the setup.cfg file
SETUP_CFG = ConfigParser()

if os.path.exists('setup.cfg'):

    try:
        SETUP_CFG.read('setup.cfg')
    except Exception as e:
        if DEBUG:
            raise

        log.error(
            "Error reading setup.cfg: {0!r}\n{1} will not be "
            "automatically bootstrapped and package installation may fail."
            "\n{2}".format(e, PACKAGE_NAME, _err_help_msg))

# We used package_name in the package template for a while instead of name
if SETUP_CFG.has_option('metadata', 'name'):
    parent_package = SETUP_CFG.get('metadata', 'name')
elif SETUP_CFG.has_option('metadata', 'package_name'):
    parent_package = SETUP_CFG.get('metadata', 'package_name')
else:
    parent_package = None

if SETUP_CFG.has_option('options', 'python_requires'):

    python_requires = SETUP_CFG.get('options', 'python_requires')

    # The python_requires key has a syntax that can be parsed by SpecifierSet
    # in the packaging package. However, we don't want to have to depend on
    # that package, so instead we can use setuptools (which bundles packaging).
    # We have to add 'python' to parse it with Requirement.

    from pkg_resources import Requirement
    req = Requirement.parse('python' + python_requires)

    # We want the Python version as a string, which we can get from the
    # platform module
    import platform
    # strip off trailing '+' in case this is a dev install of python
    python_version = platform.python_version().strip('+')

    # allow pre-releases to count as 'new enough'
    if not req.specifier.contains(python_version, True):
        if parent_package is None:
            message = "ERROR: Python {} is required by this package\n".format(req.specifier)
        else:
            message = "ERROR: Python {} is required by {}\n".format(req.specifier, parent_package)
        sys.stderr.write(message)
        sys.exit(1)

if sys.version_info < __minimum_python_version__:

    if parent_package is None:
        message = "ERROR: Python {} or later is required by astropy-helpers\n".format(
            __minimum_python_version__)
    else:
        message = "ERROR: Python {} or later is required by astropy-helpers for {}\n".format(
            __minimum_python_version__, parent_package)

    sys.stderr.write(message)
    sys.exit(1)

_str_types = (str, bytes)


# What follows are several import statements meant to deal with install-time
# issues with either missing or misbehaving packages (including making sure
# setuptools itself is installed):

# Check that setuptools 30.3 or later is present
from distutils.version import LooseVersion

try:
    import setuptools
    assert LooseVersion(setuptools.__version__) >= LooseVersion('30.3')
except (ImportError, AssertionError):
    sys.stderr.write("ERROR: setuptools 30.3 or later is required by astropy-helpers\n")
    sys.exit(1)

# typing as a dependency for 1.6.1+ Sphinx causes issues when imported after
# initializing submodule with ah_bootstrap.py
# See discussion and references in
# https://github.com/astropy/astropy-helpers/issues/302
try:
    import typing   # noqa
except ImportError:
    pass


# Note: The following import is required as a workaround to
# https://github.com/astropy/astropy-helpers/issues/89; if we don't import
# this module now, it will get cleaned up after `run_setup` is called, but
# that will later cause the TemporaryDirectory class defined in it to stop
# working when used later on by setuptools
try:
    import setuptools.py31compat   # noqa
except ImportError:
    pass


# matplotlib can cause problems if it is imported from within a call of
# run_setup(), because in some circumstances it will try to write to the
# user's home directory, resulting in a SandboxViolation.  See
# https://github.com/matplotlib/matplotlib/pull/4165
# Making sure matplotlib, if it is available, is imported early in the setup
# process can mitigate this (note importing matplotlib.pyplot has the same
# issue)
try:
    import matplotlib
    matplotlib.use('Agg')
    import matplotlib.pyplot
except:
    # Ignore if this fails for *any* reason
    pass

# End compatibility imports...


class _Bootstrapper(object):
    """
    Bootstrapper implementation.  See ``use_astropy_helpers`` for parameter
    documentation.
    """

    def __init__(self, path=None, index_url=None, use_git=None, offline=None,
                 download_if_needed=None, auto_upgrade=None):

        if path is None:
            path = PACKAGE_NAME

        if not (isinstance(path, _str_types) or path is False):
            raise TypeError('path must be a string or False')

        if not isinstance(path, str):
            fs_encoding = sys.getfilesystemencoding()
            path = path.decode(fs_encoding)  # path to unicode

        self.path = path

        # Set other option attributes, using defaults where necessary
        self.index_url = index_url if index_url is not None else INDEX_URL
        self.offline = offline if offline is not None else OFFLINE

        # If offline=True, override download and auto-upgrade
        if self.offline:
            download_if_needed = False
            auto_upgrade = False

        self.download = (download_if_needed
                         if download_if_needed is not None
                         else DOWNLOAD_IF_NEEDED)
        self.auto_upgrade = (auto_upgrade
                             if auto_upgrade is not None else AUTO_UPGRADE)

        # If this is a release then the .git directory will not exist so we
        # should not use git.
        git_dir_exists = os.path.exists(os.path.join(os.path.dirname(__file__), '.git'))
        if use_git is None and not git_dir_exists:
            use_git = False

        self.use_git = use_git if use_git is not None else USE_GIT
        # Declared as False by default--later we check if astropy-helpers can
        # be upgraded from PyPI, but only if not using a source distribution
        # (as in the case of import from a git submodule)
        self.is_submodule = False

    @classmethod
    def main(cls, argv=None):
        if argv is None:
            argv = sys.argv

        config = cls.parse_config()
        config.update(cls.parse_command_line(argv))

        auto_use = config.pop('auto_use', False)
        bootstrapper = cls(**config)

        if auto_use:
            # Run the bootstrapper, otherwise the setup.py is using the old
            # use_astropy_helpers() interface, in which case it will run the
            # bootstrapper manually after reconfiguring it.
            bootstrapper.run()

        return bootstrapper

    @classmethod
    def parse_config(cls):
        if not SETUP_CFG.has_section('ah_bootstrap'):
            return {}

        config = {}

        for option, type_ in CFG_OPTIONS:
            if not SETUP_CFG.has_option('ah_bootstrap', option):
                continue

            if type_ is bool:
                value = SETUP_CFG.getboolean('ah_bootstrap', option)
            else:
                value = SETUP_CFG.get('ah_bootstrap', option)

            config[option] = value

        return config

    @classmethod
    def parse_command_line(cls, argv=None):
        if argv is None:
            argv = sys.argv

        config = {}

        # For now we just pop recognized ah_bootstrap options out of the
        # arg list.  This is imperfect; in the unlikely case that a setup.py
        # custom command or even custom Distribution class defines an argument
        # of the same name then we will break that.
        # However, there's a catch-22 here: we can't just do full argument
        # parsing right here, because we don't yet know *how* to parse all
        # possible command-line arguments.
        if '--no-git' in argv:
            config['use_git'] = False
            argv.remove('--no-git')

        if '--offline' in argv:
            config['offline'] = True
            argv.remove('--offline')

        if '--auto-use' in argv:
            config['auto_use'] = True
            argv.remove('--auto-use')

        if '--no-auto-use' in argv:
            config['auto_use'] = False
            argv.remove('--no-auto-use')

        if '--use-system-astropy-helpers' in argv:
            config['auto_use'] = False
            argv.remove('--use-system-astropy-helpers')

        return config

    def run(self):
        strategies = ['local_directory', 'local_file', 'index']
        dist = None

        # First, remove any previously imported versions of astropy_helpers;
        # this is necessary for nested installs where one package's installer
        # is installing another package via setuptools.sandbox.run_setup, as
        # in the case of setup_requires
        for key in list(sys.modules):
            try:
                if key == PACKAGE_NAME or key.startswith(PACKAGE_NAME + '.'):
                    del sys.modules[key]
            except AttributeError:
                # Sometimes mysterious non-string things can turn up in
                # sys.modules
                continue

        # Check to see if the path is a submodule
        self.is_submodule = self._check_submodule()

        for strategy in strategies:
            method = getattr(self, 'get_{0}_dist'.format(strategy))
            dist = method()
            if dist is not None:
                break
        else:
            raise _AHBootstrapSystemExit(
                "No source found for the {0!r} package; {0} must be "
                "available and importable as a prerequisite to building "
                "or installing this package.".format(PACKAGE_NAME))

        # This is a bit hacky, but if astropy_helpers was loaded from a
        # directory/submodule its Distribution object gets a "precedence" of
        # "DEVELOP_DIST".  However, in other cases it gets a precedence of
        # "EGG_DIST".  However, when activating the distribution it will only
        # be placed early on sys.path if it is treated as an EGG_DIST, so
        # always do that
        dist = dist.clone(precedence=pkg_resources.EGG_DIST)

        # Otherwise we found a version of astropy-helpers, so we're done
        # Just activate the found distribution on sys.path--if we did a
        # download this usually happens automatically but it doesn't hurt to
        # do it again
        # Note: Adding the dist to the global working set also activates it
        # (makes it importable on sys.path) by default.
        try:
            pkg_resources.working_set.add(dist, replace=True)
        except TypeError:
            # Some (much) older versions of setuptools do not have the
            # replace=True option here.  These versions are old enough that
            # all bets may be off anyways, but it's easy enough to work around
            # just in case...
            if dist.key in pkg_resources.working_set.by_key:
                del pkg_resources.working_set.by_key[dist.key]
            pkg_resources.working_set.add(dist)

    @property
    def config(self):
        """
        A `dict` containing the options this `_Bootstrapper` was configured
        with.
        """

        return dict((optname, getattr(self, optname))
                    for optname, _ in CFG_OPTIONS if hasattr(self, optname))

    def get_local_directory_dist(self):
        """
        Handle importing a vendored package from a subdirectory of the source
        distribution.
""" if not os.path.isdir(self.path): return log.info('Attempting to import astropy_helpers from {0} {1!r}'.format( 'submodule' if self.is_submodule else 'directory', self.path)) dist = self._directory_import() if dist is None: log.warn( 'The requested path {0!r} for importing {1} does not ' 'exist, or does not contain a copy of the {1} ' 'package.'.format(self.path, PACKAGE_NAME)) elif self.auto_upgrade and not self.is_submodule: # A version of astropy-helpers was found on the available path, but # check to see if a bugfix release is available on PyPI upgrade = self._do_upgrade(dist) if upgrade is not None: dist = upgrade return dist def get_local_file_dist(self): """ Handle importing from a source archive; this also uses setup_requires but points easy_install directly to the source archive. """ if not os.path.isfile(self.path): return log.info('Attempting to unpack and import astropy_helpers from ' '{0!r}'.format(self.path)) try: dist = self._do_download(find_links=[self.path]) except Exception as e: if DEBUG: raise log.warn( 'Failed to import {0} from the specified archive {1!r}: ' '{2}'.format(PACKAGE_NAME, self.path, str(e))) dist = None if dist is not None and self.auto_upgrade: # A version of astropy-helpers was found on the available path, but # check to see if a bugfix release is available on PyPI upgrade = self._do_upgrade(dist) if upgrade is not None: dist = upgrade return dist def get_index_dist(self): if not self.download: log.warn('Downloading {0!r} disabled.'.format(DIST_NAME)) return None log.warn( "Downloading {0!r}; run setup.py with the --offline option to " "force offline installation.".format(DIST_NAME)) try: dist = self._do_download() except Exception as e: if DEBUG: raise log.warn( 'Failed to download and/or install {0!r} from {1!r}:\n' '{2}'.format(DIST_NAME, self.index_url, str(e))) dist = None # No need to run auto-upgrade here since we've already presumably # gotten the most up-to-date version from the package index return dist def _directory_import(self): """ Import astropy_helpers from the given path, which will be added to sys.path. Must return True if the import succeeded, and False otherwise. """ # Return True on success, False on failure but download is allowed, and # otherwise raise SystemExit path = os.path.abspath(self.path) # Use an empty WorkingSet rather than the man # pkg_resources.working_set, since on older versions of setuptools this # will invoke a VersionConflict when trying to install an upgrade ws = pkg_resources.WorkingSet([]) ws.add_entry(path) dist = ws.by_key.get(DIST_NAME) if dist is None: # We didn't find an egg-info/dist-info in the given path, but if a # setup.py exists we can generate it setup_py = os.path.join(path, 'setup.py') if os.path.isfile(setup_py): # We use subprocess instead of run_setup from setuptools to # avoid segmentation faults - see the following for more details: # https://github.com/cython/cython/issues/2104 sp.check_output([sys.executable, 'setup.py', 'egg_info'], cwd=path) for dist in pkg_resources.find_distributions(path, True): # There should be only one... 
                    return dist

        return dist

    def _do_download(self, version='', find_links=None):
        if find_links:
            allow_hosts = ''
            index_url = None
        else:
            allow_hosts = None
            index_url = self.index_url

        # Annoyingly, setuptools will not handle other arguments to
        # Distribution (such as options) before handling setup_requires, so it
        # is not straightforward to programmatically augment the arguments
        # which are passed to easy_install
        class _Distribution(Distribution):
            def get_option_dict(self, command_name):
                opts = Distribution.get_option_dict(self, command_name)
                if command_name == 'easy_install':
                    if find_links is not None:
                        opts['find_links'] = ('setup script', find_links)
                    if index_url is not None:
                        opts['index_url'] = ('setup script', index_url)
                    if allow_hosts is not None:
                        opts['allow_hosts'] = ('setup script', allow_hosts)
                return opts

        if version:
            req = '{0}=={1}'.format(DIST_NAME, version)
        else:
            if UPPER_VERSION_EXCLUSIVE is None:
                req = DIST_NAME
            else:
                req = '{0}<{1}'.format(DIST_NAME, UPPER_VERSION_EXCLUSIVE)

        attrs = {'setup_requires': [req]}

        # NOTE: we need to parse the config file (e.g. setup.cfg) to make sure
        # it honours the options set in the [easy_install] section, and we
        # need to explicitly fetch the requirement eggs as setup_requires does
        # not get honored in recent versions of setuptools:
        # https://github.com/pypa/setuptools/issues/1273
        try:
            context = _verbose if DEBUG else _silence
            with context():
                dist = _Distribution(attrs=attrs)
                try:
                    dist.parse_config_files(ignore_option_errors=True)
                    dist.fetch_build_eggs(req)
                except TypeError:
                    # On older versions of setuptools, ignore_option_errors
                    # doesn't exist, and the above two lines are not needed
                    # so we can just continue
                    pass

            # If the setup_requires succeeded it will have added the new dist
            # to the main working_set
            return pkg_resources.working_set.by_key.get(DIST_NAME)
        except Exception as e:
            if DEBUG:
                raise

            msg = 'Error retrieving {0} from {1}:\n{2}'
            if find_links:
                source = find_links[0]
            elif index_url != INDEX_URL:
                source = index_url
            else:
                source = 'PyPI'

            raise Exception(msg.format(DIST_NAME, source, repr(e)))

    def _do_upgrade(self, dist):
        # Build up a requirement for a higher bugfix release but a lower minor
        # release (so API compatibility is guaranteed)
        next_version = _next_version(dist.parsed_version)

        req = pkg_resources.Requirement.parse(
            '{0}>{1},<{2}'.format(DIST_NAME, dist.version, next_version))

        package_index = PackageIndex(index_url=self.index_url)

        upgrade = package_index.obtain(req)

        if upgrade is not None:
            return self._do_download(version=upgrade.version)

    def _check_submodule(self):
        """
        Check if the given path is a git submodule.

        See the docstrings for ``_check_submodule_using_git`` and
        ``_check_submodule_no_git`` for further details.
        """

        if (self.path is None or
                (os.path.exists(self.path) and not os.path.isdir(self.path))):
            return False

        if self.use_git:
            return self._check_submodule_using_git()
        else:
            return self._check_submodule_no_git()

    def _check_submodule_using_git(self):
        """
        Check if the given path is a git submodule.  If so, attempt to
        initialize and/or update the submodule if needed.

        This function makes calls to the ``git`` command in subprocesses.  The
        ``_check_submodule_no_git`` option uses pure Python to check if the
        given path looks like a git submodule, but it cannot perform updates.
""" cmd = ['git', 'submodule', 'status', '--', self.path] try: log.info('Running `{0}`; use the --no-git option to disable git ' 'commands'.format(' '.join(cmd))) returncode, stdout, stderr = run_cmd(cmd) except _CommandNotFound: # The git command simply wasn't found; this is most likely the # case on user systems that don't have git and are simply # trying to install the package from PyPI or a source # distribution. Silently ignore this case and simply don't try # to use submodules return False stderr = stderr.strip() if returncode != 0 and stderr: # Unfortunately the return code alone cannot be relied on, as # earlier versions of git returned 0 even if the requested submodule # does not exist # This is a warning that occurs in perl (from running git submodule) # which only occurs with a malformatted locale setting which can # happen sometimes on OSX. See again # https://github.com/astropy/astropy/issues/2749 perl_warning = ('perl: warning: Falling back to the standard locale ' '("C").') if not stderr.strip().endswith(perl_warning): # Some other unknown error condition occurred log.warn('git submodule command failed ' 'unexpectedly:\n{0}'.format(stderr)) return False # Output of `git submodule status` is as follows: # # 1: Status indicator: '-' for submodule is uninitialized, '+' if # submodule is initialized but is not at the commit currently indicated # in .gitmodules (and thus needs to be updated), or 'U' if the # submodule is in an unstable state (i.e. has merge conflicts) # # 2. SHA-1 hash of the current commit of the submodule (we don't really # need this information but it's useful for checking that the output is # correct) # # 3. The output of `git describe` for the submodule's current commit # hash (this includes for example what branches the commit is on) but # only if the submodule is initialized. We ignore this information for # now _git_submodule_status_re = re.compile( r'^(?P[+-U ])(?P[0-9a-f]{40}) ' r'(?P\S+)( .*)?$') # The stdout should only contain one line--the status of the # requested submodule m = _git_submodule_status_re.match(stdout) if m: # Yes, the path *is* a git submodule self._update_submodule(m.group('submodule'), m.group('status')) return True else: log.warn( 'Unexpected output from `git submodule status`:\n{0}\n' 'Will attempt import from {1!r} regardless.'.format( stdout, self.path)) return False def _check_submodule_no_git(self): """ Like ``_check_submodule_using_git``, but simply parses the .gitmodules file to determine if the supplied path is a git submodule, and does not exec any subprocesses. This can only determine if a path is a submodule--it does not perform updates, etc. This function may need to be updated if the format of the .gitmodules file is changed between git versions. """ gitmodules_path = os.path.abspath('.gitmodules') if not os.path.isfile(gitmodules_path): return False # This is a minimal reader for gitconfig-style files. It handles a few of # the quirks that make gitconfig files incompatible with ConfigParser-style # files, but does not support the full gitconfig syntax (just enough # needed to read a .gitmodules file). 
        gitmodules_fileobj = io.StringIO()

        # Must use io.open for cross-Python-compatible behavior wrt unicode
        with io.open(gitmodules_path) as f:
            for line in f:
                # gitconfig files are more flexible with leading whitespace;
                # just go ahead and remove it
                line = line.lstrip()

                # comments can start with either # or ;
                if line and line[0] in ('#', ';'):
                    continue

                gitmodules_fileobj.write(line)

        gitmodules_fileobj.seek(0)

        cfg = RawConfigParser()

        try:
            cfg.readfp(gitmodules_fileobj)
        except Exception as exc:
            log.warn('Malformatted .gitmodules file: {0}\n'
                     '{1} cannot be assumed to be a git submodule.'.format(
                         exc, self.path))
            return False

        for section in cfg.sections():
            if not cfg.has_option(section, 'path'):
                continue

            submodule_path = cfg.get(section, 'path').rstrip(os.sep)

            if submodule_path == self.path.rstrip(os.sep):
                return True

        return False

    def _update_submodule(self, submodule, status):
        if status == ' ':
            # The submodule is up to date; no action necessary
            return
        elif status == '-':
            if self.offline:
                raise _AHBootstrapSystemExit(
                    "Cannot initialize the {0} submodule in --offline mode; "
                    "this requires being able to clone the submodule from an "
                    "online repository.".format(submodule))
            cmd = ['update', '--init']
            action = 'Initializing'
        elif status == '+':
            cmd = ['update']
            action = 'Updating'
            if self.offline:
                cmd.append('--no-fetch')
        elif status == 'U':
            raise _AHBootstrapSystemExit(
                'Error: Submodule {0} contains unresolved merge conflicts.  '
                'Please complete or abandon any changes in the submodule so '
                'that it is in a usable state, then try again.'.format(submodule))
        else:
            log.warn('Unknown status {0!r} for git submodule {1!r}.  Will '
                     'attempt to use the submodule as-is, but try to ensure '
                     'that the submodule is in a clean state and contains no '
                     'conflicts or errors.\n{2}'.format(status, submodule,
                                                        _err_help_msg))
            return

        err_msg = None
        cmd = ['git', 'submodule'] + cmd + ['--', submodule]
        log.warn('{0} {1} submodule with: `{2}`'.format(
            action, submodule, ' '.join(cmd)))

        try:
            log.info('Running `{0}`; use the --no-git option to disable git '
                     'commands'.format(' '.join(cmd)))
            returncode, stdout, stderr = run_cmd(cmd)
        except OSError as e:
            err_msg = str(e)
        else:
            if returncode != 0:
                err_msg = stderr

        if err_msg is not None:
            log.warn('An unexpected error occurred updating the git submodule '
                     '{0!r}:\n{1}\n{2}'.format(submodule, err_msg,
                                               _err_help_msg))


class _CommandNotFound(OSError):
    """
    An exception raised when a command run with run_cmd is not found on the
    system.
    """


def run_cmd(cmd):
    """
    Run a command in a subprocess, given as a list of command-line arguments.

    Returns a ``(returncode, stdout, stderr)`` tuple.
    """

    try:
        p = sp.Popen(cmd, stdout=sp.PIPE, stderr=sp.PIPE)
        # XXX: May block if either stdout or stderr fill their buffers;
        # however for the commands this is currently used for that is
        # unlikely (they should have very brief output)
        stdout, stderr = p.communicate()
    except OSError as e:
        if DEBUG:
            raise

        if e.errno == errno.ENOENT:
            msg = 'Command not found: `{0}`'.format(' '.join(cmd))
            raise _CommandNotFound(msg, cmd)
        else:
            raise _AHBootstrapSystemExit(
                'An unexpected error occurred when running the '
                '`{0}` command:\n{1}'.format(' '.join(cmd), str(e)))

    # Can fail if the default locale is not configured properly.  See
    # https://github.com/astropy/astropy/issues/2749.  For the purposes under
    # consideration 'latin1' is an acceptable fallback.
    try:
        stdio_encoding = locale.getdefaultlocale()[1] or 'latin1'
    except ValueError:
        # Due to an OSX oddity locale.getdefaultlocale() can also crash
        # depending on the user's locale/language settings. See:
        # http://bugs.python.org/issue18378
        stdio_encoding = 'latin1'

    # Unlikely to fail at this point but even then let's be flexible
    if not isinstance(stdout, str):
        stdout = stdout.decode(stdio_encoding, 'replace')

    if not isinstance(stderr, str):
        stderr = stderr.decode(stdio_encoding, 'replace')

    return (p.returncode, stdout, stderr)


def _next_version(version):
    """
    Given a parsed version from pkg_resources.parse_version, returns a new
    version string with the next minor version.

    Examples
    ========
    >>> _next_version(pkg_resources.parse_version('1.2.3'))
    '1.3.0'
    """

    if hasattr(version, 'base_version'):
        # New version parsing from setuptools >= 8.0
        if version.base_version:
            parts = version.base_version.split('.')
        else:
            parts = []
    else:
        parts = []
        for part in version:
            if part.startswith('*'):
                break

            parts.append(part)

    parts = [int(p) for p in parts]

    if len(parts) < 3:
        parts += [0] * (3 - len(parts))

    major, minor, micro = parts[:3]

    return '{0}.{1}.{2}'.format(major, minor + 1, 0)


class _DummyFile(object):
    """A noop writeable object."""

    errors = ''  # Required for Python 3.x
    encoding = 'utf-8'

    def write(self, s):
        pass

    def flush(self):
        pass


@contextlib.contextmanager
def _verbose():
    yield


@contextlib.contextmanager
def _silence():
    """A context manager that silences sys.stdout and sys.stderr."""

    old_stdout = sys.stdout
    old_stderr = sys.stderr
    sys.stdout = _DummyFile()
    sys.stderr = _DummyFile()
    exception_occurred = False

    try:
        yield
    except:
        exception_occurred = True
        # Go ahead and clean up so that exception handling can work normally
        sys.stdout = old_stdout
        sys.stderr = old_stderr
        raise

    if not exception_occurred:
        sys.stdout = old_stdout
        sys.stderr = old_stderr


_err_help_msg = """
If the problem persists consider installing astropy_helpers manually using pip
(`pip install astropy_helpers`) or by manually downloading the source archive,
extracting it, and installing by running `python setup.py install` from the
root of the extracted source code.
"""


class _AHBootstrapSystemExit(SystemExit):
    def __init__(self, *args):
        if not args:
            msg = 'An unknown problem occurred bootstrapping astropy_helpers.'
        else:
            msg = args[0]

        msg += '\n' + _err_help_msg

        super(_AHBootstrapSystemExit, self).__init__(msg, *args[1:])


BOOTSTRAPPER = _Bootstrapper.main()


def use_astropy_helpers(**kwargs):
    """
    Ensure that the `astropy_helpers` module is available and is importable.
    This supports automatic submodule initialization if astropy_helpers is
    included in a project as a git submodule, or will download it from PyPI if
    necessary.

    Parameters
    ----------

    path : str or None, optional
        A filesystem path relative to the root of the project's source code
        that should be added to `sys.path` so that `astropy_helpers` can be
        imported from that path.

        If the path is a git submodule it will automatically be initialized
        and/or updated.

        The path may also be to a ``.tar.gz`` archive of the astropy_helpers
        source distribution.  In this case the archive is automatically
        unpacked and made temporarily available on `sys.path` as a ``.egg``
        archive.

        If `None` skip straight to downloading.

    download_if_needed : bool, optional
        If the provided filesystem path is not found an attempt will be made
        to download astropy_helpers from PyPI.  It will then be made
        temporarily available on `sys.path` as a ``.egg`` archive (using the
        ``setup_requires`` feature of setuptools).
        If the ``--offline`` option is given at the command line the value of
        this argument is overridden to `False`.

    index_url : str, optional
        If provided, use a different URL for the Python package index than the
        main PyPI server.

    use_git : bool, optional
        If `False` no git commands will be used--this effectively disables
        support for git submodules. If the ``--no-git`` option is given at the
        command line the value of this argument is overridden to `False`.

    auto_upgrade : bool, optional
        By default, when installing a package from a non-development source
        distribution ah_bootstrap will try to automatically check for patch
        releases to astropy-helpers on PyPI and use the patched version over
        any bundled versions.  Setting this to `False` will disable that
        functionality. If the ``--offline`` option is given at the command
        line the value of this argument is overridden to `False`.

    offline : bool, optional
        If `True` disable all actions that require an internet connection,
        including downloading packages from the package index and fetching
        updates to any git submodule.  Defaults to `False`.
    """

    global BOOTSTRAPPER

    config = BOOTSTRAPPER.config
    config.update(**kwargs)

    # Create a new bootstrapper with the updated configuration and run it
    BOOTSTRAPPER = _Bootstrapper(**config)
    BOOTSTRAPPER.run()

ndcube-1.4.2/astropy_helpers/CHANGES.rst

astropy-helpers Changelog
*************************

3.2.1 (2019-06-13)
------------------

- Reverting issuing deprecation warning for the ``build_sphinx`` command. [#482]
- Make sure that all data files get included in tar file releases. [#485]

3.2 (2019-05-29)
----------------

- Make sure that ``[options.package_data]`` in setup.cfg is taken into account
  when collecting package data. [#453]
- Simplified the code for the custom build_ext command. [#446]
- Avoid importing the astropy package when trying to get the test command when
  testing astropy itself. [#450]
- Avoid importing the whole package when trying to get version information.
  Note that this has also introduced a small API change - ``cython_version``
  and ``compiler`` can no longer be imported from the ``package.version``
  module generated by astropy-helpers. Instead, you can import these from
  ``package.cython_version`` and ``package.compiler_version``
  respectively. [#442]
- Make it possible to call ``generate_version_py`` and ``register_commands``
  without any arguments, which causes information to be read in from the
  ``setup.cfg`` file. [#440]
- Simplified setup.py and moved most of the configuration to setup.cfg. [#445]
- Add a new ``astropy_helpers.setup_helpers.setup`` function that does all the
  default boilerplate in typical ``setup.py`` files that use
  astropy-helpers. [#443]
- Remove ``deprecated``, ``deprecated_attribute``, and ``minversion`` from
  ``astropy_helpers.utils``. [#447]
- Updated minimum required version of setuptools to 30.3.0. [#440]
- Remove functionality to adjust compilers if a broken compiler is detected.
  This is not useful anymore as only a single compiler was previously patched
  (now unlikely to be used) and this was only to fix a compilation issue in
  the core astropy package. [#421]
[#421] - ``sphinx-astropy`` is now a required dependency to build the docs; the machinery to install it as eggs has been removed. [#474] 3.1.1 (2019-02-22) ------------------ - Moved documentation from README to Sphinx. [#444] - Fixed broken OpenMP detection when building with ``-coverage``. [#434] 3.1 (2018-12-04) ---------------- - Added extensive documentation about astropy-helpers to the README.rst file. [#416] - Fixed the compatibility of the build_docs command with Sphinx 1.8 and above. [#413] - Removing deprecated test_helpers.py file. [#369] - Removing ez_setup.py file and requiring setuptools 1.0 or later. [#384] - Remove all sphinx components from ``astropy-helpers``. These are now replaced by the ``sphinx-astropy`` package in conjunction with the ``astropy-theme-sphinx``, ``sphinx-automodapi``, and ``numpydoc`` packages. [#368] - openmp_helpers.py: Make add_openmp_flags_if_available() work for clang. The necessary include, library, and runtime paths now get added to the C test code used to determine if openmp works. An autogenerated utility, ``openmp_enabled.is_openmp_enabled()``, was added, which can be called post-build to determine the state of OpenMP support. [#382] - Add version_info tuple to autogenerated version.py. Allows for simple version checking, e.g. ``version_info > (2, 0, 1)``. [#385] 3.0.2 (2018-06-01) ------------------ - Nothing changed. 3.0.1 (2018-02-22) ------------------ - Nothing changed. 3.0 (2018-02-09) ---------------- - Removing Python 2 support, including 2to3. Packages wishing to keep Python 2 support should NOT update to this version. [#340] - Removing deprecated _test_compat making astropy a hard dependency for packages wishing to use the astropy tests machinery. [#314] - Removing unused 'register' command since packages should be uploaded with twine and get registered automatically. [#332] 2.0.10 (2019-05-29) ------------------- - Removed ``tocdepthfix`` sphinx extension that worked around a bug in Sphinx that has long been fixed. [#475] - Allow Python dev versions to pass the python version check. [#476] - Updated bundled version of sphinx-automodapi to v0.11. [#478] 2.0.9 (2019-02-22) ------------------ - Updated bundled version of sphinx-automodapi to v0.10. [#439] - Updated bundled sphinx extensions version to sphinx-astropy v1.1.1. [#454] - Include package name in error message for Python version in ``ah_bootstrap.py``. [#441] 2.0.8 (2018-12-04) ------------------ - Fixed compatibility with Sphinx 1.8+. [#428] - Fixed error that occurs when installing a package in an environment where ``numpy`` is not already installed. [#404] - Updated bundled version of sphinx-automodapi to v0.9. [#422] - Updated bundled version of numpydoc to v0.8.0. [#423] 2.0.7 (2018-06-01) ------------------ - Removing ez_setup.py file and requiring setuptools 1.0 or later. [#384] 2.0.6 (2018-02-24) ------------------ - Avoid deprecation warning due to ``exclude=`` keyword in ``setup.py``. [#379] 2.0.5 (2018-02-22) ------------------ - Fix segmentation faults that occurred when the astropy-helpers submodule was first initialized in packages that also contained Cython code. [#375] 2.0.4 (2018-02-09) ------------------ - Support dotted package names as namespace packages in generate_version_py. [#370] - Fix compatibility with setuptools 36.x and above. [#372] - Fix false negative in add_openmp_flags_if_available when measuring code coverage with gcc. [#374] 2.0.3 (2018-01-20) ------------------ - Make sure that astropy-helpers 3.x.x is not downloaded on Python 2.
[#362, #363] - The bundled version of sphinx-automodapi has been updated to v0.7. [#365] - Add --auto-use and --no-auto-use command-line flags to match the ``auto_use`` configuration option, and add an alias ``--use-system-astropy-helpers`` for ``--no-auto-use``. [#366] 2.0.2 (2017-10-13) ------------------ - Added new helper function add_openmp_flags_if_available that can add OpenMP compilation flags to a C/Cython extension if needed. [#346] - Update numpydoc to v0.7. [#343] - The function ``get_git_devstr`` now returns ``'0'`` instead of ``None`` when no git repository is present. This allows generation of development version strings that are in a format that ``setuptools`` expects (e.g. "1.1.3.dev0" instead of "1.1.3.dev"). [#330] - It is now possible to override generated timestamps to make builds reproducible by setting the ``SOURCE_DATE_EPOCH`` environment variable. [#341] - Mark Sphinx extensions as parallel-safe. [#344] - Switch to using mathjax instead of imgmath for local builds. [#342] - Deprecate ``exclude`` parameter of various functions in setup_helpers since it could not work as intended. Add new function ``add_exclude_packages`` to provide intended behavior. [#331] - Allow custom Sphinx doctest extension to recognize and process standard doctest directives ``testsetup`` and ``doctest``. [#335] 2.0.1 (2017-07-28) ------------------ - Fix compatibility with Sphinx <1.5. [#326] 2.0 (2017-07-06) ---------------- - Add support for a package that lies in a subdirectory. [#249] - Removing ``compat.subprocess``. [#298] - Python 3.3 is no longer supported. [#300] - The 'automodapi' Sphinx extension (and associated dependencies) has now been moved to a standalone package which can be found at https://github.com/astropy/sphinx-automodapi - this is now bundled in astropy-helpers under astropy_helpers.extern.automodapi for convenience. Version shipped with astropy-helpers is v0.6. [#278, #303, #309, #323] - The ``numpydoc`` Sphinx extension has now been moved to ``astropy_helpers.extern``. [#278] - Fix ``build_docs`` error catching, so it doesn't hide Sphinx errors. [#292] - Fix compatibility with Sphinx 1.6. [#318] - Updating ez_setup.py to the last version before its removal. [#321] 1.3.1 (2017-03-18) ------------------ - Fixed the missing button to hide output in documentation code blocks. [#287] - Fixed bug in ``build_docs`` when running with the clean (-l) option. [#289] - Add alternative location for various intersphinx inventories to fall back to. [#293] 1.3 (2016-12-16) ---------------- - ``build_sphinx`` has been deprecated in favor of the ``build_docs`` command. [#246] - Force the use of Cython's old ``build_ext`` command. A new ``build_ext`` command was added in Cython 0.25, but it does not work with astropy-helpers currently. [#261] 1.2 (2016-06-18) ---------------- - Added sphinx configuration value ``automodsumm_inherited_members``. If ``True`` this will include members that are inherited from a base class in the generated API docs. Defaults to ``False``, which matches the previous behavior. [#215] - Fixed ``build_sphinx`` to recognize builds that succeeded but have output *after* the "build succeeded." statement. This only applies when ``--warnings-returncode`` is given (which is primarily relevant for Travis documentation builds). [#223] - Fixed the Sphinx extensions used by ``build_sphinx`` so they do not output a spurious warning for Sphinx versions > 1.4. [#229] - Add Python version dependent local sphinx inventories that contain otherwise missing references.
[#216] - ``astropy_helpers`` now requires Sphinx 1.3 or later. [#226] 1.1.2 (2016-03-09) ------------------ - The CSS for the sphinx documentation was altered to prevent some text overflow problems. [#217] 1.1.1 (2015-12-23) ------------------ - Fixed crash in build with ``AttributeError: cython_create_listing`` with older versions of setuptools. [#209, #210] 1.1 (2015-12-10) ---------------- - The original ``AstropyTest`` class in ``astropy_helpers``, which implements the ``setup.py test`` command, is deprecated in favor of moving the implementation of that command closer to the actual Astropy test runner in ``astropy.tests``. Now a dummy ``test`` command is provided solely for informing users that they need ``astropy`` installed to run the tests (however, the previous, now deprecated implementation is still provided and continues to work with older versions of Astropy). See the related issue for more details. [#184] - Added a useful new utility function to ``astropy_helpers.utils`` called ``find_data_files``. This is similar to the ``find_packages`` function in setuptools in that it can be used to search a package for data files (matching a pattern) that can be passed to the ``package_data`` argument for ``setup()``. See the docstring to ``astropy_helpers.utils.find_data_files`` for more details. [#42] - The ``astropy_helpers`` module now sets the global ``_ASTROPY_SETUP_`` flag upon import (from within a ``setup.py`` script), so it's not necessary to have this in the ``setup.py`` script explicitly. If in doubt though, there's no harm in setting it twice. Putting it in ``astropy_helpers`` just ensures that any other imports that occur during build will have this flag set. [#191] - It is now possible to use Cython as a ``setup_requires`` build requirement, and still build Cython extensions even if Cython wasn't available at the beginning of the build process (that is, it is automatically downloaded via setuptools' processing of ``setup_requires``). [#185] - Moves the ``adjust_compiler`` check into the ``build_ext`` command itself, so it's only used when actually building extension modules. This also deprecates the stand-alone ``adjust_compiler`` function. [#76] - When running the ``build_sphinx`` / ``build_docs`` command with the ``-w`` option, the output from Sphinx is streamed as it runs instead of silently buffering until the doc build is complete. [#197] 1.0.7 (unreleased) ------------------ - Fix missing import in ``astropy_helpers/utils.py``. [#196] 1.0.6 (2015-12-04) ------------------ - Fixed bug where running ``./setup.py build_sphinx`` could return successfully even when the build was not successful (and should have returned a non-zero error code). [#199] 1.0.5 (2015-10-02) ------------------ - Fixed a regression in the ``./setup.py test`` command that was introduced in v1.0.4. 1.0.4 (2015-10-02) ------------------ - Fixed issue with the sphinx documentation css where the line numbers for code blocks were not aligned with the code. [#179, #180] - Fixed crash that could occur when trying to build Cython extension modules when Cython isn't installed. Normally this still results in a failed build, but was supposed to provide a useful error message rather than crash outright (this was a regression introduced in v1.0.3). [#181] - Fixed a crash that could occur on Python 3 when a working C compiler isn't found. [#182] - Quieted warnings about deprecated Numpy API in Cython extensions, when building Cython extensions against Numpy >= 1.7.
[#183, #186] - Improved support for py.test >= 2.7--running the ``./setup.py test`` command now copies all doc pages into the temporary test directory as well, so that all test files have a "common root directory". [#189, #190] 1.0.3 (2015-07-22) ------------------ - Added workaround for sphinx-doc/sphinx#1843, a bug in Sphinx which prevented descriptor classes with a custom metaclass from being documented correctly. [#158] - Added an alias for the ``./setup.py build_sphinx`` command as ``./setup.py build_docs`` which, to a new contributor, should hopefully be less cryptic. [#161] - The fonts in graphviz diagrams now match the font of the HTML content. [#169] - When the documentation is built on readthedocs.org, MathJax will be used for math rendering. When built elsewhere, the "pngmath" extension is still used for math rendering. [#170] - Fix crash when importing astropy_helpers when running with ``python -OO``. [#171] - The ``build`` and ``build_ext`` stages now correctly recognize the presence of C++ files in Cython extensions (previously only vanilla C worked). [#173] 1.0.2 (2015-04-02) ------------------ - Various fixes enabling the astropy-helpers Sphinx build command and Sphinx extensions to work with Sphinx 1.3. [#148] - More improvements to the ability to handle multiple versions of astropy-helpers being imported in the same Python interpreter session in the (somewhat rare) case of nested installs. [#147] - To better support high resolution displays, use SVG for the astropy logo and linkout image, falling back to PNGs for browsers that do not support it. [#150, #151] - Improve ``setup_helpers.get_compiler_version`` to work with more compilers, and to return more info. This will help fix builds of Astropy on less common compilers, like Sun C. [#153] 1.0.1 (2015-03-04) ------------------ - Released in concert with v0.4.8 to address the same issues. 0.4.8 (2015-03-04) ------------------ - Improved the ``ah_bootstrap`` script's ability to override existing installations of astropy-helpers with new versions in the context of installing multiple packages simultaneously within the same Python interpreter (e.g. when one package has in its ``setup_requires`` another package that uses a different version of astropy-helpers). [#144] - Added a workaround to an issue in matplotlib that can, in rare cases, lead to a crash when installing packages that import matplotlib at build time. [#144] 1.0 (2015-02-17) ---------------- - Added new pre-/post-command hook points for ``setup.py`` commands. Now any package can define code to run before and/or after any ``setup.py`` command without having to manually subclass that command by adding ``pre_<command_name>_hook`` and ``post_<command_name>_hook`` callables to the package's ``setup_package.py`` module. See the PR for more details.
[#112] - The following objects in the ``astropy_helpers.setup_helpers`` module have been relocated: - ``get_dummy_distribution``, ``get_distutils_*``, ``get_compiler_option``, ``add_command_option``, ``is_distutils_display_option`` -> ``astropy_helpers.distutils_helpers`` - ``should_build_with_cython``, ``generate_build_ext_command`` -> ``astropy_helpers.commands.build_ext`` - ``AstropyBuildPy`` -> ``astropy_helpers.commands.build_py`` - ``AstropyBuildSphinx`` -> ``astropy_helpers.commands.build_sphinx`` - ``AstropyInstall`` -> ``astropy_helpers.commands.install`` - ``AstropyInstallLib`` -> ``astropy_helpers.commands.install_lib`` - ``AstropyRegister`` -> ``astropy_helpers.commands.register`` - ``get_pkg_version_module`` -> ``astropy_helpers.version_helpers`` - ``write_if_different``, ``import_file``, ``get_numpy_include_path`` -> ``astropy_helpers.utils`` All of these are "soft" deprecations in the sense that they are still importable from ``astropy_helpers.setup_helpers`` for now, and there is no (easy) way to produce deprecation warnings when importing these objects from ``setup_helpers`` rather than directly from the modules they are defined in. But please consider updating any imports to these objects. [#110] - Use of the ``astropy.sphinx.ext.astropyautosummary`` extension is deprecated for use with Sphinx < 1.2. Instead it should suffice to remove this extension from the ``extensions`` list in your ``conf.py`` and add the stock ``sphinx.ext.autosummary`` instead. [#131] 0.4.7 (2015-02-17) ------------------ - Fixed incorrect/missing git hash being added to the generated ``version.py`` when creating a release. [#141] 0.4.6 (2015-02-16) ------------------ - Fixed problems related to the automatically generated _compiler module not being created properly. [#139] 0.4.5 (2015-02-11) ------------------ - Fixed an issue where ah_bootstrap.py could blow up when astropy_helpers' version number is 1.0. - Added a workaround for documentation of properties in the rare case where the class's metaclass has a property of the same name. [#130] - Fixed an issue on Python 3 where importing a package using astropy-helpers' generated version.py module would crash when the current working directory is an empty git repository. [#114, #137] - Fixed an issue where the "revision count" appended to .dev versions by the generated version.py did not accurately reflect the revision count for the package it belongs to, and could be invalid if the current working directory is an unrelated git repository. [#107, #137] - Likewise, fixed a confusing warning message that could occur in the same circumstances as the above issue. [#121, #137] 0.4.4 (2014-12-31) ------------------ - More improvements for building the documentation using Python 3.x. [#100] - Additional minor fixes to Python 3 support. [#115] - Updates to support new test features in Astropy. [#92, #106] 0.4.3 (2014-10-22) ------------------ - The generated ``version.py`` file now preserves the git hash of installed copies of the package as well as when building a source distribution. That is, the git hash of the changeset that was installed/released is preserved. [#87] - In smart resolver add resolution for class links when they exist in the intersphinx inventory, but not the mapping of the current package (e.g.
when an affiliated package uses an astropy core class whose "actual" and "documented" locations differ) [#88] - Fixed a bug that could occur when running ``setup.py`` for the first time in a repository that uses astropy-helpers as a submodule: ``AttributeError: 'NoneType' object has no attribute 'mkdtemp'`` [#89] - Fixed a bug where optional arguments to the ``doctest-skip`` Sphinx directive were sometimes being left in the generated documentation output. [#90] - Improved support for building the documentation using Python 3.x. [#96] - Avoid error message if .git directory is not present. [#91] 0.4.2 (2014-08-09) ------------------ - Fixed some CSS issues in generated API docs. [#69] - Fixed the warning message that could be displayed when generating a version number with some older versions of git. [#77] - Fixed automodsumm to work with new versions of Sphinx (>= 1.2.2). [#80] 0.4.1 (2014-08-08) ------------------ - Fixed git revision count on systems with git versions older than v1.7.2. [#70] - Fixed display of warning text when running a git command fails (previously the output of stderr was not being decoded properly). [#70] - The ``--offline`` flag to ``setup.py`` understood by ``ah_bootstrap.py`` now also prevents git from going online to fetch submodule updates. [#67] - The Sphinx extension for converting issue numbers to links in the changelog now supports working on arbitrary pages via a new ``conf.py`` setting: ``changelog_links_docpattern``. By default it affects the ``changelog`` and ``whatsnew`` pages in one's Sphinx docs. [#61] - Fixed crash that could result from users with missing/misconfigured locale settings. [#58] - The font used for code examples in the docs is now the system-defined ``monospace`` font, rather than ``Monaco``, which is not available on all platforms. [#50] 0.4 (2014-07-15) ---------------- - Initial release of astropy-helpers. See `APE4 `_ for details of the motivation and design of this package. - The ``astropy_helpers`` package replaces the following modules in the ``astropy`` package: - ``astropy.setup_helpers`` -> ``astropy_helpers.setup_helpers`` - ``astropy.version_helpers`` -> ``astropy_helpers.version_helpers`` - ``astropy.sphinx`` -> ``astropy_helpers.sphinx`` These modules should be considered deprecated in ``astropy``, and any new, non-critical changes to those modules will be made in ``astropy_helpers`` instead. Affiliated packages wishing to make use of those modules (as in the Astropy package-template) should use the versions from ``astropy_helpers`` instead, and include the ``ah_bootstrap.py`` script in their project, for bootstrapping the ``astropy_helpers`` package in their setup.py script. ././@PaxHeader0000000000000000000000000000003400000000000011452 xustar000000000000000028 mtime=1605787216.8991826 ndcube-1.4.2/astropy_helpers/LICENSE.rst0000644000175100001640000000272300000000000020311 0ustar00vstsdocker00000000000000Copyright (c) 2014, Astropy Developers All rights reserved. Redistribution and use in source and binary forms, with or without modification, are permitted provided that the following conditions are met: * Redistributions of source code must retain the above copyright notice, this list of conditions and the following disclaimer. * Redistributions in binary form must reproduce the above copyright notice, this list of conditions and the following disclaimer in the documentation and/or other materials provided with the distribution.
* Neither the name of the Astropy Team nor the names of its contributors may be used to endorse or promote products derived from this software without specific prior written permission. THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT HOLDER OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE. ././@PaxHeader0000000000000000000000000000003400000000000011452 xustar000000000000000028 mtime=1605787216.8991826 ndcube-1.4.2/astropy_helpers/README.rst0000644000175100001640000000266300000000000020167 0ustar00vstsdocker00000000000000astropy-helpers =============== .. image:: https://travis-ci.org/astropy/astropy-helpers.svg :target: https://travis-ci.org/astropy/astropy-helpers .. image:: https://ci.appveyor.com/api/projects/status/rt9161t9mhx02xp7/branch/master?svg=true :target: https://ci.appveyor.com/project/Astropy/astropy-helpers .. image:: https://codecov.io/gh/astropy/astropy-helpers/branch/master/graph/badge.svg :target: https://codecov.io/gh/astropy/astropy-helpers The **astropy-helpers** package includes many build, installation, and documentation-related tools used by the Astropy project, but packaged separately for use by other projects that wish to leverage this work. The motivation behind this package and details of its implementation are in the accepted `Astropy Proposal for Enhancement (APE) 4 `_. Astropy-helpers is not a traditional package in the sense that it is not intended to be installed directly by users or developers. Instead, it is meant to be accessed when the ``setup.py`` command is run - see the "Using astropy-helpers in a package" section in the documentation for how to do this. For a real-life example of how to implement astropy-helpers in a project, see the ``setup.py`` and ``setup.cfg`` files of the `Affiliated package template `_. For more information, see the documentation at http://astropy-helpers.readthedocs.io ././@PaxHeader0000000000000000000000000000003400000000000011452 xustar000000000000000028 mtime=1605787216.8991826 ndcube-1.4.2/astropy_helpers/ah_bootstrap.py0000644000175100001640000011063400000000000021535 0ustar00vstsdocker00000000000000""" This bootstrap module contains code for ensuring that the astropy_helpers package will be importable by the time the setup.py script runs. It also includes some workarounds to ensure that a recent-enough version of setuptools is being used for the installation. This module should be the first thing imported in the setup.py of distributions that make use of the utilities in astropy_helpers. If the distribution ships with its own copy of astropy_helpers, this module will first attempt to import from the shipped copy. However, it will also check PyPI to see if there are any bug-fix releases on top of the current version that may be useful to get past platform-specific bugs that have been fixed. 
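For example (an illustrative, typical pattern rather than anything this module enforces), a project's ``setup.py`` would begin with: import ah_bootstrap # noqa from setuptools import setup so that the bootstrapping described above runs before any other setup machinery is invoked.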
When running setup.py, use the ``--offline`` command-line option to disable the auto-upgrade checks. When this module is imported or otherwise executed it automatically calls a main function that attempts to read the project's setup.cfg file, which it checks for a configuration section called ``[ah_bootstrap]``; the presence of that section, and options therein, determines the next step taken: If it contains an option called ``auto_use`` with a value of ``True``, it will automatically call the main function of this module called `use_astropy_helpers` (see that function's docstring for full details). Otherwise no further action is taken and by default the system-installed version of astropy-helpers will be used (however, ``ah_bootstrap.use_astropy_helpers`` may be called manually from within the setup.py script). This behavior can also be controlled using the ``--auto-use`` and ``--no-auto-use`` command-line flags. For clarity, an alias for ``--no-auto-use`` is ``--use-system-astropy-helpers``, and we recommend using the latter if needed. Additional options in the ``[ah_bootstrap]`` section of setup.cfg have the same names as the arguments to `use_astropy_helpers`, and can be used to configure the bootstrap script when ``auto_use = True``. See https://github.com/astropy/astropy-helpers for more details, and for the latest version of this module. """ import contextlib import errno import io import locale import os import re import subprocess as sp import sys from distutils import log from distutils.debug import DEBUG from configparser import ConfigParser, RawConfigParser import pkg_resources from setuptools import Distribution from setuptools.package_index import PackageIndex # This is the minimum Python version required for astropy-helpers __minimum_python_version__ = (3, 5) # TODO: Maybe enable checking for a specific version of astropy_helpers? DIST_NAME = 'astropy-helpers' PACKAGE_NAME = 'astropy_helpers' UPPER_VERSION_EXCLUSIVE = None # Defaults for other options DOWNLOAD_IF_NEEDED = True INDEX_URL = 'https://pypi.python.org/simple' USE_GIT = True OFFLINE = False AUTO_UPGRADE = True # A list of all the configuration options and their required types CFG_OPTIONS = [ ('auto_use', bool), ('path', str), ('download_if_needed', bool), ('index_url', str), ('use_git', bool), ('offline', bool), ('auto_upgrade', bool) ] # Start off by parsing the setup.cfg file SETUP_CFG = ConfigParser() if os.path.exists('setup.cfg'): try: SETUP_CFG.read('setup.cfg') except Exception as e: if DEBUG: raise log.error( "Error reading setup.cfg: {0!r}\n{1} will not be " "automatically bootstrapped and package installation may fail." "\n{2}".format(e, PACKAGE_NAME, _err_help_msg)) # We used package_name in the package template for a while instead of name if SETUP_CFG.has_option('metadata', 'name'): parent_package = SETUP_CFG.get('metadata', 'name') elif SETUP_CFG.has_option('metadata', 'package_name'): parent_package = SETUP_CFG.get('metadata', 'package_name') else: parent_package = None if SETUP_CFG.has_option('options', 'python_requires'): python_requires = SETUP_CFG.get('options', 'python_requires') # The python_requires key has a syntax that can be parsed by SpecifierSet # in the packaging package. However, we don't want to have to depend on that # package, so instead we can use setuptools (which bundles packaging). We # have to add 'python' to parse it with Requirement.
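# Illustrative example (hypothetical values, not read from any particular # setup.cfg): given python_requires = '>=3.5', the check below amounts to # Requirement.parse('python>=3.5').specifier.contains('3.7.1', True) # evaluating to True for a Python 3.7.1 interpreter.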
from pkg_resources import Requirement req = Requirement.parse('python' + python_requires) # We want the Python version as a string, which we can get from the platform module import platform # strip off trailing '+' in case this is a dev install of python python_version = platform.python_version().strip('+') # allow pre-releases to count as 'new enough' if not req.specifier.contains(python_version, True): if parent_package is None: message = "ERROR: Python {} is required by this package\n".format(req.specifier) else: message = "ERROR: Python {} is required by {}\n".format(req.specifier, parent_package) sys.stderr.write(message) sys.exit(1) if sys.version_info < __minimum_python_version__: if parent_package is None: message = "ERROR: Python {} or later is required by astropy-helpers\n".format( __minimum_python_version__) else: message = "ERROR: Python {} or later is required by astropy-helpers for {}\n".format( __minimum_python_version__, parent_package) sys.stderr.write(message) sys.exit(1) _str_types = (str, bytes) # What follows are several import statements meant to deal with install-time # issues with either missing or misbehaving packages (including making sure # setuptools itself is installed): # Check that setuptools 30.3 or later is present from distutils.version import LooseVersion try: import setuptools assert LooseVersion(setuptools.__version__) >= LooseVersion('30.3') except (ImportError, AssertionError): sys.stderr.write("ERROR: setuptools 30.3 or later is required by astropy-helpers\n") sys.exit(1) # typing as a dependency for 1.6.1+ Sphinx causes issues when imported after # initializing submodule with ah_bootstrap.py # See discussion and references in # https://github.com/astropy/astropy-helpers/issues/302 try: import typing # noqa except ImportError: pass # Note: The following import is required as a workaround to # https://github.com/astropy/astropy-helpers/issues/89; if we don't import this # module now, it will get cleaned up after `run_setup` is called, but that will # later cause the TemporaryDirectory class defined in it to stop working when # used later on by setuptools try: import setuptools.py31compat # noqa except ImportError: pass # matplotlib can cause problems if it is imported from within a call of # run_setup(), because in some circumstances it will try to write to the user's # home directory, resulting in a SandboxViolation. See # https://github.com/matplotlib/matplotlib/pull/4165 # Making sure matplotlib, if it is available, is imported early in the setup # process can mitigate this (note importing matplotlib.pyplot has the same # issue) try: import matplotlib matplotlib.use('Agg') import matplotlib.pyplot except: # Ignore if this fails for *any* reason pass # End compatibility imports... class _Bootstrapper(object): """ Bootstrapper implementation. See ``use_astropy_helpers`` for parameter documentation.
""" def __init__(self, path=None, index_url=None, use_git=None, offline=None, download_if_needed=None, auto_upgrade=None): if path is None: path = PACKAGE_NAME if not (isinstance(path, _str_types) or path is False): raise TypeError('path must be a string or False') if not isinstance(path, str): fs_encoding = sys.getfilesystemencoding() path = path.decode(fs_encoding) # path to unicode self.path = path # Set other option attributes, using defaults where necessary self.index_url = index_url if index_url is not None else INDEX_URL self.offline = offline if offline is not None else OFFLINE # If offline=True, override download and auto-upgrade if self.offline: download_if_needed = False auto_upgrade = False self.download = (download_if_needed if download_if_needed is not None else DOWNLOAD_IF_NEEDED) self.auto_upgrade = (auto_upgrade if auto_upgrade is not None else AUTO_UPGRADE) # If this is a release then the .git directory will not exist so we # should not use git. git_dir_exists = os.path.exists(os.path.join(os.path.dirname(__file__), '.git')) if use_git is None and not git_dir_exists: use_git = False self.use_git = use_git if use_git is not None else USE_GIT # Declared as False by default--later we check if astropy-helpers can be # upgraded from PyPI, but only if not using a source distribution (as in # the case of import from a git submodule) self.is_submodule = False @classmethod def main(cls, argv=None): if argv is None: argv = sys.argv config = cls.parse_config() config.update(cls.parse_command_line(argv)) auto_use = config.pop('auto_use', False) bootstrapper = cls(**config) if auto_use: # Run the bootstrapper, otherwise the setup.py is using the old # use_astropy_helpers() interface, in which case it will run the # bootstrapper manually after reconfiguring it. bootstrapper.run() return bootstrapper @classmethod def parse_config(cls): if not SETUP_CFG.has_section('ah_bootstrap'): return {} config = {} for option, type_ in CFG_OPTIONS: if not SETUP_CFG.has_option('ah_bootstrap', option): continue if type_ is bool: value = SETUP_CFG.getboolean('ah_bootstrap', option) else: value = SETUP_CFG.get('ah_bootstrap', option) config[option] = value return config @classmethod def parse_command_line(cls, argv=None): if argv is None: argv = sys.argv config = {} # For now we just pop recognized ah_bootstrap options out of the # arg list. This is imperfect; in the unlikely case that a setup.py # custom command or even custom Distribution class defines an argument # of the same name then we will break that. However there's a catch22 # here that we can't just do full argument parsing right here, because # we don't yet know *how* to parse all possible command-line arguments. 
if '--no-git' in argv: config['use_git'] = False argv.remove('--no-git') if '--offline' in argv: config['offline'] = True argv.remove('--offline') if '--auto-use' in argv: config['auto_use'] = True argv.remove('--auto-use') if '--no-auto-use' in argv: config['auto_use'] = False argv.remove('--no-auto-use') if '--use-system-astropy-helpers' in argv: config['auto_use'] = False argv.remove('--use-system-astropy-helpers') return config def run(self): strategies = ['local_directory', 'local_file', 'index'] dist = None # First, remove any previously imported versions of astropy_helpers; # this is necessary for nested installs where one package's installer # is installing another package via setuptools.sandbox.run_setup, as in # the case of setup_requires for key in list(sys.modules): try: if key == PACKAGE_NAME or key.startswith(PACKAGE_NAME + '.'): del sys.modules[key] except AttributeError: # Sometimes mysterious non-string things can turn up in # sys.modules continue # Check to see if the path is a submodule self.is_submodule = self._check_submodule() for strategy in strategies: method = getattr(self, 'get_{0}_dist'.format(strategy)) dist = method() if dist is not None: break else: raise _AHBootstrapSystemExit( "No source found for the {0!r} package; {0} must be " "available and importable as a prerequisite to building " "or installing this package.".format(PACKAGE_NAME)) # This is a bit hacky, but if astropy_helpers was loaded from a # directory/submodule its Distribution object gets a "precedence" of # "DEVELOP_DIST". However, in other cases it gets a precedence of # "EGG_DIST". However, when activating the distribution it will only be # placed early on sys.path if it is treated as an EGG_DIST, so always # do that dist = dist.clone(precedence=pkg_resources.EGG_DIST) # Otherwise we found a version of astropy-helpers, so we're done # Just activate the found distribution on sys.path--if we did a # download this usually happens automatically but it doesn't hurt to # do it again # Note: Adding the dist to the global working set also activates it # (makes it importable on sys.path) by default. try: pkg_resources.working_set.add(dist, replace=True) except TypeError: # Some (much) older versions of setuptools do not have the # replace=True option here. These versions are old enough that all # bets may be off anyways, but it's easy enough to work around just # in case... if dist.key in pkg_resources.working_set.by_key: del pkg_resources.working_set.by_key[dist.key] pkg_resources.working_set.add(dist)
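# Illustrative sketch (comments only, not executed): for a project that # vendors astropy-helpers as a git submodule at 'astropy_helpers', the # strategy loop in run() above effectively tries, in order: # dist = self.get_local_directory_dist() # submodule or vendored directory # dist = self.get_local_file_dist() # .tar.gz source archive # dist = self.get_index_dist() # download from PyPI # stopping at the first strategy that returns a non-None distribution.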
""" if not os.path.isdir(self.path): return log.info('Attempting to import astropy_helpers from {0} {1!r}'.format( 'submodule' if self.is_submodule else 'directory', self.path)) dist = self._directory_import() if dist is None: log.warn( 'The requested path {0!r} for importing {1} does not ' 'exist, or does not contain a copy of the {1} ' 'package.'.format(self.path, PACKAGE_NAME)) elif self.auto_upgrade and not self.is_submodule: # A version of astropy-helpers was found on the available path, but # check to see if a bugfix release is available on PyPI upgrade = self._do_upgrade(dist) if upgrade is not None: dist = upgrade return dist def get_local_file_dist(self): """ Handle importing from a source archive; this also uses setup_requires but points easy_install directly to the source archive. """ if not os.path.isfile(self.path): return log.info('Attempting to unpack and import astropy_helpers from ' '{0!r}'.format(self.path)) try: dist = self._do_download(find_links=[self.path]) except Exception as e: if DEBUG: raise log.warn( 'Failed to import {0} from the specified archive {1!r}: ' '{2}'.format(PACKAGE_NAME, self.path, str(e))) dist = None if dist is not None and self.auto_upgrade: # A version of astropy-helpers was found on the available path, but # check to see if a bugfix release is available on PyPI upgrade = self._do_upgrade(dist) if upgrade is not None: dist = upgrade return dist def get_index_dist(self): if not self.download: log.warn('Downloading {0!r} disabled.'.format(DIST_NAME)) return None log.warn( "Downloading {0!r}; run setup.py with the --offline option to " "force offline installation.".format(DIST_NAME)) try: dist = self._do_download() except Exception as e: if DEBUG: raise log.warn( 'Failed to download and/or install {0!r} from {1!r}:\n' '{2}'.format(DIST_NAME, self.index_url, str(e))) dist = None # No need to run auto-upgrade here since we've already presumably # gotten the most up-to-date version from the package index return dist def _directory_import(self): """ Import astropy_helpers from the given path, which will be added to sys.path. Must return True if the import succeeded, and False otherwise. """ # Return True on success, False on failure but download is allowed, and # otherwise raise SystemExit path = os.path.abspath(self.path) # Use an empty WorkingSet rather than the man # pkg_resources.working_set, since on older versions of setuptools this # will invoke a VersionConflict when trying to install an upgrade ws = pkg_resources.WorkingSet([]) ws.add_entry(path) dist = ws.by_key.get(DIST_NAME) if dist is None: # We didn't find an egg-info/dist-info in the given path, but if a # setup.py exists we can generate it setup_py = os.path.join(path, 'setup.py') if os.path.isfile(setup_py): # We use subprocess instead of run_setup from setuptools to # avoid segmentation faults - see the following for more details: # https://github.com/cython/cython/issues/2104 sp.check_output([sys.executable, 'setup.py', 'egg_info'], cwd=path) for dist in pkg_resources.find_distributions(path, True): # There should be only one... 
return dist return dist def _do_download(self, version='', find_links=None): if find_links: allow_hosts = '' index_url = None else: allow_hosts = None index_url = self.index_url # Annoyingly, setuptools will not handle other arguments to # Distribution (such as options) before handling setup_requires, so it # is not straightforward to programmatically augment the arguments which # are passed to easy_install class _Distribution(Distribution): def get_option_dict(self, command_name): opts = Distribution.get_option_dict(self, command_name) if command_name == 'easy_install': if find_links is not None: opts['find_links'] = ('setup script', find_links) if index_url is not None: opts['index_url'] = ('setup script', index_url) if allow_hosts is not None: opts['allow_hosts'] = ('setup script', allow_hosts) return opts if version: req = '{0}=={1}'.format(DIST_NAME, version) else: if UPPER_VERSION_EXCLUSIVE is None: req = DIST_NAME else: req = '{0}<{1}'.format(DIST_NAME, UPPER_VERSION_EXCLUSIVE) attrs = {'setup_requires': [req]} # NOTE: we need to parse the config file (e.g. setup.cfg) to make sure # it honours the options set in the [easy_install] section, and we need # to explicitly fetch the requirement eggs as setup_requires does not # get honored in recent versions of setuptools: # https://github.com/pypa/setuptools/issues/1273 try: context = _verbose if DEBUG else _silence with context(): dist = _Distribution(attrs=attrs) try: dist.parse_config_files(ignore_option_errors=True) dist.fetch_build_eggs(req) except TypeError: # On older versions of setuptools, ignore_option_errors # doesn't exist, and the above two lines are not needed # so we can just continue pass # If the setup_requires succeeded it will have added the new dist to # the main working_set return pkg_resources.working_set.by_key.get(DIST_NAME) except Exception as e: if DEBUG: raise msg = 'Error retrieving {0} from {1}:\n{2}' if find_links: source = find_links[0] elif index_url != INDEX_URL: source = index_url else: source = 'PyPI' raise Exception(msg.format(DIST_NAME, source, repr(e))) def _do_upgrade(self, dist): # Build up a requirement for a higher bugfix release but a lower minor # release (so API compatibility is guaranteed) next_version = _next_version(dist.parsed_version) req = pkg_resources.Requirement.parse( '{0}>{1},<{2}'.format(DIST_NAME, dist.version, next_version)) package_index = PackageIndex(index_url=self.index_url) upgrade = package_index.obtain(req) if upgrade is not None: return self._do_download(version=upgrade.version) def _check_submodule(self): """ Check if the given path is a git submodule. See the docstrings for ``_check_submodule_using_git`` and ``_check_submodule_no_git`` for further details. """ if (self.path is None or (os.path.exists(self.path) and not os.path.isdir(self.path))): return False if self.use_git: return self._check_submodule_using_git() else: return self._check_submodule_no_git() def _check_submodule_using_git(self): """ Check if the given path is a git submodule. If so, attempt to initialize and/or update the submodule if needed. This function makes calls to the ``git`` command in subprocesses. The ``_check_submodule_no_git`` option uses pure Python to check if the given path looks like a git submodule, but it cannot perform updates. 
""" cmd = ['git', 'submodule', 'status', '--', self.path] try: log.info('Running `{0}`; use the --no-git option to disable git ' 'commands'.format(' '.join(cmd))) returncode, stdout, stderr = run_cmd(cmd) except _CommandNotFound: # The git command simply wasn't found; this is most likely the # case on user systems that don't have git and are simply # trying to install the package from PyPI or a source # distribution. Silently ignore this case and simply don't try # to use submodules return False stderr = stderr.strip() if returncode != 0 and stderr: # Unfortunately the return code alone cannot be relied on, as # earlier versions of git returned 0 even if the requested submodule # does not exist # This is a warning that occurs in perl (from running git submodule) # which only occurs with a malformatted locale setting which can # happen sometimes on OSX. See again # https://github.com/astropy/astropy/issues/2749 perl_warning = ('perl: warning: Falling back to the standard locale ' '("C").') if not stderr.strip().endswith(perl_warning): # Some other unknown error condition occurred log.warn('git submodule command failed ' 'unexpectedly:\n{0}'.format(stderr)) return False # Output of `git submodule status` is as follows: # # 1: Status indicator: '-' for submodule is uninitialized, '+' if # submodule is initialized but is not at the commit currently indicated # in .gitmodules (and thus needs to be updated), or 'U' if the # submodule is in an unstable state (i.e. has merge conflicts) # # 2. SHA-1 hash of the current commit of the submodule (we don't really # need this information but it's useful for checking that the output is # correct) # # 3. The output of `git describe` for the submodule's current commit # hash (this includes for example what branches the commit is on) but # only if the submodule is initialized. We ignore this information for # now _git_submodule_status_re = re.compile( r'^(?P[+-U ])(?P[0-9a-f]{40}) ' r'(?P\S+)( .*)?$') # The stdout should only contain one line--the status of the # requested submodule m = _git_submodule_status_re.match(stdout) if m: # Yes, the path *is* a git submodule self._update_submodule(m.group('submodule'), m.group('status')) return True else: log.warn( 'Unexpected output from `git submodule status`:\n{0}\n' 'Will attempt import from {1!r} regardless.'.format( stdout, self.path)) return False def _check_submodule_no_git(self): """ Like ``_check_submodule_using_git``, but simply parses the .gitmodules file to determine if the supplied path is a git submodule, and does not exec any subprocesses. This can only determine if a path is a submodule--it does not perform updates, etc. This function may need to be updated if the format of the .gitmodules file is changed between git versions. """ gitmodules_path = os.path.abspath('.gitmodules') if not os.path.isfile(gitmodules_path): return False # This is a minimal reader for gitconfig-style files. It handles a few of # the quirks that make gitconfig files incompatible with ConfigParser-style # files, but does not support the full gitconfig syntax (just enough # needed to read a .gitmodules file). 
gitmodules_fileobj = io.StringIO() # Must use io.open for cross-Python-compatible behavior wrt unicode with io.open(gitmodules_path) as f: for line in f: # gitconfig files are more flexible with leading whitespace; just # go ahead and remove it line = line.lstrip() # comments can start with either # or ; if line and line[0] in ('#', ';'): continue gitmodules_fileobj.write(line) gitmodules_fileobj.seek(0) cfg = RawConfigParser() try: cfg.readfp(gitmodules_fileobj) except Exception as exc: log.warn('Malformatted .gitmodules file: {0}\n' '{1} cannot be assumed to be a git submodule.'.format( exc, self.path)) return False for section in cfg.sections(): if not cfg.has_option(section, 'path'): continue submodule_path = cfg.get(section, 'path').rstrip(os.sep) if submodule_path == self.path.rstrip(os.sep): return True return False def _update_submodule(self, submodule, status): if status == ' ': # The submodule is up to date; no action necessary return elif status == '-': if self.offline: raise _AHBootstrapSystemExit( "Cannot initialize the {0} submodule in --offline mode; " "this requires being able to clone the submodule from an " "online repository.".format(submodule)) cmd = ['update', '--init'] action = 'Initializing' elif status == '+': cmd = ['update'] action = 'Updating' if self.offline: cmd.append('--no-fetch') elif status == 'U': raise _AHBootstrapSystemExit( 'Error: Submodule {0} contains unresolved merge conflicts. ' 'Please complete or abandon any changes in the submodule so that ' 'it is in a usable state, then try again.'.format(submodule)) else: log.warn('Unknown status {0!r} for git submodule {1!r}. Will ' 'attempt to use the submodule as-is, but try to ensure ' 'that the submodule is in a clean state and contains no ' 'conflicts or errors.\n{2}'.format(status, submodule, _err_help_msg)) return err_msg = None cmd = ['git', 'submodule'] + cmd + ['--', submodule] log.warn('{0} {1} submodule with: `{2}`'.format( action, submodule, ' '.join(cmd))) try: log.info('Running `{0}`; use the --no-git option to disable git ' 'commands'.format(' '.join(cmd))) returncode, stdout, stderr = run_cmd(cmd) except OSError as e: err_msg = str(e) else: if returncode != 0: err_msg = stderr if err_msg is not None: log.warn('An unexpected error occurred updating the git submodule ' '{0!r}:\n{1}\n{2}'.format(submodule, err_msg, _err_help_msg)) class _CommandNotFound(OSError): """ An exception raised when a command run with run_cmd is not found on the system. """ def run_cmd(cmd): """ Run a command in a subprocess, given as a list of command-line arguments. Returns a ``(returncode, stdout, stderr)`` tuple. """ try: p = sp.Popen(cmd, stdout=sp.PIPE, stderr=sp.PIPE) # XXX: May block if either stdout or stderr fill their buffers; # however for the commands this is currently used for that is # unlikely (they should have very brief output) stdout, stderr = p.communicate() except OSError as e: if DEBUG: raise if e.errno == errno.ENOENT: msg = 'Command not found: `{0}`'.format(' '.join(cmd)) raise _CommandNotFound(msg, cmd) else: raise _AHBootstrapSystemExit( 'An unexpected error occurred when running the ' '`{0}` command:\n{1}'.format(' '.join(cmd), str(e))) # Can fail if the default locale is not configured properly. See # https://github.com/astropy/astropy/issues/2749. For the purposes under # consideration 'latin1' is an acceptable fallback.
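# For example, on a correctly configured system locale.getdefaultlocale() # returns a tuple such as ('en_US', 'UTF-8') (illustrative values); the # second element is the encoding used for decoding below.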
try: stdio_encoding = locale.getdefaultlocale()[1] or 'latin1' except ValueError: # Due to an OSX oddity locale.getdefaultlocale() can also crash # depending on the user's locale/language settings. See: # http://bugs.python.org/issue18378 stdio_encoding = 'latin1' # Unlikely to fail at this point but even then let's be flexible if not isinstance(stdout, str): stdout = stdout.decode(stdio_encoding, 'replace') if not isinstance(stderr, str): stderr = stderr.decode(stdio_encoding, 'replace') return (p.returncode, stdout, stderr) def _next_version(version): """ Given a parsed version from pkg_resources.parse_version, returns a new version string with the next minor version. Examples ======== >>> _next_version(pkg_resources.parse_version('1.2.3')) '1.3.0' """ if hasattr(version, 'base_version'): # New version parsing from setuptools >= 8.0 if version.base_version: parts = version.base_version.split('.') else: parts = [] else: parts = [] for part in version: if part.startswith('*'): break parts.append(part) parts = [int(p) for p in parts] if len(parts) < 3: parts += [0] * (3 - len(parts)) major, minor, micro = parts[:3] return '{0}.{1}.{2}'.format(major, minor + 1, 0) class _DummyFile(object): """A noop writeable object.""" errors = '' # Required for Python 3.x encoding = 'utf-8' def write(self, s): pass def flush(self): pass @contextlib.contextmanager def _verbose(): yield @contextlib.contextmanager def _silence(): """A context manager that silences sys.stdout and sys.stderr.""" old_stdout = sys.stdout old_stderr = sys.stderr sys.stdout = _DummyFile() sys.stderr = _DummyFile() exception_occurred = False try: yield except: exception_occurred = True # Go ahead and clean up so that exception handling can work normally sys.stdout = old_stdout sys.stderr = old_stderr raise if not exception_occurred: sys.stdout = old_stdout sys.stderr = old_stderr _err_help_msg = """ If the problem persists consider installing astropy_helpers manually using pip (`pip install astropy_helpers`) or by manually downloading the source archive, extracting it, and installing by running `python setup.py install` from the root of the extracted source code. """ class _AHBootstrapSystemExit(SystemExit): def __init__(self, *args): if not args: msg = 'An unknown problem occurred bootstrapping astropy_helpers.' else: msg = args[0] msg += '\n' + _err_help_msg super(_AHBootstrapSystemExit, self).__init__(msg, *args[1:]) BOOTSTRAPPER = _Bootstrapper.main() def use_astropy_helpers(**kwargs): """ Ensure that the `astropy_helpers` module is available and is importable. This supports automatic submodule initialization if astropy_helpers is included in a project as a git submodule, or will download it from PyPI if necessary. Parameters ---------- path : str or None, optional A filesystem path relative to the root of the project's source code that should be added to `sys.path` so that `astropy_helpers` can be imported from that path. If the path is a git submodule it will automatically be initialized and/or updated. The path may also be to a ``.tar.gz`` archive of the astropy_helpers source distribution. In this case the archive is automatically unpacked and made temporarily available on `sys.path` as a ``.egg`` archive. If `None` skip straight to downloading. download_if_needed : bool, optional If the provided filesystem path is not found an attempt will be made to download astropy_helpers from PyPI. It will then be made temporarily available on `sys.path` as a ``.egg`` archive (using the ``setup_requires`` feature of setuptools).
If the ``--offline`` option is given at the command line the value of this argument is overridden to `False`. index_url : str, optional If provided, use a different URL for the Python package index than the main PyPI server. use_git : bool, optional If `False` no git commands will be used--this effectively disables support for git submodules. If the ``--no-git`` option is given at the command line the value of this argument is overridden to `False`. auto_upgrade : bool, optional By default, when installing a package from a non-development source distribution ah_bootstrap will try to automatically check for patch releases to astropy-helpers on PyPI and use the patched version over any bundled versions. Setting this to `False` will disable that functionality. If the ``--offline`` option is given at the command line the value of this argument is overridden to `False`. offline : bool, optional If `True` disable all actions that require an internet connection, including downloading packages from the package index and fetching updates to any git submodule. Defaults to `False`. """ global BOOTSTRAPPER config = BOOTSTRAPPER.config config.update(**kwargs) # Create a new bootstrapper with the updated configuration and run it BOOTSTRAPPER = _Bootstrapper(**config) BOOTSTRAPPER.run() ././@PaxHeader0000000000000000000000000000003400000000000011452 xustar000000000000000028 mtime=1605787217.3871849 ndcube-1.4.2/astropy_helpers/astropy_helpers/0000755000175100001640000000000000000000000021714 5ustar00vstsdocker00000000000000././@PaxHeader0000000000000000000000000000003400000000000011452 xustar000000000000000028 mtime=1605787216.9031825 ndcube-1.4.2/astropy_helpers/astropy_helpers/__init__.py0000644000175100001640000000331200000000000024024 0ustar00vstsdocker00000000000000try: from .version import version as __version__ from .version import githash as __githash__ except ImportError: __version__ = '' __githash__ = '' # If we've made it as far as importing astropy_helpers, we don't need # ah_bootstrap in sys.modules anymore. Getting rid of it is actually necessary # if the package we're installing has a setup_requires of another package that # uses astropy_helpers (and possibly a different version at that) # See https://github.com/astropy/astropy/issues/3541 import sys if 'ah_bootstrap' in sys.modules: del sys.modules['ah_bootstrap'] # Note, this is repeated from ah_bootstrap.py, but is here too in case this # astropy-helpers was upgraded to from an older version that did not have this # check in its ah_bootstrap. # matplotlib can cause problems if it is imported from within a call of # run_setup(), because in some circumstances it will try to write to the user's # home directory, resulting in a SandboxViolation.
See # https://github.com/matplotlib/matplotlib/pull/4165 # Making sure matplotlib, if it is available, is imported early in the setup # process can mitigate this (note importing matplotlib.pyplot has the same # issue) try: import matplotlib matplotlib.use('Agg') import matplotlib.pyplot except: # Ignore if this fails for *any* reason pass import os # Ensure that all module-level code in astropy or other packages knows that # we're in setup mode: if ('__main__' in sys.modules and hasattr(sys.modules['__main__'], '__file__')): filename = os.path.basename(sys.modules['__main__'].__file__) if filename.rstrip('co') == 'setup.py': import builtins builtins._ASTROPY_SETUP_ = True del filename ././@PaxHeader0000000000000000000000000000003400000000000011452 xustar000000000000000028 mtime=1605787217.3871849 ndcube-1.4.2/astropy_helpers/astropy_helpers/commands/0000755000175100001640000000000000000000000023515 5ustar00vstsdocker00000000000000././@PaxHeader0000000000000000000000000000003400000000000011452 xustar000000000000000028 mtime=1605787216.9031825 ndcube-1.4.2/astropy_helpers/astropy_helpers/commands/__init__.py0000644000175100001640000000000000000000000025614 0ustar00vstsdocker00000000000000././@PaxHeader0000000000000000000000000000003400000000000011452 xustar000000000000000028 mtime=1605787216.9031825 ndcube-1.4.2/astropy_helpers/astropy_helpers/commands/_dummy.py0000644000175100001640000000526200000000000025366 0ustar00vstsdocker00000000000000""" Provides a base class for a 'dummy' setup.py command that has no functionality (probably due to a missing requirement). This dummy command can raise an exception when it is run, explaining to the user what dependencies must be met to use this command. The reason this is at all tricky is that we want the command to be able to provide this message even when the user passes arguments to the command. If we don't know ahead of time what arguments the command can take, this is difficult, because distutils does not allow unknown arguments to be passed to a setup.py command. This hacks around that restriction to provide a useful error message even when a user passes arguments to the dummy implementation of a command. Use this like: try: from some_dependency import SetupCommand except ImportError: from ._dummy import _DummyCommand class SetupCommand(_DummyCommand): description = \ 'Implementation of SetupCommand from some_dependency; ' 'some_dependency must be installed to run this command' # This is the message that will be raised when a user tries to # run this command--define it as a class attribute. error_msg = \ "The 'setup_command' command requires the some_dependency " "package to be installed and importable." """ import sys from setuptools import Command from distutils.errors import DistutilsArgError from textwrap import dedent class _DummyCommandMeta(type): """ Causes an exception to be raised on accessing attributes of a command class so that if ``./setup.py command_name`` is run with additional command-line options we can provide a useful error message instead of the default that tells users the options are unrecognized.
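For example (a hypothetical scenario), if a package's ``build_docs`` command were replaced by a ``_DummyCommand`` subclass because Sphinx is missing, then running ``./setup.py build_docs -w`` would raise ``DistutilsArgError`` carrying the subclass's ``error_msg``, rather than distutils' generic "option -w not recognized" error.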
""" def __init__(cls, name, bases, members): if bases == (Command, object): # This is the _DummyCommand base class, presumably return if not hasattr(cls, 'description'): raise TypeError( "_DummyCommand subclass must have a 'description' " "attribute.") if not hasattr(cls, 'error_msg'): raise TypeError( "_DummyCommand subclass must have an 'error_msg' " "attribute.") def __getattribute__(cls, attr): if attr in ('description', 'error_msg'): # Allow cls.description to work so that `./setup.py # --help-commands` still works return super(_DummyCommandMeta, cls).__getattribute__(attr) raise DistutilsArgError(cls.error_msg) class _DummyCommand(Command, object, metaclass=_DummyCommandMeta): pass ././@PaxHeader0000000000000000000000000000003400000000000011452 xustar000000000000000028 mtime=1605787216.9031825 ndcube-1.4.2/astropy_helpers/astropy_helpers/commands/build_ext.py0000644000175100001640000002074200000000000026053 0ustar00vstsdocker00000000000000import errno import os import shutil from distutils.core import Extension from distutils.ccompiler import get_default_compiler from distutils.command.build_ext import build_ext as DistutilsBuildExt from ..distutils_helpers import get_main_package_directory from ..utils import get_numpy_include_path, import_file __all__ = ['AstropyHelpersBuildExt'] def should_build_with_cython(previous_cython_version, is_release): """ Returns the previously used Cython version (or 'unknown' if not previously built) if Cython should be used to build extension modules from pyx files. """ # Only build with Cython if, of course, Cython is installed, we're in a # development version (i.e. not release) or the Cython-generated source # files haven't been created yet (cython_version == 'unknown'). The latter # case can happen even when release is True if checking out a release tag # from the repository have_cython = False try: from Cython import __version__ as cython_version # noqa have_cython = True except ImportError: pass if have_cython and (not is_release or previous_cython_version == 'unknown'): return cython_version else: return False class AstropyHelpersBuildExt(DistutilsBuildExt): """ A custom 'build_ext' command that allows for manipulating some of the C extension options at build time. """ _uses_cython = False _force_rebuild = False def __new__(cls, value, **kwargs): # NOTE: we need to wait until AstropyHelpersBuildExt is initialized to # import setuptools.command.build_ext because when that package is # imported, setuptools tries to import Cython - and if it's not found # it will affect the rest of the build process. This is an issue because # if we import that module at the top of this one, setup_requires won't # have been honored yet, so Cython may not yet be available - and if we # import build_ext too soon, it will think Cython is not available even # if it is then intalled when setup_requires is processed. To get around # this we dynamically create a new class that inherits from the # setuptools build_ext, and by this point setup_requires has been # processed. 
from setuptools.command.build_ext import build_ext as SetuptoolsBuildExt class FinalBuildExt(AstropyHelpersBuildExt, SetuptoolsBuildExt): pass new_type = type(cls.__name__, (FinalBuildExt,), dict(cls.__dict__)) obj = SetuptoolsBuildExt.__new__(new_type) obj.__init__(value) return obj def finalize_options(self): # First let's find the package folder, then we can check if the # version and cython_version are accessible self.package_dir = get_main_package_directory(self.distribution) version = import_file(os.path.join(self.package_dir, 'version.py'), name='version').version self.is_release = 'dev' not in version try: self.previous_cython_version = import_file(os.path.join(self.package_dir, 'cython_version.py'), name='cython_version').cython_version except (FileNotFoundError, ImportError): self.previous_cython_version = 'unknown' self._uses_cython = should_build_with_cython(self.previous_cython_version, self.is_release) # Add a copy of the _compiler.so module as well, but only if there # are in fact C modules to compile (otherwise there's no reason to # include a record of the compiler used). Note that self.extensions # may not be set yet, but self.distribution.ext_modules is where any # extension modules passed to setup() can be found extensions = self.distribution.ext_modules if extensions: build_py = self.get_finalized_command('build_py') package_dir = build_py.get_package_dir(self.package_dir) src_path = os.path.relpath( os.path.join(os.path.dirname(__file__), 'src')) shutil.copy(os.path.join(src_path, 'compiler.c'), os.path.join(package_dir, '_compiler.c')) ext = Extension(self.package_dir + '.compiler_version', [os.path.join(package_dir, '_compiler.c')]) extensions.insert(0, ext) super().finalize_options() # If we are using Cython, then make sure we re-build if the version # of Cython that is installed is different from the version last # used to generate the C files. if self._uses_cython and self._uses_cython != self.previous_cython_version: self._force_rebuild = True # Regardless of the value of the '--force' option, force a rebuild # if the debug flag changed from the last build if self._force_rebuild: self.force = True def run(self): # For extensions that require 'numpy' in their include dirs, # replace 'numpy' with the actual paths np_include = None for extension in self.extensions: if 'numpy' in extension.include_dirs: if np_include is None: np_include = get_numpy_include_path() idx = extension.include_dirs.index('numpy') extension.include_dirs.insert(idx, np_include) extension.include_dirs.remove('numpy') self._check_cython_sources(extension) # Note that setuptools automatically uses Cython to discover and # build extensions if available, so we don't have to explicitly call # e.g. cythonize. 
super().run() # Update cython_version.py if building with Cython if self._uses_cython and self._uses_cython != self.previous_cython_version: build_py = self.get_finalized_command('build_py') package_dir = build_py.get_package_dir(self.package_dir) cython_py = os.path.join(package_dir, 'cython_version.py') with open(cython_py, 'w') as f: f.write('# Generated file; do not modify\n') f.write('cython_version = {0!r}\n'.format(self._uses_cython)) if os.path.isdir(self.build_lib): # The build/lib directory may not exist if the build_py # command was not previously run, which may sometimes be # the case self.copy_file(cython_py, os.path.join(self.build_lib, cython_py), preserve_mode=False) def _check_cython_sources(self, extension): """ Where relevant, make sure that the .c files associated with .pyx modules are present (if building without Cython installed). """ # Determine the compiler we'll be using if self.compiler is None: compiler = get_default_compiler() else: compiler = self.compiler # Replace .pyx with C-equivalents, unless c files are missing for jdx, src in enumerate(extension.sources): base, ext = os.path.splitext(src) pyxfn = base + '.pyx' cfn = base + '.c' cppfn = base + '.cpp' if not os.path.isfile(pyxfn): continue if self._uses_cython: extension.sources[jdx] = pyxfn else: if os.path.isfile(cfn): extension.sources[jdx] = cfn elif os.path.isfile(cppfn): extension.sources[jdx] = cppfn else: msg = ( 'Could not find C/C++ file {0}.(c/cpp) for Cython ' 'file {1} when building extension {2}. Cython ' 'must be installed to build from a git ' 'checkout.'.format(base, pyxfn, extension.name)) raise IOError(errno.ENOENT, msg, cfn) # Cython (at least as of 0.29.2) uses deprecated Numpy API features # the use of which produces a few warnings when compiling. # These additional flags should squelch those warnings. # TODO: Feel free to remove this if/when a Cython update # removes use of the deprecated Numpy API if compiler == 'unix': extension.extra_compile_args.extend([ '-Wp,-w', '-Wno-unused-function']) ././@PaxHeader0000000000000000000000000000003400000000000011452 xustar000000000000000028 mtime=1605787216.9031825 ndcube-1.4.2/astropy_helpers/astropy_helpers/commands/build_sphinx.py0000644000175100001640000002177700000000000026575 0ustar00vstsdocker00000000000000from __future__ import print_function import os import pkgutil import re import shutil import subprocess import sys from distutils.version import LooseVersion from distutils import log from sphinx import __version__ as sphinx_version from sphinx.setup_command import BuildDoc as SphinxBuildDoc SPHINX_LT_16 = LooseVersion(sphinx_version) < LooseVersion('1.6') SPHINX_LT_17 = LooseVersion(sphinx_version) < LooseVersion('1.7') SUBPROCESS_TEMPLATE = """ import os import sys {build_main} os.chdir({srcdir!r}) {sys_path_inserts} for builder in {builders!r}: retcode = build_main(argv={argv!r} + ['-b', builder, '.', os.path.join({output_dir!r}, builder)]) if retcode != 0: sys.exit(retcode) """ def ensure_sphinx_astropy_installed(): """ Make sure that sphinx-astropy is available. This returns the available version of sphinx-astropy as well as any paths that should be added to sys.path for sphinx-astropy to be available. """ # We've split out the Sphinx part of astropy-helpers into sphinx-astropy # but we want it to be auto-installed seamlessly for anyone using # build_docs. 
We check if it's already installed, and if not, we install # it to a local .eggs directory and add the eggs to the path (these # have to each be added to the path, we can't add them by simply adding # .eggs to the path) sys_path_inserts = [] sphinx_astropy_version = None try: from sphinx_astropy import __version__ as sphinx_astropy_version # noqa except ImportError: raise ImportError("sphinx-astropy needs to be installed to build" "the documentation.") return sphinx_astropy_version, sys_path_inserts class AstropyBuildDocs(SphinxBuildDoc): """ A version of the ``build_docs`` command that uses the version of Astropy that is built by the setup ``build`` command, rather than whatever is installed on the system. To build docs against the installed version, run ``make html`` in the ``astropy/docs`` directory. """ description = 'Build Sphinx documentation for Astropy environment' user_options = SphinxBuildDoc.user_options[:] user_options.append( ('warnings-returncode', 'w', 'Parses the sphinx output and sets the return code to 1 if there ' 'are any warnings. Note that this will cause the sphinx log to ' 'only update when it completes, rather than continuously as is ' 'normally the case.')) user_options.append( ('clean-docs', 'l', 'Completely clean previous builds, including ' 'automodapi-generated files before building new ones')) user_options.append( ('no-intersphinx', 'n', 'Skip intersphinx, even if conf.py says to use it')) user_options.append( ('open-docs-in-browser', 'o', 'Open the docs in a browser (using the webbrowser module) if the ' 'build finishes successfully.')) boolean_options = SphinxBuildDoc.boolean_options[:] boolean_options.append('warnings-returncode') boolean_options.append('clean-docs') boolean_options.append('no-intersphinx') boolean_options.append('open-docs-in-browser') _self_iden_rex = re.compile(r"self\.([^\d\W][\w]+)", re.UNICODE) def initialize_options(self): SphinxBuildDoc.initialize_options(self) self.clean_docs = False self.no_intersphinx = False self.open_docs_in_browser = False self.warnings_returncode = False self.traceback = False def finalize_options(self): # This has to happen before we call the parent class's finalize_options if self.build_dir is None: self.build_dir = 'docs/_build' SphinxBuildDoc.finalize_options(self) # Clear out previous sphinx builds, if requested if self.clean_docs: dirstorm = [os.path.join(self.source_dir, 'api'), os.path.join(self.source_dir, 'generated')] dirstorm.append(self.build_dir) for d in dirstorm: if os.path.isdir(d): log.info('Cleaning directory ' + d) shutil.rmtree(d) else: log.info('Not cleaning directory ' + d + ' because ' 'not present or not a directory') def run(self): # TODO: Break this method up into a few more subroutines and # document them better import webbrowser from urllib.request import pathname2url # This is used at the very end of `run` to decide if sys.exit should # be called. If it's None, it won't be. retcode = None # Now make sure Astropy is built and determine where it was built build_cmd = self.reinitialize_command('build') build_cmd.inplace = 0 self.run_command('build') build_cmd = self.get_finalized_command('build') build_cmd_path = os.path.abspath(build_cmd.build_lib) ah_importer = pkgutil.get_importer('astropy_helpers') if ah_importer is None: ah_path = '.' 
else: ah_path = os.path.abspath(ah_importer.path) if SPHINX_LT_17: build_main = 'from sphinx import build_main' else: build_main = 'from sphinx.cmd.build import build_main' # We need to make sure sphinx-astropy is installed sphinx_astropy_version, extra_paths = ensure_sphinx_astropy_installed() sys_path_inserts = [build_cmd_path, ah_path] + extra_paths sys_path_inserts = os.linesep.join(['sys.path.insert(0, {0!r})'.format(path) for path in sys_path_inserts]) argv = [] if self.warnings_returncode: argv.append('-W') if self.no_intersphinx: # Note, if sphinx_astropy_version is None, this could indicate an # old version of setuptools, but sphinx-astropy is likely ok, so # we can proceed. if sphinx_astropy_version is None or LooseVersion(sphinx_astropy_version) >= LooseVersion('1.1'): argv.extend(['-D', 'disable_intersphinx=1']) else: log.warn('The -n option to disable intersphinx requires ' 'sphinx-astropy>=1.1. Ignoring.') # We now need to adjust the flags based on the parent class's options if self.fresh_env: argv.append('-E') if self.all_files: argv.append('-a') if getattr(self, 'pdb', False): argv.append('-P') if getattr(self, 'nitpicky', False): argv.append('-n') if self.traceback: argv.append('-T') # The default verbosity level is 1, so in that case we just don't add a flag if self.verbose == 0: argv.append('-q') elif self.verbose > 1: argv.append('-v') if SPHINX_LT_17: argv.insert(0, 'sphinx-build') if isinstance(self.builder, str): builders = [self.builder] else: builders = self.builder subproccode = SUBPROCESS_TEMPLATE.format(build_main=build_main, srcdir=self.source_dir, sys_path_inserts=sys_path_inserts, builders=builders, argv=argv, output_dir=os.path.abspath(self.build_dir)) log.debug('Starting subprocess of {0} with python code:\n{1}\n' '[CODE END])'.format(sys.executable, subproccode)) proc = subprocess.Popen([sys.executable], stdin=subprocess.PIPE) proc.communicate(subproccode.encode('utf-8')) if proc.returncode != 0: retcode = proc.returncode if retcode is None: if self.open_docs_in_browser: if self.builder == 'html': absdir = os.path.abspath(self.builder_target_dir) index_path = os.path.join(absdir, 'index.html') fileurl = 'file://' + pathname2url(index_path) webbrowser.open(fileurl) else: log.warn('open-docs-in-browser option was given, but ' 'the builder is not html! Ignoring.') # Here we explicitly check proc.returncode since we only want to output # this for cases where the return code really wasn't 0. if proc.returncode: log.warn('Sphinx Documentation subprocess failed with return ' 'code ' + str(proc.returncode)) if retcode is not None: # this is potentially dangerous in that there might be something # after the call to `setup` in `setup.py`, and exiting here will # prevent that from running. But there's no other apparent way # to signal what the return code should be. 
            sys.exit(retcode)


class AstropyBuildSphinx(AstropyBuildDocs):  # pragma: no cover
    def run(self):
        AstropyBuildDocs.run(self)
././@PaxHeader0000000000000000000000000000003400000000000011452 xustar000000000000000028 mtime=1605787217.3871849 ndcube-1.4.2/astropy_helpers/astropy_helpers/commands/src/0000755000175100001640000000000000000000000024304 5ustar00vstsdocker00000000000000././@PaxHeader0000000000000000000000000000003400000000000011452 xustar000000000000000028 mtime=1605787216.9031825 ndcube-1.4.2/astropy_helpers/astropy_helpers/commands/src/compiler.c0000644000175100001640000000524000000000000026263 0ustar00vstsdocker00000000000000#include <Python.h>

/***************************************************************************
 * Macros for determining the compiler version.
 *
 * These are borrowed from boost, and majorly abridged to include only
 * the compilers we care about.
 ***************************************************************************/

#define STRINGIZE(X) DO_STRINGIZE(X)
#define DO_STRINGIZE(X) #X

#if defined __clang__
/* Clang C++ emulates GCC, so it has to appear early. */
# define COMPILER "Clang version " __clang_version__
#elif defined(__INTEL_COMPILER) || defined(__ICL) || defined(__ICC) || defined(__ECC)
/* Intel */
# if defined(__INTEL_COMPILER)
#  define INTEL_VERSION __INTEL_COMPILER
# elif defined(__ICL)
#  define INTEL_VERSION __ICL
# elif defined(__ICC)
#  define INTEL_VERSION __ICC
# elif defined(__ECC)
#  define INTEL_VERSION __ECC
# endif
# define COMPILER "Intel C compiler version " STRINGIZE(INTEL_VERSION)
#elif defined(__GNUC__)
/* gcc */
# define COMPILER "GCC version " __VERSION__
#elif defined(__SUNPRO_CC)
/* Sun Workshop Compiler */
# define COMPILER "Sun compiler version " STRINGIZE(__SUNPRO_CC)
#elif defined(_MSC_VER)
/* Microsoft Visual C/C++
   Must be last since other compilers define _MSC_VER for compatibility as well */
# if _MSC_VER < 1200
#  define COMPILER_VERSION 5.0
# elif _MSC_VER < 1300
#  define COMPILER_VERSION 6.0
# elif _MSC_VER == 1300
#  define COMPILER_VERSION 7.0
# elif _MSC_VER == 1310
#  define COMPILER_VERSION 7.1
# elif _MSC_VER == 1400
#  define COMPILER_VERSION 8.0
# elif _MSC_VER == 1500
#  define COMPILER_VERSION 9.0
# elif _MSC_VER == 1600
#  define COMPILER_VERSION 10.0
# else
#  define COMPILER_VERSION _MSC_VER
# endif
# define COMPILER "Microsoft Visual C++ version " STRINGIZE(COMPILER_VERSION)
#else
/* Fallback */
# define COMPILER "Unknown compiler"
#endif


/***************************************************************************
 * Module-level
 ***************************************************************************/
struct module_state {
    /* The Sun compiler can't handle empty structs */
#if defined(__SUNPRO_C) || defined(_MSC_VER)
    int _dummy;
#endif
};

static struct PyModuleDef moduledef = {
    PyModuleDef_HEAD_INIT,
    "compiler_version",
    NULL,
    sizeof(struct module_state),
    NULL,
    NULL,
    NULL,
    NULL,
    NULL
};

#define INITERROR return NULL

PyMODINIT_FUNC
PyInit_compiler_version(void)
{
    PyObject* m;

    m = PyModule_Create(&moduledef);

    if (m == NULL)
        INITERROR;

    PyModule_AddStringConstant(m, "compiler", COMPILER);

    return m;
}
././@PaxHeader0000000000000000000000000000003400000000000011452 xustar000000000000000028 mtime=1605787216.9031825 ndcube-1.4.2/astropy_helpers/astropy_helpers/commands/test.py0000644000175100001640000000267200000000000025055 0ustar00vstsdocker00000000000000"""
Different implementations of the ``./setup.py test`` command depending on
what's locally available.
If Astropy v1.1 or later is available it should be possible to import AstropyTest from ``astropy.tests.command``. Otherwise there is a skeleton implementation that allows users to at least discover the ``./setup.py test`` command and learn that they need Astropy to run it. """ import os from ..utils import import_file # Previously these except statements caught only ImportErrors, but there are # some other obscure exceptional conditions that can occur when importing # astropy.tests (at least on older versions) that can cause these imports to # fail try: # If we are testing astropy itself, we need to use import_file to avoid # actually importing astropy (just the file we need). command_file = os.path.join('astropy', 'tests', 'command.py') if os.path.exists(command_file): AstropyTest = import_file(command_file, 'astropy_tests_command').AstropyTest else: import astropy # noqa from astropy.tests.command import AstropyTest except Exception: # No astropy at all--provide the dummy implementation from ._dummy import _DummyCommand class AstropyTest(_DummyCommand): command_name = 'test' description = 'Run the tests for this package' error_msg = ( "The 'test' command requires the astropy package to be " "installed and importable.") ././@PaxHeader0000000000000000000000000000003400000000000011452 xustar000000000000000028 mtime=1605787216.9031825 ndcube-1.4.2/astropy_helpers/astropy_helpers/conftest.py0000644000175100001640000000361200000000000024115 0ustar00vstsdocker00000000000000# This file contains settings for pytest that are specific to astropy-helpers. # Since we run many of the tests in sub-processes, we need to collect coverage # data inside each subprocess and then combine it into a single .coverage file. # To do this we set up a list which run_setup appends coverage objects to. # This is not intended to be used by packages other than astropy-helpers. import os import glob try: from coverage import CoverageData except ImportError: HAS_COVERAGE = False else: HAS_COVERAGE = True if HAS_COVERAGE: SUBPROCESS_COVERAGE = [] def pytest_configure(config): if HAS_COVERAGE: SUBPROCESS_COVERAGE.clear() def pytest_unconfigure(config): if HAS_COVERAGE: # We create an empty coverage data object combined_cdata = CoverageData() # Add all files from astropy_helpers to make sure we compute the total # coverage, not just the coverage of the files that have non-zero # coverage. lines = {} for filename in glob.glob(os.path.join('astropy_helpers', '**', '*.py'), recursive=True): lines[os.path.abspath(filename)] = [] for cdata in SUBPROCESS_COVERAGE: # For each CoverageData object, we go through all the files and # change the filename from one which might be a temporary path # to the local filename. We then only keep files that actually # exist. for filename in cdata.measured_files(): try: pos = filename.rindex('astropy_helpers') except ValueError: continue short_filename = filename[pos:] if os.path.exists(short_filename): lines[os.path.abspath(short_filename)].extend(cdata.lines(filename)) combined_cdata.add_lines(lines) combined_cdata.write_file('.coverage.subprocess') ././@PaxHeader0000000000000000000000000000003400000000000011452 xustar000000000000000028 mtime=1605787216.9031825 ndcube-1.4.2/astropy_helpers/astropy_helpers/distutils_helpers.py0000644000175100001640000001764400000000000026050 0ustar00vstsdocker00000000000000""" This module contains various utilities for introspecting the distutils module and the setup process. 
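(For instance, ``get_distutils_build_option('debug')`` below reports the
value of a ``--debug`` flag passed to any of the build commands, and
``is_distutils_display_option()`` reports whether a metadata-only argument
such as ``--version`` was given.)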
Some of these utilities require the
`astropy_helpers.setup_helpers.register_commands` function to be called first,
as it will affect introspection of setuptools command-line arguments. Other
utilities in this module do not have that restriction.
"""

import os
import sys

from distutils import ccompiler, log
from distutils.dist import Distribution
from distutils.errors import DistutilsError

from .utils import silence


# This function, and any functions that call it, require the setup in
# `astropy_helpers.setup_helpers.register_commands` to be run first.
def get_dummy_distribution():
    """
    Returns a distutils Distribution object used to instrument the setup
    environment before calling the actual setup() function.
    """

    from .setup_helpers import _module_state

    if _module_state['registered_commands'] is None:
        raise RuntimeError(
            'astropy_helpers.setup_helpers.register_commands() must be '
            'called before using '
            'astropy_helpers.setup_helpers.get_dummy_distribution()')

    # Pre-parse the Distutils command-line options and config files to see if
    # the option is set.
    dist = Distribution({'script_name': os.path.basename(sys.argv[0]),
                         'script_args': sys.argv[1:]})
    dist.cmdclass.update(_module_state['registered_commands'])

    with silence():
        try:
            dist.parse_config_files()
            dist.parse_command_line()
        except (DistutilsError, AttributeError, SystemExit):
            # Let distutils handle DistutilsErrors itself.
            # AttributeErrors can get raised for ./setup.py --help.
            # SystemExit can be raised if a display option was used,
            # for example.
            pass

    return dist


def get_main_package_directory(distribution):
    """
    Given a Distribution object, return the main package directory.
    """
    return min(distribution.packages, key=len)


def get_distutils_option(option, commands):
    """
    Returns the value of the given distutils option.

    Parameters
    ----------
    option : str
        The name of the option

    commands : list of str
        The list of commands on which this option is available

    Returns
    -------
    val : str or None
        The value of the given distutils option. If the option is not set,
        returns None.
    """
    dist = get_dummy_distribution()

    for cmd in commands:
        cmd_opts = dist.command_options.get(cmd)
        if cmd_opts is not None and option in cmd_opts:
            return cmd_opts[option][1]
    else:
        return None


def get_distutils_build_option(option):
    """
    Returns the value of the given distutils build option.

    Parameters
    ----------
    option : str
        The name of the option

    Returns
    -------
    val : str or None
        The value of the given distutils build option. If the option is
        not set, returns None.
    """
    return get_distutils_option(option, ['build', 'build_ext', 'build_clib'])


def get_distutils_install_option(option):
    """
    Returns the value of the given distutils install option.

    Parameters
    ----------
    option : str
        The name of the option

    Returns
    -------
    val : str or None
        The value of the given distutils install option. If the option is
        not set, returns None.
    """
    return get_distutils_option(option, ['install'])


def get_distutils_build_or_install_option(option):
    """
    Returns the value of the given distutils build or install option.

    Parameters
    ----------
    option : str
        The name of the option

    Returns
    -------
    val : str or None
        The value of the given distutils build or install option. If the
        option is not set, returns None.
    """
    return get_distutils_option(option, ['build', 'build_ext', 'build_clib',
                                         'install'])


def get_compiler_option():
    """
    Determines the compiler that will be used to build extension modules.
Returns ------- compiler : str The compiler option specified for the build, build_ext, or build_clib command; or the default compiler for the platform if none was specified. """ compiler = get_distutils_build_option('compiler') if compiler is None: return ccompiler.get_default_compiler() return compiler def add_command_option(command, name, doc, is_bool=False): """ Add a custom option to a setup command. Issues a warning if the option already exists on that command. Parameters ---------- command : str The name of the command as given on the command line name : str The name of the build option doc : str A short description of the option, for the `--help` message is_bool : bool, optional When `True`, the option is a boolean option and doesn't require an associated value. """ dist = get_dummy_distribution() cmdcls = dist.get_command_class(command) if (hasattr(cmdcls, '_astropy_helpers_options') and name in cmdcls._astropy_helpers_options): return attr = name.replace('-', '_') if hasattr(cmdcls, attr): raise RuntimeError( '{0!r} already has a {1!r} class attribute, barring {2!r} from ' 'being usable as a custom option name.'.format(cmdcls, attr, name)) for idx, cmd in enumerate(cmdcls.user_options): if cmd[0] == name: log.warn('Overriding existing {0!r} option ' '{1!r}'.format(command, name)) del cmdcls.user_options[idx] if name in cmdcls.boolean_options: cmdcls.boolean_options.remove(name) break cmdcls.user_options.append((name, None, doc)) if is_bool: cmdcls.boolean_options.append(name) # Distutils' command parsing requires that a command object have an # attribute with the same name as the option (with '-' replaced with '_') # in order for that option to be recognized as valid setattr(cmdcls, attr, None) # This caches the options added through add_command_option so that if it is # run multiple times in the same interpreter repeated adds are ignored # (this way we can still raise a RuntimeError if a custom option overrides # a built-in option) if not hasattr(cmdcls, '_astropy_helpers_options'): cmdcls._astropy_helpers_options = set([name]) else: cmdcls._astropy_helpers_options.add(name) def get_distutils_display_options(): """ Returns a set of all the distutils display options in their long and short forms. These are the setup.py arguments such as --name or --version which print the project's metadata and then exit. Returns ------- opts : set The long and short form display option arguments, including the - or -- """ short_display_opts = set('-' + o[1] for o in Distribution.display_options if o[1]) long_display_opts = set('--' + o[0] for o in Distribution.display_options) # Include -h and --help which are not explicitly listed in # Distribution.display_options (as they are handled by optparse) short_display_opts.add('-h') long_display_opts.add('--help') # This isn't the greatest approach to hardcode these commands. # However, there doesn't seem to be a good way to determine # whether build *will be* run as part of the command at this # phase. display_commands = set([ 'clean', 'register', 'setopt', 'saveopts', 'egg_info', 'alias']) return short_display_opts.union(long_display_opts.union(display_commands)) def is_distutils_display_option(): """ Returns True if sys.argv contains any of the distutils display options such as --version or --name. 
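    For example (illustrative values)::

        sys.argv = ['setup.py', '--version']
        is_distutils_display_option()       # True -- metadata-only option

        sys.argv = ['setup.py', 'build_ext', '--inplace']
        is_distutils_display_option()       # False -- a real build was requested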
""" display_options = get_distutils_display_options() return bool(set(sys.argv[1:]).intersection(display_options)) ././@PaxHeader0000000000000000000000000000003400000000000011452 xustar000000000000000028 mtime=1605787216.9031825 ndcube-1.4.2/astropy_helpers/astropy_helpers/git_helpers.py0000644000175100001640000001453700000000000024605 0ustar00vstsdocker00000000000000# Licensed under a 3-clause BSD style license - see LICENSE.rst """ Utilities for retrieving revision information from a project's git repository. """ # Do not remove the following comment; it is used by # astropy_helpers.version_helpers to determine the beginning of the code in # this module # BEGIN import locale import os import subprocess import warnings __all__ = ['get_git_devstr'] def _decode_stdio(stream): try: stdio_encoding = locale.getdefaultlocale()[1] or 'utf-8' except ValueError: stdio_encoding = 'utf-8' try: text = stream.decode(stdio_encoding) except UnicodeDecodeError: # Final fallback text = stream.decode('latin1') return text def update_git_devstr(version, path=None): """ Updates the git revision string if and only if the path is being imported directly from a git working copy. This ensures that the revision number in the version string is accurate. """ try: # Quick way to determine if we're in git or not - returns '' if not devstr = get_git_devstr(sha=True, show_warning=False, path=path) except OSError: return version if not devstr: # Probably not in git so just pass silently return version if 'dev' in version: # update to the current git revision version_base = version.split('.dev', 1)[0] devstr = get_git_devstr(sha=False, show_warning=False, path=path) return version_base + '.dev' + devstr else: # otherwise it's already the true/release version return version def get_git_devstr(sha=False, show_warning=True, path=None): """ Determines the number of revisions in this repository. Parameters ---------- sha : bool If True, the full SHA1 hash will be returned. Otherwise, the total count of commits in the repository will be used as a "revision number". show_warning : bool If True, issue a warning if git returns an error code, otherwise errors pass silently. path : str or None If a string, specifies the directory to look in to find the git repository. If `None`, the current working directory is used, and must be the root of the git repository. If given a filename it uses the directory containing that file. Returns ------- devversion : str Either a string with the revision number (if `sha` is False), the SHA1 hash of the current commit (if `sha` is True), or an empty string if git version info could not be identified. """ if path is None: path = os.getcwd() if not os.path.isdir(path): path = os.path.abspath(os.path.dirname(path)) if sha: # Faster for getting just the hash of HEAD cmd = ['rev-parse', 'HEAD'] else: cmd = ['rev-list', '--count', 'HEAD'] def run_git(cmd): try: p = subprocess.Popen(['git'] + cmd, cwd=path, stdout=subprocess.PIPE, stderr=subprocess.PIPE, stdin=subprocess.PIPE) stdout, stderr = p.communicate() except OSError as e: if show_warning: warnings.warn('Error running git: ' + str(e)) return (None, b'', b'') if p.returncode == 128: if show_warning: warnings.warn('No git repository present at {0!r}! 
Using ' 'default dev version.'.format(path)) return (p.returncode, b'', b'') if p.returncode == 129: if show_warning: warnings.warn('Your git looks old (does it support {0}?); ' 'consider upgrading to v1.7.2 or ' 'later.'.format(cmd[0])) return (p.returncode, stdout, stderr) elif p.returncode != 0: if show_warning: warnings.warn('Git failed while determining revision ' 'count: {0}'.format(_decode_stdio(stderr))) return (p.returncode, stdout, stderr) return p.returncode, stdout, stderr returncode, stdout, stderr = run_git(cmd) if not sha and returncode == 128: # git returns 128 if the command is not run from within a git # repository tree. In this case, a warning is produced above but we # return the default dev version of '0'. return '0' elif not sha and returncode == 129: # git returns 129 if a command option failed to parse; in # particular this could happen in git versions older than 1.7.2 # where the --count option is not supported # Also use --abbrev-commit and --abbrev=0 to display the minimum # number of characters needed per-commit (rather than the full hash) cmd = ['rev-list', '--abbrev-commit', '--abbrev=0', 'HEAD'] returncode, stdout, stderr = run_git(cmd) # Fall back on the old method of getting all revisions and counting # the lines if returncode == 0: return str(stdout.count(b'\n')) else: return '' elif sha: return _decode_stdio(stdout)[:40] else: return _decode_stdio(stdout).strip() # This function is tested but it is only ever executed within a subprocess when # creating a fake package, so it doesn't get picked up by coverage metrics. def _get_repo_path(pathname, levels=None): # pragma: no cover """ Given a file or directory name, determine the root of the git repository this path is under. If given, this won't look any higher than ``levels`` (that is, if ``levels=0`` then the given path must be the root of the git repository and is returned if so. Returns `None` if the given path could not be determined to belong to a git repo. """ if os.path.isfile(pathname): current_dir = os.path.abspath(os.path.dirname(pathname)) elif os.path.isdir(pathname): current_dir = os.path.abspath(pathname) else: return None current_level = 0 while levels is None or current_level <= levels: if os.path.exists(os.path.join(current_dir, '.git')): return current_dir current_level += 1 if current_dir == os.path.dirname(current_dir): break current_dir = os.path.dirname(current_dir) return None ././@PaxHeader0000000000000000000000000000003400000000000011452 xustar000000000000000028 mtime=1605787216.9031825 ndcube-1.4.2/astropy_helpers/astropy_helpers/openmp_helpers.py0000644000175100001640000002256700000000000025322 0ustar00vstsdocker00000000000000# This module defines functions that can be used to check whether OpenMP is # available and if so what flags to use. To use this, import the # add_openmp_flags_if_available function in a setup_package.py file where you # are defining your extensions: # # from astropy_helpers.openmp_helpers import add_openmp_flags_if_available # # then call it with a single extension as the only argument: # # add_openmp_flags_if_available(extension) # # this will add the OpenMP flags if available. 
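#
# A fuller sketch of how this is typically wired up in a setup_package.py
# (the package and file names below are hypothetical):
#
#     from distutils.core import Extension
#     from astropy_helpers.openmp_helpers import add_openmp_flags_if_available
#
#     def get_extensions():
#         ext = Extension('mypkg.fast_loops', ['mypkg/fast_loops.c'])
#         # falls back to a non-parallel build (with a warning) if OpenMP
#         # support cannot be detected
#         add_openmp_flags_if_available(ext)
#         return [ext]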
from __future__ import absolute_import, print_function

import os
import sys
import glob
import time
import datetime
import tempfile
import subprocess

from distutils import log
from distutils.ccompiler import new_compiler
from distutils.sysconfig import customize_compiler, get_config_var
from distutils.errors import CompileError, LinkError

from .distutils_helpers import get_compiler_option

__all__ = ['add_openmp_flags_if_available']

try:
    # Check if this has already been instantiated, only set the default once.
    _ASTROPY_DISABLE_SETUP_WITH_OPENMP_
except NameError:
    import builtins
    # It hasn't, so do so.
    builtins._ASTROPY_DISABLE_SETUP_WITH_OPENMP_ = False

CCODE = """
#include <stdio.h>
#include <omp.h>

int main(void) {
  #pragma omp parallel
  printf("nthreads=%d\\n", omp_get_num_threads());
  return 0;
}
"""


def _get_flag_value_from_var(flag, var, delim=' '):
    """
    Extract flags from an environment variable.

    Parameters
    ----------
    flag : str
        The flag to extract, for example '-I' or '-L'
    var : str
        The environment variable to extract the flag from, e.g. CFLAGS or
        LDFLAGS.
    delim : str, optional
        The delimiter separating flags inside the environment variable

    Examples
    --------
    Let's assume LDFLAGS is set to '-L/usr/local/include -customflag'. This
    function will then return the following:

        >>> _get_flag_value_from_var('-L', 'LDFLAGS')
        '/usr/local/include'

    Notes
    -----
    Environment variables are first checked in ``os.environ[var]``, then in
    ``distutils.sysconfig.get_config_var(var)``.

    This function is not supported on Windows.
    """
    if sys.platform.startswith('win'):
        return None

    # Simple input validation
    if not var or not flag:
        return None
    flag_length = len(flag)
    if not flag_length:
        return None

    # Look for var in os.environ then in get_config_var
    if var in os.environ:
        flags = os.environ[var]
    else:
        try:
            flags = get_config_var(var)
        except KeyError:
            return None

    # Extract flag from {var:value}
    if flags:
        for item in flags.split(delim):
            if item.startswith(flag):
                return item[flag_length:]


def get_openmp_flags():
    """
    Utility for returning compiler and linker flags possibly needed for
    OpenMP support.

    Returns
    -------
    result : `{'compiler_flags':<flags>, 'linker_flags':<flags>}`

    Notes
    -----
    The flags returned are not tested for validity, use
    `check_openmp_support(openmp_flags=get_openmp_flags())` to do so.
    """

    compile_flags = []
    link_flags = []

    if get_compiler_option() == 'msvc':
        compile_flags.append('-openmp')
    else:
        include_path = _get_flag_value_from_var('-I', 'CFLAGS')
        if include_path:
            compile_flags.append('-I' + include_path)

        lib_path = _get_flag_value_from_var('-L', 'LDFLAGS')
        if lib_path:
            link_flags.append('-L' + lib_path)
            link_flags.append('-Wl,-rpath,' + lib_path)

        compile_flags.append('-fopenmp')
        link_flags.append('-fopenmp')

    return {'compiler_flags': compile_flags, 'linker_flags': link_flags}


def check_openmp_support(openmp_flags=None):
    """
    Check whether OpenMP test code can be compiled and run.

    Parameters
    ----------
    openmp_flags : dict, optional
        This should be a dictionary with keys ``compiler_flags`` and
        ``linker_flags`` giving the compilation and linking flags
        respectively. These are passed as `extra_postargs` to `compile()`
        and `link_executable()` respectively. If this is not set, the flags
        will be automatically determined using environment variables.

    Returns
    -------
    result : bool
        `True` if the test passed, `False` otherwise.
    """

    ccompiler = new_compiler()
    customize_compiler(ccompiler)

    if not openmp_flags:
        # customize_compiler() extracts info from os.environ.
If certain keys # exist it uses these plus those from sysconfig.get_config_vars(). # If the key is missing in os.environ it is not extracted from # sysconfig.get_config_var(). E.g. 'LDFLAGS' get left out, preventing # clang from finding libomp.dylib because -L is not passed to # linker. Call get_openmp_flags() to get flags missed by # customize_compiler(). openmp_flags = get_openmp_flags() compile_flags = openmp_flags.get('compiler_flags') link_flags = openmp_flags.get('linker_flags') # Pass -coverage flag to linker. # https://github.com/astropy/astropy-helpers/pull/374 if '-coverage' in compile_flags and '-coverage' not in link_flags: link_flags.append('-coverage') tmp_dir = tempfile.mkdtemp() start_dir = os.path.abspath('.') try: os.chdir(tmp_dir) # Write test program with open('test_openmp.c', 'w') as f: f.write(CCODE) os.mkdir('objects') # Compile, test program ccompiler.compile(['test_openmp.c'], output_dir='objects', extra_postargs=compile_flags) # Link test program objects = glob.glob(os.path.join('objects', '*' + ccompiler.obj_extension)) ccompiler.link_executable(objects, 'test_openmp', extra_postargs=link_flags) # Run test program output = subprocess.check_output('./test_openmp') output = output.decode(sys.stdout.encoding or 'utf-8').splitlines() if 'nthreads=' in output[0]: nthreads = int(output[0].strip().split('=')[1]) if len(output) == nthreads: is_openmp_supported = True else: log.warn("Unexpected number of lines from output of test OpenMP " "program (output was {0})".format(output)) is_openmp_supported = False else: log.warn("Unexpected output from test OpenMP " "program (output was {0})".format(output)) is_openmp_supported = False except (CompileError, LinkError, subprocess.CalledProcessError): is_openmp_supported = False finally: os.chdir(start_dir) return is_openmp_supported def is_openmp_supported(): """ Determine whether the build compiler has OpenMP support. """ log_threshold = log.set_threshold(log.FATAL) ret = check_openmp_support() log.set_threshold(log_threshold) return ret def add_openmp_flags_if_available(extension): """ Add OpenMP compilation flags, if supported (if not a warning will be printed to the console and no flags will be added.) Returns `True` if the flags were added, `False` otherwise. """ if _ASTROPY_DISABLE_SETUP_WITH_OPENMP_: log.info("OpenMP support has been explicitly disabled.") return False openmp_flags = get_openmp_flags() using_openmp = check_openmp_support(openmp_flags=openmp_flags) if using_openmp: compile_flags = openmp_flags.get('compiler_flags') link_flags = openmp_flags.get('linker_flags') log.info("Compiling Cython/C/C++ extension with OpenMP support") extension.extra_compile_args.extend(compile_flags) extension.extra_link_args.extend(link_flags) else: log.warn("Cannot compile Cython/C/C++ extension with OpenMP, reverting " "to non-parallel code") return using_openmp _IS_OPENMP_ENABLED_SRC = """ # Autogenerated by {packagetitle}'s setup.py on {timestamp!s} def is_openmp_enabled(): \"\"\" Determine whether this package was built with OpenMP support. \"\"\" return {return_bool} """[1:] def generate_openmp_enabled_py(packagename, srcdir='.', disable_openmp=None): """ Generate ``package.openmp_enabled.is_openmp_enabled``, which can then be used to determine, post build, whether the package was built with or without OpenMP support. 
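    For example (hypothetical package name)::

        generate_openmp_enabled_py('mypkg', srcdir='.')

    writes ``mypkg/openmp_enabled.py`` defining ``is_openmp_enabled()``,
    which returns `True` or `False` according to the compile-and-run check
    performed above.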
""" if packagename.lower() == 'astropy': packagetitle = 'Astropy' else: packagetitle = packagename epoch = int(os.environ.get('SOURCE_DATE_EPOCH', time.time())) timestamp = datetime.datetime.utcfromtimestamp(epoch) if disable_openmp is not None: import builtins builtins._ASTROPY_DISABLE_SETUP_WITH_OPENMP_ = disable_openmp if _ASTROPY_DISABLE_SETUP_WITH_OPENMP_: log.info("OpenMP support has been explicitly disabled.") openmp_support = False if _ASTROPY_DISABLE_SETUP_WITH_OPENMP_ else is_openmp_supported() src = _IS_OPENMP_ENABLED_SRC.format(packagetitle=packagetitle, timestamp=timestamp, return_bool=openmp_support) package_srcdir = os.path.join(srcdir, *packagename.split('.')) is_openmp_enabled_py = os.path.join(package_srcdir, 'openmp_enabled.py') with open(is_openmp_enabled_py, 'w') as f: f.write(src) ././@PaxHeader0000000000000000000000000000003400000000000011452 xustar000000000000000028 mtime=1605787216.9031825 ndcube-1.4.2/astropy_helpers/astropy_helpers/setup_helpers.py0000644000175100001640000007062200000000000025157 0ustar00vstsdocker00000000000000# Licensed under a 3-clause BSD style license - see LICENSE.rst """ This module contains a number of utilities for use during setup/build/packaging that are useful to astropy as a whole. """ from __future__ import absolute_import import collections import os import re import subprocess import sys import traceback import warnings from configparser import ConfigParser import builtins from distutils import log from distutils.errors import DistutilsOptionError, DistutilsModuleError from distutils.core import Extension from distutils.core import Command from distutils.command.sdist import sdist as DistutilsSdist from setuptools import setup as setuptools_setup from setuptools.config import read_configuration from setuptools import find_packages as _find_packages from .distutils_helpers import (add_command_option, get_compiler_option, get_dummy_distribution, get_distutils_build_option, get_distutils_build_or_install_option) from .version_helpers import get_pkg_version_module, generate_version_py from .utils import (walk_skip_hidden, import_file, extends_doc, resolve_name, AstropyDeprecationWarning) from .commands.build_ext import AstropyHelpersBuildExt from .commands.test import AstropyTest # These imports are not used in this module, but are included for backwards # compat with older versions of this module from .utils import get_numpy_include_path, write_if_different # noqa __all__ = ['register_commands', 'get_package_info'] _module_state = {'registered_commands': None, 'have_sphinx': False, 'package_cache': None, 'exclude_packages': set(), 'excludes_too_late': False} try: import sphinx # noqa _module_state['have_sphinx'] = True except ValueError as e: # This can occur deep in the bowels of Sphinx's imports by way of docutils # and an occurrence of this bug: http://bugs.python.org/issue18378 # In this case sphinx is effectively unusable if 'unknown locale' in e.args[0]: log.warn( "Possible misconfiguration of one of the environment variables " "LC_ALL, LC_CTYPES, LANG, or LANGUAGE. 
For an example of how to " "configure your system's language environment on OSX see " "http://blog.remibergsma.com/2012/07/10/" "setting-locales-correctly-on-mac-osx-terminal-application/") except ImportError: pass except SyntaxError: # occurs if markupsafe is recent version, which doesn't support Python 3.2 pass def setup(**kwargs): """ A wrapper around setuptools' setup() function that automatically sets up custom commands, generates a version file, and customizes the setup process via the ``setup_package.py`` files. """ # DEPRECATED: store the package name in a built-in variable so it's easy # to get from other parts of the setup infrastructure. We should phase this # out in packages that use it - the cookiecutter template should now be # able to put the right package name where needed. conf = read_configuration('setup.cfg') builtins._ASTROPY_PACKAGE_NAME_ = conf['metadata']['name'] # Create a dictionary with setup command overrides. Note that this gets # information about the package (name and version) from the setup.cfg file. cmdclass = register_commands() # Freeze build information in version.py. Note that this gets information # about the package (name and version) from the setup.cfg file. version = generate_version_py() # Get configuration information from all of the various subpackages. # See the docstring for setup_helpers.update_package_files for more # details. package_info = get_package_info() package_info['cmdclass'] = cmdclass package_info['version'] = version # Override using any specified keyword arguments package_info.update(kwargs) setuptools_setup(**package_info) def adjust_compiler(package): warnings.warn( 'The adjust_compiler function in setup.py is ' 'deprecated and can be removed from your setup.py.', AstropyDeprecationWarning) def get_debug_option(packagename): """ Determines if the build is in debug mode. Returns ------- debug : bool True if the current build was started with the debug option, False otherwise. """ try: current_debug = get_pkg_version_module(packagename, fromlist=['debug'])[0] except (ImportError, AttributeError): current_debug = None # Only modify the debug flag if one of the build commands was explicitly # run (i.e. not as a sub-command of something else) dist = get_dummy_distribution() if any(cmd in dist.commands for cmd in ['build', 'build_ext']): debug = bool(get_distutils_build_option('debug')) else: debug = bool(current_debug) if current_debug is not None and current_debug != debug: build_ext_cmd = dist.get_command_class('build_ext') build_ext_cmd._force_rebuild = True return debug def add_exclude_packages(excludes): if _module_state['excludes_too_late']: raise RuntimeError( "add_package_excludes must be called before all other setup helper " "functions in order to properly handle excluded packages") _module_state['exclude_packages'].update(set(excludes)) def register_commands(package=None, version=None, release=None, srcdir='.'): """ This function generates a dictionary containing customized commands that can then be passed to the ``cmdclass`` argument in ``setup()``. """ if package is not None: warnings.warn('The package argument to generate_version_py has ' 'been deprecated and will be removed in future. Specify ' 'the package name in setup.cfg instead', AstropyDeprecationWarning) if version is not None: warnings.warn('The version argument to generate_version_py has ' 'been deprecated and will be removed in future. 
Specify ' 'the version number in setup.cfg instead', AstropyDeprecationWarning) if release is not None: warnings.warn('The release argument to generate_version_py has ' 'been deprecated and will be removed in future. We now ' 'use the presence of the "dev" string in the version to ' 'determine whether this is a release', AstropyDeprecationWarning) # We use ConfigParser instead of read_configuration here because the latter # only reads in keys recognized by setuptools, but we need to access # package_name below. conf = ConfigParser() conf.read('setup.cfg') if conf.has_option('metadata', 'name'): package = conf.get('metadata', 'name') elif conf.has_option('metadata', 'package_name'): # The package-template used package_name instead of name for a while warnings.warn('Specifying the package name using the "package_name" ' 'option in setup.cfg is deprecated - use the "name" ' 'option instead.', AstropyDeprecationWarning) package = conf.get('metadata', 'package_name') elif package is not None: # deprecated pass else: sys.stderr.write('ERROR: Could not read package name from setup.cfg\n') sys.exit(1) if _module_state['registered_commands'] is not None: return _module_state['registered_commands'] if _module_state['have_sphinx']: try: from .commands.build_sphinx import (AstropyBuildSphinx, AstropyBuildDocs) except ImportError: AstropyBuildSphinx = AstropyBuildDocs = FakeBuildSphinx else: AstropyBuildSphinx = AstropyBuildDocs = FakeBuildSphinx _module_state['registered_commands'] = registered_commands = { 'test': generate_test_command(package), # Use distutils' sdist because it respects package_data. # setuptools/distributes sdist requires duplication of information in # MANIFEST.in 'sdist': DistutilsSdist, 'build_ext': AstropyHelpersBuildExt, 'build_sphinx': AstropyBuildSphinx, 'build_docs': AstropyBuildDocs } # Need to override the __name__ here so that the commandline options are # presented as being related to the "build" command, for example; normally # this wouldn't be necessary since commands also have a command_name # attribute, but there is a bug in distutils' help display code that it # uses __name__ instead of command_name. Yay distutils! for name, cls in registered_commands.items(): cls.__name__ = name # Add a few custom options; more of these can be added by specific packages # later for option in [ ('use-system-libraries', "Use system libraries whenever possible", True)]: add_command_option('build', *option) add_command_option('install', *option) add_command_hooks(registered_commands, srcdir=srcdir) return registered_commands def add_command_hooks(commands, srcdir='.'): """ Look through setup_package.py modules for functions with names like ``pre__hook`` and ``post__hook`` where ```` is the name of a ``setup.py`` command (e.g. build_ext). If either hook is present this adds a wrapped version of that command to the passed in ``commands`` `dict`. ``commands`` may be pre-populated with other custom distutils command classes that should be wrapped if there are hooks for them (e.g. `AstropyBuildPy`). 
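    For example, a ``setup_package.py`` module could define (an illustrative
    sketch -- the hook body is up to the package)::

        def pre_build_ext_hook(cmd_obj):
            # cmd_obj is the finalized build_ext command instance
            ...

    which this function discovers and arranges to run immediately before the
    standard ``build_ext.run()``.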
""" hook_re = re.compile(r'^(pre|post)_(.+)_hook$') # Distutils commands have a method of the same name, but it is not a # *classmethod* (which probably didn't exist when distutils was first # written) def get_command_name(cmdcls): if hasattr(cmdcls, 'command_name'): return cmdcls.command_name else: return cmdcls.__name__ packages = find_packages(srcdir) dist = get_dummy_distribution() hooks = collections.defaultdict(dict) for setuppkg in iter_setup_packages(srcdir, packages): for name, obj in vars(setuppkg).items(): match = hook_re.match(name) if not match: continue hook_type = match.group(1) cmd_name = match.group(2) if hook_type not in hooks[cmd_name]: hooks[cmd_name][hook_type] = [] hooks[cmd_name][hook_type].append((setuppkg.__name__, obj)) for cmd_name, cmd_hooks in hooks.items(): commands[cmd_name] = generate_hooked_command( cmd_name, dist.get_command_class(cmd_name), cmd_hooks) def generate_hooked_command(cmd_name, cmd_cls, hooks): """ Returns a generated subclass of ``cmd_cls`` that runs the pre- and post-command hooks for that command before and after the ``cmd_cls.run`` method. """ def run(self, orig_run=cmd_cls.run): self.run_command_hooks('pre_hooks') orig_run(self) self.run_command_hooks('post_hooks') return type(cmd_name, (cmd_cls, object), {'run': run, 'run_command_hooks': run_command_hooks, 'pre_hooks': hooks.get('pre', []), 'post_hooks': hooks.get('post', [])}) def run_command_hooks(cmd_obj, hook_kind): """Run hooks registered for that command and phase. *cmd_obj* is a finalized command object; *hook_kind* is either 'pre_hook' or 'post_hook'. """ hooks = getattr(cmd_obj, hook_kind, None) if not hooks: return for modname, hook in hooks: if isinstance(hook, str): try: hook_obj = resolve_name(hook) except ImportError as exc: raise DistutilsModuleError( 'cannot find hook {0}: {1}'.format(hook, exc)) else: hook_obj = hook if not callable(hook_obj): raise DistutilsOptionError('hook {0!r} is not callable' % hook) log.info('running {0} from {1} for {2} command'.format( hook_kind.rstrip('s'), modname, cmd_obj.get_command_name())) try: hook_obj(cmd_obj) except Exception: log.error('{0} command hook {1} raised an exception: %s\n'.format( hook_obj.__name__, cmd_obj.get_command_name())) log.error(traceback.format_exc()) sys.exit(1) def generate_test_command(package_name): """ Creates a custom 'test' command for the given package which sets the command's ``package_name`` class attribute to the name of the package being tested. """ return type(package_name.title() + 'Test', (AstropyTest,), {'package_name': package_name}) def update_package_files(srcdir, extensions, package_data, packagenames, package_dirs): """ This function is deprecated and maintained for backward compatibility with affiliated packages. Affiliated packages should update their setup.py to use `get_package_info` instead. """ info = get_package_info(srcdir) extensions.extend(info['ext_modules']) package_data.update(info['package_data']) packagenames = list(set(packagenames + info['packages'])) package_dirs.update(info['package_dir']) def get_package_info(srcdir='.', exclude=()): """ Collates all of the information for building all subpackages and returns a dictionary of keyword arguments that can be passed directly to `distutils.setup`. The purpose of this function is to allow subpackages to update the arguments to the package's ``setup()`` function in its setup.py script, rather than having to specify all extensions/package data directly in the ``setup.py``. 
See Astropy's own ``setup.py`` for example usage and the Astropy development docs for more details. This function obtains that information by iterating through all packages in ``srcdir`` and locating a ``setup_package.py`` module. This module can contain the following functions: ``get_extensions()``, ``get_package_data()``, ``get_build_options()``, and ``get_external_libraries()``. Each of those functions take no arguments. - ``get_extensions`` returns a list of `distutils.extension.Extension` objects. - ``get_package_data()`` returns a dict formatted as required by the ``package_data`` argument to ``setup()``. - ``get_build_options()`` returns a list of tuples describing the extra build options to add. - ``get_external_libraries()`` returns a list of libraries that can optionally be built using external dependencies. """ ext_modules = [] packages = [] package_dir = {} # Read in existing package data, and add to it below setup_cfg = os.path.join(srcdir, 'setup.cfg') if os.path.exists(setup_cfg): conf = read_configuration(setup_cfg) if 'options' in conf and 'package_data' in conf['options']: package_data = conf['options']['package_data'] else: package_data = {} else: package_data = {} if exclude: warnings.warn( "Use of the exclude parameter is no longer supported since it does " "not work as expected. Use add_exclude_packages instead. Note that " "it must be called prior to any other calls from setup helpers.", AstropyDeprecationWarning) # Use the find_packages tool to locate all packages and modules packages = find_packages(srcdir, exclude=exclude) # Update package_dir if the package lies in a subdirectory if srcdir != '.': package_dir[''] = srcdir # For each of the setup_package.py modules, extract any # information that is needed to install them. The build options # are extracted first, so that their values will be available in # subsequent calls to `get_extensions`, etc. for setuppkg in iter_setup_packages(srcdir, packages): if hasattr(setuppkg, 'get_build_options'): options = setuppkg.get_build_options() for option in options: add_command_option('build', *option) if hasattr(setuppkg, 'get_external_libraries'): libraries = setuppkg.get_external_libraries() for library in libraries: add_external_library(library) for setuppkg in iter_setup_packages(srcdir, packages): # get_extensions must include any Cython extensions by their .pyx # filename. if hasattr(setuppkg, 'get_extensions'): ext_modules.extend(setuppkg.get_extensions()) if hasattr(setuppkg, 'get_package_data'): package_data.update(setuppkg.get_package_data()) # Locate any .pyx files not already specified, and add their extensions in. # The default include dirs include numpy to facilitate numerical work. ext_modules.extend(get_cython_extensions(srcdir, packages, ext_modules, ['numpy'])) # Now remove extensions that have the special name 'skip_cython', as they # exist Only to indicate that the cython extensions shouldn't be built for i, ext in reversed(list(enumerate(ext_modules))): if ext.name == 'skip_cython': del ext_modules[i] # On Microsoft compilers, we need to pass the '/MANIFEST' # commandline argument. This was the default on MSVC 9.0, but is # now required on MSVC 10.0, but it doesn't seem to hurt to add # it unconditionally. 
def iter_setup_packages(srcdir, packages):
    """
    A generator that finds and imports all of the ``setup_package.py``
    modules in the source packages.

    Returns
    -------
    modgen : generator
        A generator that yields each imported ``setup_package.py`` module
        (named ``<packagename>.setup_package``).
    """

    for packagename in packages:
        package_parts = packagename.split('.')
        package_path = os.path.join(srcdir, *package_parts)
        setup_package = os.path.relpath(
            os.path.join(package_path, 'setup_package.py'))

        if os.path.isfile(setup_package):
            module = import_file(setup_package,
                                 name=packagename + '.setup_package')
            yield module


def iter_pyx_files(package_dir, package_name):
    """
    A generator that yields Cython source files (ending in '.pyx') in the
    source packages.

    Returns
    -------
    pyxgen : generator
        A generator that yields (extmod, fullfn) where `extmod` is the
        full name of the module that the .pyx file would live in based
        on the source directory structure, and `fullfn` is the path to
        the .pyx file.
    """
    for dirpath, dirnames, filenames in walk_skip_hidden(package_dir):
        for fn in filenames:
            if fn.endswith('.pyx'):
                fullfn = os.path.relpath(os.path.join(dirpath, fn))
                # Package must match file name
                extmod = '.'.join([package_name, fn[:-4]])
                yield (extmod, fullfn)

        break  # Don't recurse into subdirectories


def get_cython_extensions(srcdir, packages, prevextensions=tuple(),
                          extincludedirs=None):
    """
    Looks for Cython files and generates Extensions if needed.

    Parameters
    ----------
    srcdir : str
        Path to the root of the source directory to search.
    prevextensions : list of `~distutils.core.Extension` objects
        The extensions that are already defined.  Any .pyx files already
        here will be ignored.
    extincludedirs : list of str or None
        Directories to include as the `include_dirs` argument to the
        generated `~distutils.core.Extension` objects.

    Returns
    -------
    exts : list of `~distutils.core.Extension` objects
        The new extensions that are needed to compile all .pyx files
        (does not include any already in `prevextensions`).
    """

    # Vanilla setuptools and old versions of distribute include Cython files
    # as .c files in the sources, not .pyx, so we cannot simply look for
    # existing .pyx sources in the previous sources, but we should also check
    # for .c files with the same remaining filename. So we look for .pyx and
    # .c files, and we strip the extension.
    prevsourcepaths = []
    ext_modules = []

    for ext in prevextensions:
        for s in ext.sources:
            if s.endswith(('.pyx', '.c', '.cpp')):
                sourcepath = os.path.realpath(os.path.splitext(s)[0])
                prevsourcepaths.append(sourcepath)

    for package_name in packages:
        package_parts = package_name.split('.')
        package_path = os.path.join(srcdir, *package_parts)

        for extmod, pyxfn in iter_pyx_files(package_path, package_name):
            sourcepath = os.path.realpath(os.path.splitext(pyxfn)[0])
            if sourcepath not in prevsourcepaths:
                ext_modules.append(Extension(extmod, [pyxfn],
                                             include_dirs=extincludedirs))

    return ext_modules


class DistutilsExtensionArgs(collections.defaultdict):
    """
    A special dictionary whose default values are the empty list.

    This is useful for building up a set of arguments for
    `distutils.Extension` without worrying whether the entry is
    already present.
    """
    def __init__(self, *args, **kwargs):
        def default_factory():
            return []

        super(DistutilsExtensionArgs, self).__init__(
            default_factory, *args, **kwargs)

    def update(self, other):
        for key, val in other.items():
            self[key].extend(val)
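# ----------------------------------------------------------------------------
# Illustrative sketch (not part of the original module): how
# ``DistutilsExtensionArgs`` accumulates Extension keyword arguments without
# pre-initializing each key.  The flag values here are assumptions for
# illustration only.

def _example_extension_args():
    args = DistutilsExtensionArgs()
    args['libraries'].append('m')     # keys spring into existence as []
    args.update({'include_dirs': ['/usr/local/include'],
                 'libraries': ['z']})  # update() extends lists, not replaces
    # args is now roughly
    # {'libraries': ['m', 'z'], 'include_dirs': ['/usr/local/include']}
    return dict(args)
# ----------------------------------------------------------------------------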
""" def __init__(self, *args, **kwargs): def default_factory(): return [] super(DistutilsExtensionArgs, self).__init__( default_factory, *args, **kwargs) def update(self, other): for key, val in other.items(): self[key].extend(val) def pkg_config(packages, default_libraries, executable='pkg-config'): """ Uses pkg-config to update a set of distutils Extension arguments to include the flags necessary to link against the given packages. If the pkg-config lookup fails, default_libraries is applied to libraries. Parameters ---------- packages : list of str A list of pkg-config packages to look up. default_libraries : list of str A list of library names to use if the pkg-config lookup fails. Returns ------- config : dict A dictionary containing keyword arguments to `distutils.Extension`. These entries include: - ``include_dirs``: A list of include directories - ``library_dirs``: A list of library directories - ``libraries``: A list of libraries - ``define_macros``: A list of macro defines - ``undef_macros``: A list of macros to undefine - ``extra_compile_args``: A list of extra arguments to pass to the compiler """ flag_map = {'-I': 'include_dirs', '-L': 'library_dirs', '-l': 'libraries', '-D': 'define_macros', '-U': 'undef_macros'} command = "{0} --libs --cflags {1}".format(executable, ' '.join(packages)), result = DistutilsExtensionArgs() try: pipe = subprocess.Popen(command, shell=True, stdout=subprocess.PIPE) output = pipe.communicate()[0].strip() except subprocess.CalledProcessError as e: lines = [ ("{0} failed. This may cause the build to fail below." .format(executable)), " command: {0}".format(e.cmd), " returncode: {0}".format(e.returncode), " output: {0}".format(e.output) ] log.warn('\n'.join(lines)) result['libraries'].extend(default_libraries) else: if pipe.returncode != 0: lines = [ "pkg-config could not lookup up package(s) {0}.".format( ", ".join(packages)), "This may cause the build to fail below." ] log.warn('\n'.join(lines)) result['libraries'].extend(default_libraries) else: for token in output.split(): # It's not clear what encoding the output of # pkg-config will come to us in. It will probably be # some combination of pure ASCII (for the compiler # flags) and the filesystem encoding (for any argument # that includes directories or filenames), but this is # just conjecture, as the pkg-config documentation # doesn't seem to address it. arg = token[:2].decode('ascii') value = token[2:].decode(sys.getfilesystemencoding()) if arg in flag_map: if arg == '-D': value = tuple(value.split('=', 1)) result[flag_map[arg]].append(value) else: result['extra_compile_args'].append(value) return result def add_external_library(library): """ Add a build option for selecting the internal or system copy of a library. Parameters ---------- library : str The name of the library. If the library is `foo`, the build option will be called `--use-system-foo`. """ for command in ['build', 'build_ext', 'install']: add_command_option(command, str('use-system-' + library), 'Use the system {0} library'.format(library), is_bool=True) def use_system_library(library): """ Returns `True` if the build configuration indicates that the given library should use the system copy of the library rather than the internal one. For the given library `foo`, this will be `True` if `--use-system-foo` or `--use-system-libraries` was provided at the commandline or in `setup.cfg`. 
@extends_doc(_find_packages)
def find_packages(where='.', exclude=(), invalidate_cache=False):
    """
    This version of ``find_packages`` caches previous results to speed up
    subsequent calls.  Use ``invalidate_cache=True`` to ignore cached
    results from previous ``find_packages`` calls, and repeat the package
    search.
    """
    if exclude:
        warnings.warn(
            "Use of the exclude parameter is no longer supported since it does "
            "not work as expected. Use add_exclude_packages instead. Note that "
            "it must be called prior to any other calls from setup helpers.",
            AstropyDeprecationWarning)

    # Calling add_exclude_packages after this point will have no effect
    _module_state['excludes_too_late'] = True

    if not invalidate_cache and _module_state['package_cache'] is not None:
        return _module_state['package_cache']

    packages = _find_packages(
        where=where, exclude=list(_module_state['exclude_packages']))
    _module_state['package_cache'] = packages

    return packages


class FakeBuildSphinx(Command):
    """
    A dummy build_sphinx command that is called if Sphinx is not
    installed and displays a relevant error message
    """

    # user options inherited from sphinx.setup_command.BuildDoc
    user_options = [
        ('fresh-env', 'E', ''),
        ('all-files', 'a', ''),
        ('source-dir=', 's', ''),
        ('build-dir=', None, ''),
        ('config-dir=', 'c', ''),
        ('builder=', 'b', ''),
        ('project=', None, ''),
        ('version=', None, ''),
        ('release=', None, ''),
        ('today=', None, ''),
        ('link-index', 'i', '')]

    # user options appended in astropy.setup_helpers.AstropyBuildSphinx
    user_options.append(('warnings-returncode', 'w', ''))
    user_options.append(('clean-docs', 'l', ''))
    user_options.append(('no-intersphinx', 'n', ''))
    user_options.append(('open-docs-in-browser', 'o', ''))

    def initialize_options(self):
        try:
            raise RuntimeError("Sphinx and its dependencies must be installed "
                               "for build_docs.")
        except:
            log.error('error: Sphinx and its dependencies must be installed '
                      'for build_docs.')
            sys.exit(1)

ndcube-1.4.2/astropy_helpers/astropy_helpers/sphinx/__init__.py

ndcube-1.4.2/astropy_helpers/astropy_helpers/sphinx/conf.py

import warnings
from sphinx_astropy.conf import *

warnings.warn("Note that astropy_helpers.sphinx.conf is deprecated - use "
              "sphinx_astropy.conf instead")

ndcube-1.4.2/astropy_helpers/astropy_helpers/utils.py

# Licensed under a 3-clause BSD style license - see LICENSE.rst

from __future__ import absolute_import, unicode_literals

import contextlib
import
functools import imp import os import sys import glob from importlib import machinery as import_machinery # Note: The following Warning subclasses are simply copies of the Warnings in # Astropy of the same names. class AstropyWarning(Warning): """ The base warning class from which all Astropy warnings should inherit. Any warning inheriting from this class is handled by the Astropy logger. """ class AstropyDeprecationWarning(AstropyWarning): """ A warning class to indicate a deprecated feature. """ class AstropyPendingDeprecationWarning(PendingDeprecationWarning, AstropyWarning): """ A warning class to indicate a soon-to-be deprecated feature. """ def _get_platlib_dir(cmd): """ Given a build command, return the name of the appropriate platform-specific build subdirectory directory (e.g. build/lib.linux-x86_64-2.7) """ plat_specifier = '.{0}-{1}'.format(cmd.plat_name, sys.version[0:3]) return os.path.join(cmd.build_base, 'lib' + plat_specifier) def get_numpy_include_path(): """ Gets the path to the numpy headers. """ # We need to go through this nonsense in case setuptools # downloaded and installed Numpy for us as part of the build or # install, since Numpy may still think it's in "setup mode", when # in fact we're ready to use it to build astropy now. import builtins if hasattr(builtins, '__NUMPY_SETUP__'): del builtins.__NUMPY_SETUP__ import imp import numpy imp.reload(numpy) try: numpy_include = numpy.get_include() except AttributeError: numpy_include = numpy.get_numpy_include() return numpy_include class _DummyFile(object): """A noop writeable object.""" errors = '' def write(self, s): pass def flush(self): pass @contextlib.contextmanager def silence(): """A context manager that silences sys.stdout and sys.stderr.""" old_stdout = sys.stdout old_stderr = sys.stderr sys.stdout = _DummyFile() sys.stderr = _DummyFile() exception_occurred = False try: yield except: exception_occurred = True # Go ahead and clean up so that exception handling can work normally sys.stdout = old_stdout sys.stderr = old_stderr raise if not exception_occurred: sys.stdout = old_stdout sys.stderr = old_stderr if sys.platform == 'win32': import ctypes def _has_hidden_attribute(filepath): """ Returns True if the given filepath has the hidden attribute on MS-Windows. Based on a post here: http://stackoverflow.com/questions/284115/cross-platform-hidden-file-detection """ if isinstance(filepath, bytes): filepath = filepath.decode(sys.getfilesystemencoding()) try: attrs = ctypes.windll.kernel32.GetFileAttributesW(filepath) assert attrs != -1 result = bool(attrs & 2) except (AttributeError, AssertionError): result = False return result else: def _has_hidden_attribute(filepath): return False def is_path_hidden(filepath): """ Determines if a given file or directory is hidden. Parameters ---------- filepath : str The path to a file or directory Returns ------- hidden : bool Returns `True` if the file is hidden """ name = os.path.basename(os.path.abspath(filepath)) if isinstance(name, bytes): is_dotted = name.startswith(b'.') else: is_dotted = name.startswith('.') return is_dotted or _has_hidden_attribute(filepath) def walk_skip_hidden(top, onerror=None, followlinks=False): """ A wrapper for `os.walk` that skips hidden files and directories. This function does not have the parameter `topdown` from `os.walk`: the directories must always be recursed top-down when using this function. 
See also -------- os.walk : For a description of the parameters """ for root, dirs, files in os.walk( top, topdown=True, onerror=onerror, followlinks=followlinks): # These lists must be updated in-place so os.walk will skip # hidden directories dirs[:] = [d for d in dirs if not is_path_hidden(d)] files[:] = [f for f in files if not is_path_hidden(f)] yield root, dirs, files def write_if_different(filename, data): """Write `data` to `filename`, if the content of the file is different. Parameters ---------- filename : str The file name to be written to. data : bytes The data to be written to `filename`. """ assert isinstance(data, bytes) if os.path.exists(filename): with open(filename, 'rb') as fd: original_data = fd.read() else: original_data = None if original_data != data: with open(filename, 'wb') as fd: fd.write(data) def import_file(filename, name=None): """ Imports a module from a single file as if it doesn't belong to a particular package. The returned module will have the optional ``name`` if given, or else a name generated from the filename. """ # Specifying a traditional dot-separated fully qualified name here # results in a number of "Parent module 'astropy' not found while # handling absolute import" warnings. Using the same name, the # namespaces of the modules get merged together. So, this # generates an underscore-separated name which is more likely to # be unique, and it doesn't really matter because the name isn't # used directly here anyway. mode = 'r' if name is None: basename = os.path.splitext(filename)[0] name = '_'.join(os.path.relpath(basename).split(os.sep)[1:]) if not os.path.exists(filename): raise ImportError('Could not import file {0}'.format(filename)) if import_machinery: loader = import_machinery.SourceFileLoader(name, filename) mod = loader.load_module() else: with open(filename, mode) as fd: mod = imp.load_module(name, fd, filename, ('.py', mode, 1)) return mod def resolve_name(name): """Resolve a name like ``module.object`` to an object and return it. Raise `ImportError` if the module or name is not found. """ parts = name.split('.') cursor = len(parts) - 1 module_name = parts[:cursor] attr_name = parts[-1] while cursor > 0: try: ret = __import__('.'.join(module_name), fromlist=[attr_name]) break except ImportError: if cursor == 0: raise cursor -= 1 module_name = parts[:cursor] attr_name = parts[cursor] ret = '' for part in parts[cursor:]: try: ret = getattr(ret, part) except AttributeError: raise ImportError(name) return ret def extends_doc(extended_func): """ A function decorator for use when wrapping an existing function but adding additional functionality. This copies the docstring from the original function, and appends to it (along with a newline) the docstring of the wrapper function. Examples -------- >>> def foo(): ... '''Hello.''' ... >>> @extends_doc(foo) ... def bar(): ... '''Goodbye.''' ... >>> print(bar.__doc__) Hello. Goodbye. """ def decorator(func): if not (extended_func.__doc__ is None or func.__doc__ is None): func.__doc__ = '\n\n'.join([extended_func.__doc__.rstrip('\n'), func.__doc__.lstrip('\n')]) return func return decorator def find_data_files(package, pattern): """ Include files matching ``pattern`` inside ``package``. Parameters ---------- package : str The package inside which to look for data files pattern : str Pattern (glob-style) to match for the data files (e.g. ``*.dat``). This supports the``**``recursive syntax. For example, ``**/*.fits`` matches all files ending with ``.fits`` recursively. 
Only one instance of ``**`` can be included in the pattern. """ return glob.glob(os.path.join(package, pattern), recursive=True) ././@PaxHeader0000000000000000000000000000003400000000000011452 xustar000000000000000028 mtime=1605787217.0911834 ndcube-1.4.2/astropy_helpers/astropy_helpers/version.py0000644000175100001640000000063700000000000023761 0ustar00vstsdocker00000000000000# Autogenerated by Astropy-affiliated package astropy_helpers's setup.py on 2020-11-19 12:00:17 UTC from __future__ import unicode_literals import datetime version = "3.2.1" githash = "ba3734222a40f4c2864c375c6639f32cd9df06cd" major = 3 minor = 2 bugfix = 1 version_info = (major, minor, bugfix) release = True timestamp = datetime.datetime(2020, 11, 19, 12, 0, 17) debug = False astropy_helpers_version = "" ././@PaxHeader0000000000000000000000000000003400000000000011452 xustar000000000000000028 mtime=1605787216.9031825 ndcube-1.4.2/astropy_helpers/astropy_helpers/version_helpers.py0000644000175100001640000003073700000000000025507 0ustar00vstsdocker00000000000000# Licensed under a 3-clause BSD style license - see LICENSE.rst """ Utilities for generating the version string for Astropy (or an affiliated package) and the version.py module, which contains version info for the package. Within the generated astropy.version module, the `major`, `minor`, and `bugfix` variables hold the respective parts of the version number (bugfix is '0' if absent). The `release` variable is True if this is a release, and False if this is a development version of astropy. For the actual version string, use:: from astropy.version import version or:: from astropy import __version__ """ from __future__ import division import datetime import os import pkgutil import sys import time import warnings from distutils import log from configparser import ConfigParser import pkg_resources from . import git_helpers from .distutils_helpers import is_distutils_display_option from .git_helpers import get_git_devstr from .utils import AstropyDeprecationWarning, import_file __all__ = ['generate_version_py'] def _version_split(version): """ Split a version string into major, minor, and bugfix numbers. If any of those numbers are missing the default is zero. Any pre/post release modifiers are ignored. Examples ======== >>> _version_split('1.2.3') (1, 2, 3) >>> _version_split('1.2') (1, 2, 0) >>> _version_split('1.2rc1') (1, 2, 0) >>> _version_split('1') (1, 0, 0) >>> _version_split('') (0, 0, 0) """ parsed_version = pkg_resources.parse_version(version) if hasattr(parsed_version, 'base_version'): # New version parsing for setuptools >= 8.0 if parsed_version.base_version: parts = [int(part) for part in parsed_version.base_version.split('.')] else: parts = [] else: parts = [] for part in parsed_version: if part.startswith('*'): # Ignore any .dev, a, b, rc, etc. break parts.append(int(part)) if len(parts) < 3: parts += [0] * (3 - len(parts)) # In principle a version could have more parts (like 1.2.3.4) but we only # support .. return tuple(parts[:3]) # This is used by setup.py to create a new version.py - see that file for # details. Note that the imports have to be absolute, since this is also used # by affiliated packages. 
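# For illustration only (not part of the original module): a minimal sketch
# of the freeze step, using a made-up two-field template rather than the real
# _FROZEN_VERSION_PY_TEMPLATE defined below.  The version string and
# timestamp are placeholder values.

def _example_freeze_version():
    import datetime
    template = 'version = "{verstr}"\ntimestamp = {timestamp!r}\n'
    return template.format(verstr='1.4.2',
                           timestamp=datetime.datetime(2020, 11, 19))

# The real template, filled in by _get_version_py_str() further below: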
_FROZEN_VERSION_PY_TEMPLATE = """ # Autogenerated by {packagetitle}'s setup.py on {timestamp!s} UTC from __future__ import unicode_literals import datetime {header} major = {major} minor = {minor} bugfix = {bugfix} version_info = (major, minor, bugfix) release = {rel} timestamp = {timestamp!r} debug = {debug} astropy_helpers_version = "{ahver}" """[1:] _FROZEN_VERSION_PY_WITH_GIT_HEADER = """ {git_helpers} _packagename = "{packagename}" _last_generated_version = "{verstr}" _last_githash = "{githash}" # Determine where the source code for this module # lives. If __file__ is not a filesystem path then # it is assumed not to live in a git repo at all. if _get_repo_path(__file__, levels=len(_packagename.split('.'))): version = update_git_devstr(_last_generated_version, path=__file__) githash = get_git_devstr(sha=True, show_warning=False, path=__file__) or _last_githash else: # The file does not appear to live in a git repo so don't bother # invoking git version = _last_generated_version githash = _last_githash """[1:] _FROZEN_VERSION_PY_STATIC_HEADER = """ version = "{verstr}" githash = "{githash}" """[1:] def _get_version_py_str(packagename, version, githash, release, debug, uses_git=True): try: from astropy_helpers import __version__ as ahver except ImportError: ahver = "unknown" epoch = int(os.environ.get('SOURCE_DATE_EPOCH', time.time())) timestamp = datetime.datetime.utcfromtimestamp(epoch) major, minor, bugfix = _version_split(version) if packagename.lower() == 'astropy': packagetitle = 'Astropy' else: packagetitle = 'Astropy-affiliated package ' + packagename header = '' if uses_git: header = _generate_git_header(packagename, version, githash) elif not githash: # _generate_git_header will already generate a new git has for us, but # for creating a new version.py for a release (even if uses_git=False) # we still need to get the githash to include in the version.py # See https://github.com/astropy/astropy-helpers/issues/141 githash = git_helpers.get_git_devstr(sha=True, show_warning=True) if not header: # If _generate_git_header fails it returns an empty string header = _FROZEN_VERSION_PY_STATIC_HEADER.format(verstr=version, githash=githash) return _FROZEN_VERSION_PY_TEMPLATE.format(packagetitle=packagetitle, timestamp=timestamp, header=header, major=major, minor=minor, bugfix=bugfix, ahver=ahver, rel=release, debug=debug) def _generate_git_header(packagename, version, githash): """ Generates a header to the version.py module that includes utilities for probing the git repository for updates (to the current git hash, etc.) These utilities should only be available in development versions, and not in release builds. If this fails for any reason an empty string is returned. 
""" loader = pkgutil.get_loader(git_helpers) source = loader.get_source(git_helpers.__name__) or '' source_lines = source.splitlines() if not source_lines: log.warn('Cannot get source code for astropy_helpers.git_helpers; ' 'git support disabled.') return '' idx = 0 for idx, line in enumerate(source_lines): if line.startswith('# BEGIN'): break git_helpers_py = '\n'.join(source_lines[idx + 1:]) verstr = version new_githash = git_helpers.get_git_devstr(sha=True, show_warning=False) if new_githash: githash = new_githash return _FROZEN_VERSION_PY_WITH_GIT_HEADER.format( git_helpers=git_helpers_py, packagename=packagename, verstr=verstr, githash=githash) def generate_version_py(packagename=None, version=None, release=None, debug=None, uses_git=None, srcdir='.'): """ Generate a version.py file in the package with version information, and update developer version strings. This function should normally be called without any arguments. In this case the package name and version is read in from the ``setup.cfg`` file (from the ``name`` or ``package_name`` entry and the ``version`` entry in the ``[metadata]`` section). If the version is a developer version (of the form ``3.2.dev``), the version string will automatically be expanded to include a sequential number as a suffix (e.g. ``3.2.dev13312``), and the updated version string will be returned by this function. Based on this updated version string, a ``version.py`` file will be generated inside the package, containing the version string as well as more detailed information (for example the major, minor, and bugfix version numbers, a ``release`` flag indicating whether the current version is a stable or developer version, and so on. """ if packagename is not None: warnings.warn('The packagename argument to generate_version_py has ' 'been deprecated and will be removed in future. Specify ' 'the package name in setup.cfg instead', AstropyDeprecationWarning) if version is not None: warnings.warn('The version argument to generate_version_py has ' 'been deprecated and will be removed in future. Specify ' 'the version number in setup.cfg instead', AstropyDeprecationWarning) if release is not None: warnings.warn('The release argument to generate_version_py has ' 'been deprecated and will be removed in future. We now ' 'use the presence of the "dev" string in the version to ' 'determine whether this is a release', AstropyDeprecationWarning) # We use ConfigParser instead of read_configuration here because the latter # only reads in keys recognized by setuptools, but we need to access # package_name below. 
conf = ConfigParser() conf.read('setup.cfg') if conf.has_option('metadata', 'name'): packagename = conf.get('metadata', 'name') elif conf.has_option('metadata', 'package_name'): # The package-template used package_name instead of name for a while warnings.warn('Specifying the package name using the "package_name" ' 'option in setup.cfg is deprecated - use the "name" ' 'option instead.', AstropyDeprecationWarning) packagename = conf.get('metadata', 'package_name') elif packagename is not None: # deprecated pass else: sys.stderr.write('ERROR: Could not read package name from setup.cfg\n') sys.exit(1) if conf.has_option('metadata', 'version'): version = conf.get('metadata', 'version') add_git_devstr = True elif version is not None: # deprecated add_git_devstr = False else: sys.stderr.write('ERROR: Could not read package version from setup.cfg\n') sys.exit(1) if release is None: release = 'dev' not in version if not release and add_git_devstr: version += get_git_devstr(False) if uses_git is None: uses_git = not release # In some cases, packages have a - but this is a _ in the module. Since we # are only interested in the module here, we replace - by _ packagename = packagename.replace('-', '_') try: version_module = get_pkg_version_module(packagename) try: last_generated_version = version_module._last_generated_version except AttributeError: last_generated_version = version_module.version try: last_githash = version_module._last_githash except AttributeError: last_githash = version_module.githash current_release = version_module.release current_debug = version_module.debug except ImportError: version_module = None last_generated_version = None last_githash = None current_release = None current_debug = None if release is None: # Keep whatever the current value is, if it exists release = bool(current_release) if debug is None: # Likewise, keep whatever the current value is, if it exists debug = bool(current_debug) package_srcdir = os.path.join(srcdir, *packagename.split('.')) version_py = os.path.join(package_srcdir, 'version.py') if (last_generated_version != version or current_release != release or current_debug != debug): if '-q' not in sys.argv and '--quiet' not in sys.argv: log.set_threshold(log.INFO) if is_distutils_display_option(): # Always silence unnecessary log messages when display options are # being used log.set_threshold(log.WARN) log.info('Freezing version number to {0}'.format(version_py)) with open(version_py, 'w') as f: # This overwrites the actual version.py f.write(_get_version_py_str(packagename, version, last_githash, release, debug, uses_git=uses_git)) return version def get_pkg_version_module(packagename, fromlist=None): """Returns the package's .version module generated by `astropy_helpers.version_helpers.generate_version_py`. Raises an ImportError if the version module is not found. If ``fromlist`` is an iterable, return a tuple of the members of the version module corresponding to the member names given in ``fromlist``. Raises an `AttributeError` if any of these module members are not found. 
""" version = import_file(os.path.join(packagename, 'version.py'), name='version') if fromlist: return tuple(getattr(version, member) for member in fromlist) else: return version ././@PaxHeader0000000000000000000000000000003400000000000011452 xustar000000000000000028 mtime=1605787217.3871849 ndcube-1.4.2/astropy_helpers/astropy_helpers.egg-info/0000755000175100001640000000000000000000000023406 5ustar00vstsdocker00000000000000././@PaxHeader0000000000000000000000000000003400000000000011452 xustar000000000000000028 mtime=1605787217.1351836 ndcube-1.4.2/astropy_helpers/astropy_helpers.egg-info/PKG-INFO0000644000175100001640000000503100000000000024502 0ustar00vstsdocker00000000000000Metadata-Version: 2.1 Name: astropy-helpers Version: 3.2.1 Summary: Utilities for building and installing packages in the Astropy ecosystem Home-page: https://github.com/astropy/astropy-helpers Author: The Astropy Developers Author-email: astropy.team@gmail.com License: BSD 3-Clause License Description: astropy-helpers =============== .. image:: https://travis-ci.org/astropy/astropy-helpers.svg :target: https://travis-ci.org/astropy/astropy-helpers .. image:: https://ci.appveyor.com/api/projects/status/rt9161t9mhx02xp7/branch/master?svg=true :target: https://ci.appveyor.com/project/Astropy/astropy-helpers .. image:: https://codecov.io/gh/astropy/astropy-helpers/branch/master/graph/badge.svg :target: https://codecov.io/gh/astropy/astropy-helpers The **astropy-helpers** package includes many build, installation, and documentation-related tools used by the Astropy project, but packaged separately for use by other projects that wish to leverage this work. The motivation behind this package and details of its implementation are in the accepted `Astropy Proposal for Enhancement (APE) 4 `_. Astropy-helpers is not a traditional package in the sense that it is not intended to be installed directly by users or developers. Instead, it is meant to be accessed when the ``setup.py`` command is run - see the "Using astropy-helpers in a package" section in the documentation for how to do this. For a real-life example of how to implement astropy-helpers in a project, see the ``setup.py`` and ``setup.cfg`` files of the `Affiliated package template `_. 
For more information, see the documentation at http://astropy-helpers.readthedocs.io Platform: UNKNOWN Classifier: Development Status :: 5 - Production/Stable Classifier: Intended Audience :: Developers Classifier: Framework :: Setuptools Plugin Classifier: License :: OSI Approved :: BSD License Classifier: Operating System :: OS Independent Classifier: Programming Language :: Python Classifier: Programming Language :: Python :: 3 Classifier: Topic :: Software Development :: Build Tools Classifier: Topic :: Software Development :: Libraries :: Python Modules Classifier: Topic :: System :: Archiving :: Packaging Provides: astropy_helpers Requires-Python: >=3.5 Provides-Extra: docs ././@PaxHeader0000000000000000000000000000003400000000000011452 xustar000000000000000028 mtime=1605787217.1671839 ndcube-1.4.2/astropy_helpers/astropy_helpers.egg-info/SOURCES.txt0000644000175100001640000000225200000000000025273 0ustar00vstsdocker00000000000000.coveragerc .gitignore .readthedocs.yml .travis.yml CHANGES.rst CONTRIBUTING.md LICENSE.rst MANIFEST.in README.rst ah_bootstrap.py appveyor.yml setup.cfg setup.py .circleci/config.yml astropy_helpers/__init__.py astropy_helpers/conftest.py astropy_helpers/distutils_helpers.py astropy_helpers/git_helpers.py astropy_helpers/openmp_helpers.py astropy_helpers/setup_helpers.py astropy_helpers/utils.py astropy_helpers/version.py astropy_helpers/version_helpers.py astropy_helpers.egg-info/PKG-INFO astropy_helpers.egg-info/SOURCES.txt astropy_helpers.egg-info/dependency_links.txt astropy_helpers.egg-info/not-zip-safe astropy_helpers.egg-info/requires.txt astropy_helpers.egg-info/top_level.txt astropy_helpers/commands/__init__.py astropy_helpers/commands/_dummy.py astropy_helpers/commands/build_ext.py astropy_helpers/commands/build_sphinx.py astropy_helpers/commands/test.py astropy_helpers/commands/src/compiler.c astropy_helpers/sphinx/__init__.py astropy_helpers/sphinx/conf.py docs/Makefile docs/advanced.rst docs/api.rst docs/basic.rst docs/conf.py docs/developers.rst docs/index.rst docs/known_issues.rst docs/make.bat docs/updating.rst docs/using.rst licenses/LICENSE_ASTROSCRAPPY.rst././@PaxHeader0000000000000000000000000000003400000000000011452 xustar000000000000000028 mtime=1605787217.1351836 ndcube-1.4.2/astropy_helpers/astropy_helpers.egg-info/dependency_links.txt0000644000175100001640000000000100000000000027454 0ustar00vstsdocker00000000000000 ././@PaxHeader0000000000000000000000000000003400000000000011452 xustar000000000000000028 mtime=1605787217.1351836 ndcube-1.4.2/astropy_helpers/astropy_helpers.egg-info/not-zip-safe0000644000175100001640000000000100000000000025634 0ustar00vstsdocker00000000000000 ././@PaxHeader0000000000000000000000000000003400000000000011452 xustar000000000000000028 mtime=1605787217.1351836 ndcube-1.4.2/astropy_helpers/astropy_helpers.egg-info/requires.txt0000644000175100001640000000002700000000000026005 0ustar00vstsdocker00000000000000 [docs] sphinx-astropy ././@PaxHeader0000000000000000000000000000003400000000000011452 xustar000000000000000028 mtime=1605787217.1351836 ndcube-1.4.2/astropy_helpers/astropy_helpers.egg-info/top_level.txt0000644000175100001640000000002000000000000026130 0ustar00vstsdocker00000000000000astropy_helpers ././@PaxHeader0000000000000000000000000000003400000000000011452 xustar000000000000000028 mtime=1605787217.3871849 ndcube-1.4.2/astropy_helpers/licenses/0000755000175100001640000000000000000000000020276 5ustar00vstsdocker00000000000000././@PaxHeader0000000000000000000000000000003400000000000011452 
xustar000000000000000028 mtime=1605787216.9031825 ndcube-1.4.2/astropy_helpers/licenses/LICENSE_ASTROSCRAPPY.rst0000644000175100001640000000315400000000000024127 0ustar00vstsdocker00000000000000# The OpenMP helpers include code heavily adapted from astroscrappy, released # under the following license: # # Copyright (c) 2015, Curtis McCully # All rights reserved. # # Redistribution and use in source and binary forms, with or without modification, # are permitted provided that the following conditions are met: # # * Redistributions of source code must retain the above copyright notice, this # list of conditions and the following disclaimer. # * Redistributions in binary form must reproduce the above copyright notice, this # list of conditions and the following disclaimer in the documentation and/or # other materials provided with the distribution. # * Neither the name of the Astropy Team nor the names of its contributors may be # used to endorse or promote products derived from this software without # specific prior written permission. # # THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS" AND # ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED # WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE # DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT HOLDER OR CONTRIBUTORS BE LIABLE FOR # ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES # (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; # LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON # ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT # (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS # SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE. ././@PaxHeader0000000000000000000000000000003400000000000011452 xustar000000000000000028 mtime=1605787217.3871849 ndcube-1.4.2/docs/0000755000175100001640000000000000000000000014176 5ustar00vstsdocker00000000000000././@PaxHeader0000000000000000000000000000003300000000000011451 xustar000000000000000027 mtime=1605787057.150476 ndcube-1.4.2/docs/Makefile0000644000175100001640000001074500000000000015645 0ustar00vstsdocker00000000000000# Makefile for Sphinx documentation # # You can set these variables from the command line. SPHINXOPTS = SPHINXBUILD = sphinx-build PAPER = BUILDDIR = _build # Internal variables. PAPEROPT_a4 = -D latex_paper_size=a4 PAPEROPT_letter = -D latex_paper_size=letter ALLSPHINXOPTS = -d $(BUILDDIR)/doctrees $(PAPEROPT_$(PAPER)) $(SPHINXOPTS) . 
.PHONY: help clean html dirhtml singlehtml pickle json htmlhelp qthelp devhelp epub latex latexpdf text man changes linkcheck doctest #This is needed with git because git doesn't create a dir if it's empty $(shell [ -d "_static" ] || mkdir -p _static) help: @echo "Please use \`make ' where is one of" @echo " html to make standalone HTML files" @echo " dirhtml to make HTML files named index.html in directories" @echo " singlehtml to make a single large HTML file" @echo " pickle to make pickle files" @echo " json to make JSON files" @echo " htmlhelp to make HTML files and a HTML help project" @echo " qthelp to make HTML files and a qthelp project" @echo " devhelp to make HTML files and a Devhelp project" @echo " epub to make an epub" @echo " latex to make LaTeX files, you can set PAPER=a4 or PAPER=letter" @echo " latexpdf to make LaTeX files and run them through pdflatex" @echo " text to make text files" @echo " man to make manual pages" @echo " changes to make an overview of all changed/added/deprecated items" @echo " linkcheck to check all external links for integrity" clean: -rm -rf $(BUILDDIR) -rm -rf api -rm -rf generated html: $(SPHINXBUILD) -b html $(ALLSPHINXOPTS) $(BUILDDIR)/html @echo @echo "Build finished. The HTML pages are in $(BUILDDIR)/html." dirhtml: $(SPHINXBUILD) -b dirhtml $(ALLSPHINXOPTS) $(BUILDDIR)/dirhtml @echo @echo "Build finished. The HTML pages are in $(BUILDDIR)/dirhtml." singlehtml: $(SPHINXBUILD) -b singlehtml $(ALLSPHINXOPTS) $(BUILDDIR)/singlehtml @echo @echo "Build finished. The HTML page is in $(BUILDDIR)/singlehtml." pickle: $(SPHINXBUILD) -b pickle $(ALLSPHINXOPTS) $(BUILDDIR)/pickle @echo @echo "Build finished; now you can process the pickle files." json: $(SPHINXBUILD) -b json $(ALLSPHINXOPTS) $(BUILDDIR)/json @echo @echo "Build finished; now you can process the JSON files." htmlhelp: $(SPHINXBUILD) -b htmlhelp $(ALLSPHINXOPTS) $(BUILDDIR)/htmlhelp @echo @echo "Build finished; now you can run HTML Help Workshop with the" \ ".hhp project file in $(BUILDDIR)/htmlhelp." qthelp: $(SPHINXBUILD) -b qthelp $(ALLSPHINXOPTS) $(BUILDDIR)/qthelp @echo @echo "Build finished; now you can run "qcollectiongenerator" with the" \ ".qhcp project file in $(BUILDDIR)/qthelp, like this:" @echo "# qcollectiongenerator $(BUILDDIR)/qthelp/Astropy.qhcp" @echo "To view the help file:" @echo "# assistant -collectionFile $(BUILDDIR)/qthelp/Astropy.qhc" devhelp: $(SPHINXBUILD) -b devhelp $(ALLSPHINXOPTS) $(BUILDDIR)/devhelp @echo @echo "Build finished." @echo "To view the help file:" @echo "# mkdir -p $$HOME/.local/share/devhelp/Astropy" @echo "# ln -s $(BUILDDIR)/devhelp $$HOME/.local/share/devhelp/Astropy" @echo "# devhelp" epub: $(SPHINXBUILD) -b epub $(ALLSPHINXOPTS) $(BUILDDIR)/epub @echo @echo "Build finished. The epub file is in $(BUILDDIR)/epub." latex: $(SPHINXBUILD) -b latex $(ALLSPHINXOPTS) $(BUILDDIR)/latex @echo @echo "Build finished; the LaTeX files are in $(BUILDDIR)/latex." @echo "Run \`make' in that directory to run these through (pdf)latex" \ "(use \`make latexpdf' here to do that automatically)." latexpdf: $(SPHINXBUILD) -b latex $(ALLSPHINXOPTS) $(BUILDDIR)/latex @echo "Running LaTeX files through pdflatex..." make -C $(BUILDDIR)/latex all-pdf @echo "pdflatex finished; the PDF files are in $(BUILDDIR)/latex." text: $(SPHINXBUILD) -b text $(ALLSPHINXOPTS) $(BUILDDIR)/text @echo @echo "Build finished. The text files are in $(BUILDDIR)/text." man: $(SPHINXBUILD) -b man $(ALLSPHINXOPTS) $(BUILDDIR)/man @echo @echo "Build finished. 
The manual pages are in $(BUILDDIR)/man." changes: $(SPHINXBUILD) -b changes $(ALLSPHINXOPTS) $(BUILDDIR)/changes @echo @echo "The overview file is in $(BUILDDIR)/changes." linkcheck: $(SPHINXBUILD) -b linkcheck $(ALLSPHINXOPTS) $(BUILDDIR)/linkcheck @echo @echo "Link check complete; look for any errors in the above output " \ "or in $(BUILDDIR)/linkcheck/output.txt." doctest: @echo "Run 'python setup.py test' in the root directory to run doctests " \ @echo "in the documentation." ././@PaxHeader0000000000000000000000000000003400000000000011452 xustar000000000000000028 mtime=1605787217.3831847 ndcube-1.4.2/docs/_templates/0000755000175100001640000000000000000000000016333 5ustar00vstsdocker00000000000000././@PaxHeader0000000000000000000000000000003400000000000011452 xustar000000000000000028 mtime=1605787217.3871849 ndcube-1.4.2/docs/_templates/autosummary/0000755000175100001640000000000000000000000020721 5ustar00vstsdocker00000000000000././@PaxHeader0000000000000000000000000000003300000000000011451 xustar000000000000000027 mtime=1605787057.150476 ndcube-1.4.2/docs/_templates/autosummary/base.rst0000644000175100001640000000037200000000000022367 0ustar00vstsdocker00000000000000{% extends "autosummary_core/base.rst" %} {# The template this is inherited from is in astropy/sphinx/ext/templates/autosummary_core. If you want to modify this template, it is strongly recommended that you still inherit from the astropy template. #}././@PaxHeader0000000000000000000000000000003300000000000011451 xustar000000000000000027 mtime=1605787057.150476 ndcube-1.4.2/docs/_templates/autosummary/class.rst0000644000175100001640000000037300000000000022563 0ustar00vstsdocker00000000000000{% extends "autosummary_core/class.rst" %} {# The template this is inherited from is in astropy/sphinx/ext/templates/autosummary_core. If you want to modify this template, it is strongly recommended that you still inherit from the astropy template. #}././@PaxHeader0000000000000000000000000000003300000000000011451 xustar000000000000000027 mtime=1605787057.150476 ndcube-1.4.2/docs/_templates/autosummary/module.rst0000644000175100001640000000037400000000000022744 0ustar00vstsdocker00000000000000{% extends "autosummary_core/module.rst" %} {# The template this is inherited from is in astropy/sphinx/ext/templates/autosummary_core. If you want to modify this template, it is strongly recommended that you still inherit from the astropy template. #}././@PaxHeader0000000000000000000000000000003300000000000011451 xustar000000000000000027 mtime=1605787057.150476 ndcube-1.4.2/docs/api.rst0000644000175100001640000000050000000000000015474 0ustar00vstsdocker00000000000000.. _api: Reference/API ============= .. automodapi:: ndcube .. automodapi:: ndcube.mixins .. automodapi:: ndcube.utils .. automodapi:: ndcube.utils.cube :headings: ^# .. automodapi:: ndcube.utils.sequence :headings: ^# .. automodapi:: ndcube.utils.wcs :headings: ^# .. automodapi:: ndcube.tests.helpers ././@PaxHeader0000000000000000000000000000003300000000000011451 xustar000000000000000027 mtime=1605787057.150476 ndcube-1.4.2/docs/conf.py0000644000175100001640000001615600000000000015506 0ustar00vstsdocker00000000000000# -*- coding: utf-8 -*- # Licensed under a 3-clause BSD style license - see LICENSE.rst # # SunPy documentation build configuration file. # # This file is execfile()d with the current directory set to its containing dir. # # Note that not all possible configuration values are present in this file. # # All configuration values have a default. 
Some values are defined in # the global Astropy configuration which is loaded here before anything else. # See astropy.sphinx.conf for which values are set there. # If extensions (or modules to document with autodoc) are in another directory, # add these directories to sys.path here. If the directory is relative to the # documentation root, use os.path.abspath to make it absolute, like shown here. # sys.path.insert(0, os.path.abspath('..')) # IMPORTANT: the above commented section was generated by sphinx-quickstart, but # is *NOT* appropriate for sunpy or sunpy affiliated packages. It is left # commented out with this explanation to make it clear why this should not be # done. If the sys.path entry above is added, when the astropy.sphinx.conf # import occurs, it will import the *source* version of astropy instead of the # version installed (if invoked as "make html" or directly with sphinx), or the # version in the build directory (if "python setup.py build_docs" is used). # Thus, any C-extensions that are needed to build the documentation will *not* # be accessible, and the documentation will not build correctly. import numpy as np from pkg_resources import get_distribution import os import sys import pathlib import datetime # -- Import Base config from sphinx-astropy ------------------------------------ try: from sphinx_astropy.conf.v1 import * except ImportError: print('ERROR: the documentation requires the "sphinx-astropy" package to be installed') sys.exit(1) if on_rtd: os.environ['SUNPY_CONFIGDIR'] = '/home/docs/' os.environ['HOME'] = '/home/docs/' os.environ['LANG'] = 'C' os.environ['LC_ALL'] = 'C' versionmod = get_distribution('ndcube') # The version info for the project you're documenting, acts as replacement for # |version| and |release|, also used in various other places throughout the # built documents. # The short X.Y version. version = '.'.join(versionmod.version.split('.')[:3]) # The full version, including alpha/beta/rc tags. release = versionmod.version.split('+')[0] # Is this version a development release is_development = '.dev' in release # -- Shut up numpy warnings from WCSAxes -------------------------------------- np.seterr(invalid='ignore') # -- General configuration ---------------------------------------------------- # If your documentation needs a minimal Sphinx version, state it here. needs_sphinx = '2.0' # To perform a Sphinx version check that needs to be more specific than # major.minor, call `check_sphinx_version("x.y.z")` here. check_sphinx_version(needs_sphinx) # Add any custom intersphinx for SunPy intersphinx_mapping.pop('h5py', None) intersphinx_mapping['sunpy'] = ('https://docs.sunpy.org/en/stable/', None) intersphinx_mapping['sqlalchemy'] = ('https://docs.sqlalchemy.org/en/latest/', None) intersphinx_mapping['pandas'] = ('https://pandas.pydata.org/pandas-docs/stable/', None) intersphinx_mapping['skimage'] = ('https://scikit-image.org/docs/stable/', None) intersphinx_mapping['drms'] = ('https://docs.sunpy.org/projects/drms/en/stable/', None) intersphinx_mapping['parfive'] = ('https://parfive.readthedocs.io/en/latest/', None) # List of patterns, relative to source directory, that match files and # directories to ignore when looking for source files. exclude_patterns.append('_templates') # Add any paths that contain templates here, relative to this directory. 
if 'templates_path' not in locals(): # in case parent conf.py defines it templates_path = [] templates_path.append('_templates') # For the linkcheck linkcheck_ignore = [r"https://doi.org/\d+", r"https://riot.im/\d+", r"https://github.com/\d+", r"https://docs.sunpy.org/\d+"] linkcheck_anchors = False # This is added to the end of RST files - a good place to put substitutions to # be used globally. rst_epilog = """ .. ndcube .. _SunPy: https://sunpy.org .. _`SunPy mailing list`: https://groups.google.com/group/sunpy .. _`SunPy dev mailing list`: https://groups.google.com/group/sunpy-dev """ # -- Project information ------------------------------------------------------ project = 'ndcube' author = 'The SunPy Community' copyright = '{}, {}'.format(datetime.datetime.now().year, author) try: from sunpy_sphinx_theme.conf import * except ImportError: html_theme = 'default' html_logo = 'logo/ndcube.png' # The name of an image file (within the static path) to use as favicon of the # docs. This file should be a Windows icon file (.ico) being 16x16 or 32x32 # pixels large. html_favicon = "./logo/favicon.ico" # The name for this set of Sphinx documents. If None, it defaults to # " v documentation". html_title = f'{project} v{release}' # Output file base name for HTML help builder. htmlhelp_basename = project + 'doc' # A dictionary of values to pass into the template engine’s context for all pages. html_context = {} html_context['to_be_indexed'] = ['stable', 'latest'] # -- Options for LaTeX output -------------------------------------------------- # Grouping the document tree into LaTeX files. List of tuples # (source start file, target name, title, author, documentclass [howto/manual]). latex_documents = [('index', project + '.tex', project + ' Documentation', author, 'manual')] # -- Options for manual page output -------------------------------------------- # One entry per manual page. List of tuples # (source start file, name, description, authors, manual section). man_pages = [('index', project.lower(), project + ' Documentation', [author], 1)] # -- Swap to Napoleon --------------------------------------------------------- # Remove numpydoc extensions.remove('numpydoc') extensions.append('sphinx.ext.napoleon') # Disable having a separate return type row napoleon_use_rtype = False # Disable google style docstrings napoleon_google_docstring = False extensions += ['sphinx_astropy.ext.edit_on_github', 'sphinx.ext.doctest', 'sphinx.ext.githubpages'] # -- Options for the edit_on_github extension --------------------------------- # Don't import the module as "version" or it will override the # "version" configuration parameter edit_on_github_project = "sunpy/ndcube" if 'dev' not in release: edit_on_github_branch = f"{version}" else: edit_on_github_branch = "master" edit_on_github_source_root = "" edit_on_github_doc_root = "docs" edit_on_github_skip_regex = '_.*|generated/.*' github_issues_url = 'https://github.com/sunpy/ndcube/issues/' """ Write the latest changelog into the documentation. """ target_file = os.path.abspath("./whatsnew/latest_changelog.txt") try: from sunpy.util.towncrier import generate_changelog_for_docs if is_development: generate_changelog_for_docs("../", target_file) except Exception as e: print(f"Failed to add changelog to docs with error {e}.") # Make sure the file exists or else sphinx will complain. 
open(target_file, 'a').close()

ndcube-1.4.2/docs/contributing.rst

======================
Contributing to ndcube
======================

We are always enthusiastic to welcome new users and developers who want to
enhance the ndcube package. You can contribute in several ways, from
providing feedback, reporting bugs, contributing code, and reviewing pull
requests. There is a role for almost any level of engagement.

Providing Feedback
------------------

We could always use more voices and opinions in the discussions about ndcube
and its development from both users and developers. There are a number of
ways to make your voice heard. Whether it be constructive criticism,
inquiries about current or future capabilities, or flattering praise, we
would love to hear from you. You can contact us on the SunPy matrix channel
or SunPy mailing lists. See :ref:`getting_help`.

Reporting Bugs
--------------

If you run into unexpected behavior or a bug please report it. All bugs are
raised and stored on our `issue tracker`_. If you are not sure if your
problem is a bug, a deficiency in functionality, or something else, send us
a message on the SunPy chat room or an email to the developer mailing list.
Ideally, we would like a short code example so we can run into the bug on
our own machines. See :ref:`getting_help`.

.. _contributing_code:

Contributing Code
-----------------

If you would like to contribute code, it is strongly recommended that you
first discuss your aims with the ndcube community. We strive to be an open
and welcoming community for developers of all experience levels. Discussing
your ideas before you start can give you new insights that will make your
development easier, lead to a better end product, and reduce the chances of
your work being regretfully rejected because of an issue you weren't aware
of, e.g. the functionality already exists elsewhere. See :ref:`getting_help`
to contact the ndcube community.

In the rest of this section we will go through the steps needed to set up
your system so you can contribute code to ndcube. This is done using `git`_
version control software and `GitHub`_, a website that allows you to upload,
update, and share code repositories (repos). If you are new to code
development or git and GitHub you can learn more from the following guides:

* `SunPy Newcomer's Guide`_
* `github guide`_
* `git guide`_
* `SunPy version control guide`_

The principles in the SunPy guides for contributing code and utilizing
GitHub and git are exactly the same for ndcube except that we contribute to
the ndcube repository rather than the SunPy one. If you are a more seasoned
developer and would like to get further information, you can check out the
`SunPy Developer's Guide`_.

Before you can contribute code to ndcube, you first need to install the
development version of ndcube. To find out how, see :ref:`dev_install`. In
the rest of this section we will assume you have performed the installation
as described there.

Next, you will have to create a new online version of the ndcube repo on
your own GitHub account, a process known as "forking". (If you don't have a
GitHub account, `sign up here`_.) Sign into your GitHub account and then go
to the main `ndcube GitHub repository`_. Click the "Fork" button in the top
right corner of the page.
A pop-up window should appear asking you to confirm which GitHub account you
wish to fork to. Once you've done that, you should have a new version of the
ndcube repo on your own GitHub account. It should reside at a URL like
https://github.com/my_github_handle/ndcube.

Next, we need to link our newly forked online repo with the one we created
on our local machine as part of the installation. To do this, we will have
to create a remote. A `git remote`_ is an alias pointing to the URL of a
GitHub repo. To see what remotes you have and the URLs they point to, change
into the local repo directory on the command line and type:

.. code-block:: console

    $ git remote -v

If you have installed the ndcube development version as outlined in
:ref:`dev_install`, you will have one remote called ``origin`` pointing to
https://github.com/sunpy/ndcube. Let's now add a remote to the repo in your
GitHub account called ``my_repo``. In a terminal, from the local repo
directory, type:

.. code-block:: console

    $ git remote add my_repo https://github.com/my_github_handle/ndcube

where you replace ``my_github_handle`` with your GitHub name. Now you can
check that the remote has been added by again typing ``git remote -v``.

Now you're ready to get coding! The following subsection will outline an
example workflow for contributing to ndcube.

.. _contributing_workflow:

Example Workflow for Contributing Code
--------------------------------------

To make changes to the development version of ndcube, we must first activate
the environment in which it is installed. Recall during installation, we
named this environment ``ndcube-dev``. From any directory on the command
line, Windows users should type:

.. code-block:: console

    > activate ndcube-dev

while Linux and MacOS users should type:

.. code-block:: console

    $ source activate ndcube-dev

Next, change into the local ndcube repo directory, ``ndcube-git``. When you
are making changes to ndcube, it is strongly recommended that you use a
different `git branch`_ for each set of related new features and/or bug
fixes. Git branches are a way of having different versions of the same code
within the repo simultaneously. Assuming you have just installed the ndcube
development version, you will only have one branch, called ``master``. It is
recommended you do not do any development on the ``master`` branch, but
rather keep it as a clean copy of the latest ``origin master`` branch. If
you have more than one branch, the ``*`` next to the branch name will
indicate which branch you are currently on. To check what branches you have
and which one you are on, type in the terminal:

.. code-block:: console

    $ git branch

If you are not on the ``master`` branch, let's start by changing to it
(known as checking out the branch):

.. code-block:: console

    $ git checkout master

Now, let's ensure we have the latest updates to the development version from
the main repo.

.. code-block:: console

    $ git pull origin master

This updates the local branch you are on (in this case, ``master``) with the
version of the ``master`` branch stored in the ``origin`` remote, i.e. the
original ndcube GitHub repo. Let's now create a new branch called ``my_fix``
on which to develop our new feature or bugfix. Type:

.. code-block:: console

    $ git checkout -b my_fix

This will not only create the new branch but also check it out. The new
branch will now be an exact copy of the branch from which you created it, in
this case, the ``master`` branch. Running ``git branch`` again should now
show something like the output below.
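The exact listing depends on your repo, but it should look roughly like this
(the asterisk marks the branch that is currently checked out):

.. code-block:: console

    $ git branch
      master
    * my_fix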
But now you can edit files so that the ``my_fix`` branch diverges while keeping your ``master`` branch intact. After a while, you've made some changes that partially or completely fix the bug. We now want to commit those changes. Committing is a bit like saving, except that it records the state of the entire code base, not just the file you've changed. You can then revert to this state at any time, even after new commits have been made. So if you mess up in the future you can always go back to a version that worked. This is why it's called version control. Before committing, we can see a list of files that we've changed by typing: .. code-block:: console $ git status We can also get a summary of those changes, line by line: .. code-block:: console $ git diff Once we're happy with the changes, we must add the changed files to the set to be included in the commit. We do not have to include all changed files. We can add files one by one: .. code-block:: console $ git add file1.py $ git add file2.py or add all changed files at once: .. code-block:: console $ git add --all Be sure to check what files have changed before using this option to make sure you know what you are committing. Finally, to commit, type: .. code-block:: console $ git commit This will open a text editor, usually vi, and allow you to enter a commit message to describe the changes you've made. A commit message is required before the commit can take place. Once you've entered your message, save it and exit your text editor. Voila! You've committed your changes!! To speed things up, the above process can be done in one command if desired: .. code-block:: console $ git commit -am 'My first commit.' where ``'My first commit.'`` is the commit message. But CAUTION! This adds and commits all changed files. So make sure you know what files have changed and how they've changed before doing this. Many a developer has accidentally committed extra files using this command and has wasted time undoing their mistake. Say it's the next day and you want to continue working on your bugfix. Open a terminal, activate your ``ndcube-dev`` conda environment, change into the ``ndcube-git`` directory and make sure you are on the correct branch. Also make sure you pull any new updates from the ``origin`` ``master`` branch into your local ``my_fix`` branch: .. code-block:: console $ source activate ndcube-dev # For Windows users, type "activate ndcube-dev" $ cd ndcube-git $ git branch $ git checkout my_fix $ git pull origin master Assuming there are no updates that conflict with the changes you made the other day, you're ready to continue working. If there are conflicts, open the affected files and resolve them. After more work and more commits, let's say you are ready to issue a pull request (PR) to get feedback on your work and ultimately have it approved and merged into the main repo! First you have to push your changes to your GitHub account, using the ``my_repo`` remote: .. code-block:: console $ git push my_repo my_fix Now your changes are available on GitHub. Follow the steps below to open a PR: #. In a browser, go to your GitHub account and find your version of the git repo. The URL should look like this: https://github.com/my_github_handle/ndcube/ #. There should be a green button on the right marked "Compare & pull request". Click it. If it is not there, click on the "Pull Requests" tab near the top of the page. The URL should look like this: https://github.com/my_github_handle/ndcube/pulls. Then click on the green "New Pull Request" button.
This will open a new page with four drop-down menus near the top. #. Set the "base fork" drop-down menu to "sunpy/ndcube" and the "base" drop-down to "master". This describes the repo and branch the changes are to be merged into. Set the "head fork" drop-down menu to "my_github_handle/ndcube" and the "compare" drop-down to "my_fix". This sets the repo and branch in which you have made the changes you want to merge. #. Enter a title and a description of the PR in the appropriate boxes. Try to be descriptive so other developers can understand the purpose of the PR. #. Finally, click the green "Create Pull Request" button. Well done! You've opened your first PR! Now begins the process of code review. Code review is a standard industry practice which involves other members of the community reviewing your proposed changes and suggesting improvements. It is a fantastic way of improving your coding abilities as well as preserving the integrity of the overall package. A bugfix does not have to be finished in order to open a PR. In fact, most PRs are incomplete when they are first opened. This allows others to follow your progress and contribute suggestions if you get stuck. Anyone can review a PR. Experience is not a disqualifying factor. But it is recommended that at least one experienced developer reviews your code. You can make updates to your PR by editing your local ``my_fix`` branch, committing the new changes and pushing them to the ``my_repo`` remote. The PR will then be automatically updated with the new commits. Once you've made all changes and the online tests have passed, those reviewing your code can approve the PR. Approved PRs can then be merged by those with write permissions to the repo. Congratulations! You have just contributed to ndcube! Be sure to pull the newly contributed changes to your local master branch by doing: .. code-block:: console $ git checkout master $ git pull origin master You are now ready to start using the newly improved development version of ndcube, including your changes! If you have questions about this guide or about making contributions, ndcube and SunPy developers are always happy to help. See :ref:`getting_help`. Happy coding and talk to you soon! .. _issue tracker: https://github.com/sunpy/ndcube/issues .. _sign up here: https://github.com/join?source=header-home .. _ndcube GitHub repository: https://github.com/sunpy/ndcube .. _GitHub: https://github.com/ .. _git: https://git-scm.com/ .. _SunPy Newcomer's Guide: http://docs.sunpy.org/en/stable/dev_guide/newcomers.html .. _github guide: https://guides.github.com/ .. _git guide: https://git-scm.com/book/en/v2/Getting-Started-Git-Basics .. _SunPy version control guide: http://docs.sunpy.org/en/stable/dev_guide/version_control.html .. _SunPy Developer's Guide: http://docs.sunpy.org/en/stable/dev_guide .. _pull requests: https://help.github.com/articles/about-pull-requests/ .. _git branch: https://git-scm.com/book/en/v2/Git-Branching-Branches-in-a-Nutshell .. _git remote: https://git-scm.com/book/en/v2/Git-Basics-Working-with-Remotes ././@PaxHeader0000000000000000000000000000003300000000000011451 xustar000000000000000027 mtime=1605787057.150476 ndcube-1.4.2/docs/getting_help.rst0000644000175100001640000000403400000000000017402 0ustar00vstsdocker00000000000000.. _getting_help: ============ Getting Help ============ If you are unsure of the general features of the ndcube package, your first stop should be this guide, which we strive to make as comprehensive, yet understandable, as possible.
If you would like to know more about how to use a specific function, class, etc., documentation for each object is stored in the source code itself. It can be accessed there or in the :ref:`api` section of this guide. If you cannot find the answer to your issue there, would like to provide feedback, or would like help contributing code to ndcube, you can get help directly from the ndcube team in a number of ways. SunPy Mailing Lists ------------------- ndcube is a SunPy-affiliated package and is developed within the larger SunPy development ecosystem. You can contact the ndcube developers by email via the SunPy mailing lists. Just reference ndcube in the email subject and it will find its way to a relevant developer. SunPy has two mailing lists: a `general mailing list`_ and a `developer mailing list`_. If you have a general question about how ndcube works, use the general mailing list. The general mailing list is also a way to get help with doing solar physics in Python. If, on the other hand, you have a question about the inner workings of ndcube, how ndcube is organized, or have a question about developing some new feature, please use the developer mailing list. Live Chat --------- SunPy has a `chat room`_ that uses `Matrix`_ which will open directly in your browser. Stop by and say hello. Talk with the SunPy and ndcube users and developers and get started. If you are using IRC hosted on `freenode`_, then you can join the same channel (bridged into Matrix) by joining the ``#sunpy`` channel on freenode. .. _`general mailing list`: https://groups.google.com/forum/#!forum/sunpy .. _`developer mailing list`: https://groups.google.com/forum/#!forum/sunpy-dev .. _`chat room`: https://riot.im/app/#/room/#sunpy:openastronomy.org .. _`Matrix`: https://matrix.org/blog/home/ .. _`freenode`: https://freenode.net/ ././@PaxHeader0000000000000000000000000000003300000000000011451 xustar000000000000000027 mtime=1605787057.150476 ndcube-1.4.2/docs/index.rst0000644000175100001640000000057300000000000016044 0ustar00vstsdocker00000000000000************ ndcube Guide ************ Welcome to the ndcube User Guide. ndcube is a SunPy-affiliated package designed for handling n-dimensional datacubes described by a WCS (World Coordinate System) translation. .. toctree:: :maxdepth: 2 introduction installation ndcube ndcubesequence ndcollection contributing getting_help api whatsnew/index ././@PaxHeader0000000000000000000000000000003300000000000011451 xustar000000000000000027 mtime=1605787057.150476 ndcube-1.4.2/docs/installation.rst0000644000175100001640000000472500000000000017441 0ustar00vstsdocker00000000000000============ Installation ============ ndcube requires Python 3.5+, SunPy 0.9+, astropy and matplotlib. Installing the Stable Version ----------------------------- There are two options for installing the stable version of ndcube. The first is via the anaconda distribution using the conda-forge channel. For more information on installing the anaconda distribution, see the `anaconda website`_. .. code-block:: console $ conda install --channel conda-forge ndcube To update ndcube do: .. code-block:: console $ conda update ndcube The second option for installing the stable version of ndcube is via pip. .. code-block:: console $ pip install ndcube Then to update ndcube do: .. code-block:: console $ pip install ndcube --upgrade .. _dev_install: Installing the Development Version ---------------------------------- The stable version of ndcube will be reliable.
However, if you value getting the latest updates over reliability, or want to contribute to the development of ndcube, you will need to install the development version via `GitHub`_. Let's step through how to do this using anaconda. For information on installing the anaconda distribution, see the `anaconda website`_. First, create a conda environment on your local machine to hold the ndcube bleeding-edge version. Using a new environment allows you to keep your root environment for stable package releases. Let's call the new conda environment ``ndcube-dev``. Type the following into a terminal: .. code-block:: console $ conda config --append channels conda-forge $ conda create -n ndcube-dev sunpy hypothesis pytest-mock pip sphinx coverage ipython jupyter Be sure to activate the environment, i.e. switch into it. In Linux or MacOS, type: .. code-block:: console $ source activate ndcube-dev In Windows, type: .. code-block:: console > activate ndcube-dev Next, clone the ndcube repo from GitHub to a new directory. Let's call it ndcube-git. .. code-block:: console $ git clone https://github.com/sunpy/ndcube.git ndcube-git To install, change into the new directory and run the install command. .. code-block:: console $ cd ndcube-git $ pip install -e . Voila! The ndcube development version is now installed! Be sure you get the latest updates by regularly doing: .. code-block:: console $ git pull origin master .. _anaconda website: https://docs.anaconda.com/anaconda/install.html .. _GitHub: https://github.com/ .. _ndcube GitHub repository: https://github.com/sunpy/ndcube ././@PaxHeader0000000000000000000000000000003300000000000011451 xustar000000000000000027 mtime=1605787057.150476 ndcube-1.4.2/docs/introduction.rst0000644000175100001640000000521200000000000017451 0ustar00vstsdocker00000000000000========================= An Introduction to ndcube ========================= N-dimensional Data in Astronomy ------------------------------- N-dimensional data sets are common in all areas of science and beyond. For example, a series of images taken sequentially with a CCD camera can be stored as a single 3-D array with two spatial axes and one temporal axis. Each array-element can represent a pixel and the value in that array-element can represent the reading in that pixel at a given time. In astronomy, the relationship between the pixel coordinate and the location and time in the Universe being observed is often represented by a World Coordinate System (WCS) transformation described by a set of well-defined parameters with standardized names. This, coupled with WCS's ability to handle many different types (e.g. spatial, temporal, spectral, etc.) of transformations, makes it a succinct, standard and powerful way to relate pixels from an observation or cells in a simulation grid to the location in the Universe to which they correspond. Due to the prevalence of N-D data and the importance of the transformations, there exist mature scientific Python packages (e.g. numpy and astropy) that contain powerful tools to handle N-D arrays and WCS transformations. What is ndcube? --------------- ndcube is a free, open-source, community-developed Python package whose purpose is to combine the above-mentioned functionalities for handling N-D arrays and WCS coordinate transformations into a single package of objects. These objects provide unified slicing, visualization, coordinate conversion and inspection of the data, WCS transformations and metadata.
ndcube does this in a way that is not specific to any number or physical type of axis, and so can in principle be used for any type of data (e.g. images, spectra, timeseries, etc.) so long as the data are represented by an array and a set of WCS transformations. This makes ndcube ideal to subclass when creating classes for specific types of data, while keeping the non-data-specific functionalities (e.g. slicing) common between classes. The ndcube package is composed of two basic classes: `~ndcube.NDCube` and `~ndcube.NDCubeSequence`. The former is for managing a single array and set of WCS transformations, while the latter is for handling multiple arrays, each described by their own set of WCS transformations. They share similar APIs to enable the user to think more about their data in an intuitive way, rather than focusing on the format in which that data happens to be stored. The functionality of these classes is outlined in detail in the :ref:`ndcube` and :ref:`ndcubesequence` sections of this guide. ././@PaxHeader0000000000000000000000000000003400000000000011452 xustar000000000000000028 mtime=1605787217.3871849 ndcube-1.4.2/docs/logo/0000755000175100001640000000000000000000000015136 5ustar00vstsdocker00000000000000././@PaxHeader0000000000000000000000000000003300000000000011451 xustar000000000000000027 mtime=1605787057.150476 ndcube-1.4.2/docs/logo/favicon.ico0000644000175100001640000000217600000000000017265 0ustar00vstsdocker00000000000000[binary icon data omitted] ././@PaxHeader0000000000000000000000000000003300000000000011451 xustar000000000000000027 mtime=1605787057.150476 ndcube-1.4.2/docs/logo/ndcube.png0000644000175100001640000015031400000000000017110 0ustar00vstsdocker00000000000000[binary PNG image data omitted]
././@PaxHeader0000000000000000000000000000003300000000000011451 xustar000000000000000027 mtime=1605787057.150476 ndcube-1.4.2/docs/make.bat0000644000175100001640000001070500000000000015606 0ustar00vstsdocker00000000000000@ECHO OFF REM Command file for Sphinx documentation if "%SPHINXBUILD%" == "" ( set SPHINXBUILD=sphinx-build ) set BUILDDIR=_build set ALLSPHINXOPTS=-d %BUILDDIR%/doctrees %SPHINXOPTS% . if NOT "%PAPER%" == "" ( set ALLSPHINXOPTS=-D latex_paper_size=%PAPER% %ALLSPHINXOPTS% ) if "%1" == "" goto help if "%1" == "help" ( :help echo.Please use `make ^<target^>` where ^<target^> is one of echo. html to make standalone HTML files echo. dirhtml to make HTML files named index.html in directories echo. singlehtml to make a single large HTML file echo. pickle to make pickle files echo. json to make JSON files echo. htmlhelp to make HTML files and a HTML help project echo. qthelp to make HTML files and a qthelp project echo. devhelp to make HTML files and a Devhelp project echo. epub to make an epub echo. latex to make LaTeX files, you can set PAPER=a4 or PAPER=letter echo. text to make text files echo. man to make manual pages echo. changes to make an overview over all changed/added/deprecated items echo. linkcheck to check all external links for integrity echo. doctest to run all doctests embedded in the documentation if enabled goto end ) if "%1" == "clean" ( for /d %%i in (%BUILDDIR%\*) do rmdir /q /s %%i del /q /s %BUILDDIR%\* del /q /s api del /q /s generated goto end ) if "%1" == "html" ( %SPHINXBUILD% -b html %ALLSPHINXOPTS% %BUILDDIR%/html if errorlevel 1 exit /b 1 echo. echo.Build finished. The HTML pages are in %BUILDDIR%/html. goto end ) if "%1" == "dirhtml" ( %SPHINXBUILD% -b dirhtml %ALLSPHINXOPTS% %BUILDDIR%/dirhtml if errorlevel 1 exit /b 1 echo. echo.Build finished. The HTML pages are in %BUILDDIR%/dirhtml. goto end ) if "%1" == "singlehtml" ( %SPHINXBUILD% -b singlehtml %ALLSPHINXOPTS% %BUILDDIR%/singlehtml if errorlevel 1 exit /b 1 echo. echo.Build finished. The HTML pages are in %BUILDDIR%/singlehtml. goto end ) if "%1" == "pickle" ( %SPHINXBUILD% -b pickle %ALLSPHINXOPTS% %BUILDDIR%/pickle if errorlevel 1 exit /b 1 echo. echo.Build finished; now you can process the pickle files. goto end ) if "%1" == "json" ( %SPHINXBUILD% -b json %ALLSPHINXOPTS% %BUILDDIR%/json if errorlevel 1 exit /b 1 echo.
echo.Build finished; now you can process the JSON files. goto end ) if "%1" == "htmlhelp" ( %SPHINXBUILD% -b htmlhelp %ALLSPHINXOPTS% %BUILDDIR%/htmlhelp if errorlevel 1 exit /b 1 echo. echo.Build finished; now you can run HTML Help Workshop with the ^ .hhp project file in %BUILDDIR%/htmlhelp. goto end ) if "%1" == "qthelp" ( %SPHINXBUILD% -b qthelp %ALLSPHINXOPTS% %BUILDDIR%/qthelp if errorlevel 1 exit /b 1 echo. echo.Build finished; now you can run "qcollectiongenerator" with the ^ .qhcp project file in %BUILDDIR%/qthelp, like this: echo.^> qcollectiongenerator %BUILDDIR%\qthelp\Astropy.qhcp echo.To view the help file: echo.^> assistant -collectionFile %BUILDDIR%\qthelp\Astropy.ghc goto end ) if "%1" == "devhelp" ( %SPHINXBUILD% -b devhelp %ALLSPHINXOPTS% %BUILDDIR%/devhelp if errorlevel 1 exit /b 1 echo. echo.Build finished. goto end ) if "%1" == "epub" ( %SPHINXBUILD% -b epub %ALLSPHINXOPTS% %BUILDDIR%/epub if errorlevel 1 exit /b 1 echo. echo.Build finished. The epub file is in %BUILDDIR%/epub. goto end ) if "%1" == "latex" ( %SPHINXBUILD% -b latex %ALLSPHINXOPTS% %BUILDDIR%/latex if errorlevel 1 exit /b 1 echo. echo.Build finished; the LaTeX files are in %BUILDDIR%/latex. goto end ) if "%1" == "text" ( %SPHINXBUILD% -b text %ALLSPHINXOPTS% %BUILDDIR%/text if errorlevel 1 exit /b 1 echo. echo.Build finished. The text files are in %BUILDDIR%/text. goto end ) if "%1" == "man" ( %SPHINXBUILD% -b man %ALLSPHINXOPTS% %BUILDDIR%/man if errorlevel 1 exit /b 1 echo. echo.Build finished. The manual pages are in %BUILDDIR%/man. goto end ) if "%1" == "changes" ( %SPHINXBUILD% -b changes %ALLSPHINXOPTS% %BUILDDIR%/changes if errorlevel 1 exit /b 1 echo. echo.The overview file is in %BUILDDIR%/changes. goto end ) if "%1" == "linkcheck" ( %SPHINXBUILD% -b linkcheck %ALLSPHINXOPTS% %BUILDDIR%/linkcheck if errorlevel 1 exit /b 1 echo. echo.Link check complete; look for any errors in the above output ^ or in %BUILDDIR%/linkcheck/output.txt. goto end ) if "%1" == "doctest" ( %SPHINXBUILD% -b doctest %ALLSPHINXOPTS% %BUILDDIR%/doctest if errorlevel 1 exit /b 1 echo. echo.Testing of doctests in the sources finished, look at the ^ results in %BUILDDIR%/doctest/output.txt. goto end ) :end ././@PaxHeader0000000000000000000000000000003300000000000011451 xustar000000000000000027 mtime=1605787057.150476 ndcube-1.4.2/docs/ndcollection.rst0000644000175100001640000001721300000000000017411 0ustar00vstsdocker00000000000000.. _ndcollection: ============ NDCollection ============ `~ndcube.NDCollection` is a container class for grouping `~ndcube.NDCube` or `~ndcube.NDCubeSequence` instances together. It does not imply an ordered relationship between its constituent ND objects like `~ndcube.NDCubeSequence` does. Instead it links ND objects in an unordered way, like a Python dictionary. This has many possible uses, for example, linking observations with derived data products. Let's say we have a 3D `~ndcube.NDCube` representing space-space-wavelength. Then let's say we fit a spectral line in each pixel's spectrum and extract its linewidth. Now we have a 2D spatial map of linewidth with the same spatial axes as the original 3D cube. However, the physical properties represented by the data are different. They do not have an order within their common coordinate space. And they do not have the same dimensionality, as the 2nd cube's spectral axis has been collapsed. Therefore it is not appropriate to combine them in an `~ndcube.NDCubeSequence`. This is where `~ndcube.NDCollection` comes in handy.
It allows us to name each ND object and combine them into a single container, just like a dictionary. In fact `~ndcube.NDCollection` inherits from `dict`. Initialization -------------- To see how we initialize an `~ndcube.NDCollection`, let's first define a couple of `~ndcube.NDCube` instances representing the situation above, i.e. a 3D space-space-spectral cube and a 2D space-space cube that share spatial axes. Let there be 10x20 spatial pixels and 30 pixels along the spectral axis. .. code-block:: python >>> import numpy as np >>> from astropy.wcs import WCS >>> from ndcube import NDCube >>> # Define observations NDCube. >>> data = np.ones((10, 20, 30)) # dummy data >>> obs_wcs_dict = { ... 'CTYPE1': 'WAVE    ', 'CUNIT1': 'Angstrom', 'CDELT1': 0.2, 'CRPIX1': 0, 'CRVAL1': 10, 'NAXIS1': 30, ... 'CTYPE2': 'HPLT-TAN', 'CUNIT2': 'deg', 'CDELT2': 0.5, 'CRPIX2': 2, 'CRVAL2': 0.5, 'NAXIS2': 20, ... 'CTYPE3': 'HPLN-TAN', 'CUNIT3': 'deg', 'CDELT3': 0.4, 'CRPIX3': 2, 'CRVAL3': 1, 'NAXIS3': 10} >>> obs_wcs = WCS(obs_wcs_dict) >>> obs_cube = NDCube(data, obs_wcs) >>> # Define derived linewidth NDCube >>> linewidth_data = np.ones((10, 20)) / 2 # dummy data >>> linewidth_wcs_dict = { ... 'CTYPE1': 'HPLT-TAN', 'CUNIT1': 'deg', 'CDELT1': 0.5, 'CRPIX1': 2, 'CRVAL1': 0.5, 'NAXIS1': 20, ... 'CTYPE2': 'HPLN-TAN', 'CUNIT2': 'deg', 'CDELT2': 0.4, 'CRPIX2': 2, 'CRVAL2': 1, 'NAXIS2': 10} >>> linewidth_wcs = WCS(linewidth_wcs_dict) >>> linewidth_cube = NDCube(linewidth_data, linewidth_wcs) Combine these ND objects into an `~ndcube.NDCollection` by supplying a sequence of ``(key, value)`` pairs in the same way that you initialize a dictionary. .. code-block:: python >>> from ndcube import NDCollection >>> my_collection = NDCollection([("observations", obs_cube), ("linewidths", linewidth_cube)]) Data Access ----------- Key Access ********** To access each ND object in ``my_collection`` we can index with the name of the desired object, just like a `dict`: .. code-block:: python >>> my_collection["observations"] # doctest: +SKIP And just like a `dict` we can see the different names available using the ``keys`` method: .. code-block:: python >>> my_collection.keys() dict_keys(['observations', 'linewidths']) Aligned Axes & Slicing ********************** Aligned Axes ^^^^^^^^^^^^ `~ndcube.NDCollection` is more powerful than a simple dictionary because it allows us to link common aligned axes between the ND objects. In our example above, the linewidth object's axes are aligned with the first two axes of the observations object. Let's instantiate our collection again, but this time declare those axes to be aligned. .. code-block:: python >>> my_collection = NDCollection( ... [("observations", obs_cube), ("linewidths", linewidth_cube)], aligned_axes=(0, 1)) We can see which axes are aligned by inspecting the ``aligned_axes`` attribute: .. code-block:: python >>> my_collection.aligned_axes {'observations': (0, 1), 'linewidths': (0, 1)} As you can see, this gives us the aligned axes for each ND object separately. We should read this as the 0th axes of both ND objects are aligned, as are the 1st axes of both objects. Because each ND object's set of aligned axes is stored separately, aligned axes do not have to be in the same order in both objects. Let's say we reversed the axes of our ``linewidths`` ND object for some reason: .. code-block:: python >>> linewidth_wcs_dict_reversed = { ... 'CTYPE2': 'HPLT-TAN', 'CUNIT2': 'deg', 'CDELT2': 0.5, 'CRPIX2': 2, 'CRVAL2': 0.5, 'NAXIS2': 20, ...
'CTYPE1': 'HPLN-TAN', 'CUNIT1': 'deg', 'CDELT1': 0.4, 'CRPIX1': 2, 'CRVAL1': 1, 'NAXIS1': 10} >>> linewidth_wcs_reversed = WCS(linewidth_wcs_dict_reversed) >>> linewidth_cube_reversed = NDCube(linewidth_data.transpose(), linewidth_wcs_reversed) We can still define an `~ndcube.NDCollection` with aligned axes by supplying a tuple of tuples, giving the aligned axes of each ND object separately. In this case, the 0th axis of the ``observations`` object is aligned with the 1st axis of the ``linewidths`` object and vice versa. .. code-block:: python >>> my_collection_reversed = NDCollection( ... [("observations", obs_cube), ("linewidths", linewidth_cube_reversed)], ... aligned_axes=((0, 1), (1, 0))) >>> my_collection_reversed.aligned_axes {'observations': (0, 1), 'linewidths': (1, 0)} Aligned axes must have the same lengths. We can see the lengths of the aligned axes by using the ``aligned_dimensions`` property. .. code-block:: python >>> my_collection.aligned_dimensions Note that this only tells us the lengths of the aligned axes. To see the lengths of the non-aligned axes, e.g. the spectral axis of the ``observations`` object, you must inspect that ND object individually. We can also see the physical properties to which the aligned axes correspond by using the ``aligned_world_axis_physical_types`` property. .. code-block:: python >>> my_collection.aligned_world_axis_physical_types ('custom:pos.helioprojective.lon', 'custom:pos.helioprojective.lat') Note that this method simply returns the world physical axis types of one of the ND objects. However, there is no requirement that all aligned axes must represent the same physical types. They just have to be the same length. Slicing ^^^^^^^ Defining aligned axes enables us to slice those axes of all the ND objects in the collection by using the standard Python slicing API. .. code-block:: python >>> sliced_collection = my_collection[1:3, 3:8] >>> sliced_collection.keys() dict_keys(['observations', 'linewidths']) >>> sliced_collection.aligned_dimensions Note that we still have the same number of ND objects, but both have been sliced using the inputs provided by the user. Also note that slicing takes account of and updates the aligned axis information. Therefore a self-consistent result would be obtained even if the aligned axes are not in order. .. code-block:: python >>> sliced_collection_reversed = my_collection_reversed[1:3, 3:8] >>> sliced_collection_reversed.keys() dict_keys(['observations', 'linewidths']) >>> sliced_collection_reversed.aligned_dimensions Editing NDCollection -------------------- Because `~ndcube.NDCollection` inherits from `dict`, we can edit the collection using many of the same methods. These have the same or analogous APIs to the ``dict`` versions and include ``del``, `~ndcube.NDCollection.pop`, and `~ndcube.NDCollection.update`. Some `dict` methods may not be implemented on `~ndcube.NDCollection` if they are not consistent with its design. ././@PaxHeader0000000000000000000000000000003300000000000011451 xustar000000000000000027 mtime=1605787057.150476 ndcube-1.4.2/docs/ndcube.rst0000644000175100001640000005246300000000000016176 0ustar00vstsdocker00000000000000.. _ndcube: ====== NDCube ====== `~ndcube.NDCube` is the fundamental class of the ndcube package and is designed to handle data contained in a single N-D array described by a single set of WCS transformations. `~ndcube.NDCube` is subclassed from `astropy.nddata.NDData` and so inherits the same attributes for data, wcs, uncertainty, mask, meta, and unit.
The WCS object contained in the ``.wcs`` attribute is subclassed from `astropy.wcs.WCS` and contains a few additional attributes to enable it to keep track of its relationship to the data. Initialization -------------- To initialize the most basic `~ndcube.NDCube` object, all you need is a `numpy.ndarray` containing the data, and an `astropy.wcs.WCS` object describing the transformation from array-element space to real world coordinates. Let's create a 3-D array of data with shape (3, 4, 5) where every value is 1:: >>> import numpy as np >>> data = np.ones((3, 4, 5)) Now let's create an `astropy.wcs.WCS` object describing the translation from the array element coordinates to real world coordinates. Let the first data axis be helioprojective longitude, the second be helioprojective latitude, and the third be wavelength. Note that due to (confusing) convention, the order of the axes in the WCS object is reversed relative to the data array. >>> import astropy.wcs >>> wcs_input_dict = { ... 'CTYPE1': 'WAVE    ', 'CUNIT1': 'Angstrom', 'CDELT1': 0.2, 'CRPIX1': 0, 'CRVAL1': 10, 'NAXIS1': 5, ... 'CTYPE2': 'HPLT-TAN', 'CUNIT2': 'deg', 'CDELT2': 0.5, 'CRPIX2': 2, 'CRVAL2': 0.5, 'NAXIS2': 4, ... 'CTYPE3': 'HPLN-TAN', 'CUNIT3': 'deg', 'CDELT3': 0.4, 'CRPIX3': 2, 'CRVAL3': 1, 'NAXIS3': 3} >>> input_wcs = astropy.wcs.WCS(wcs_input_dict) Now that we have a data array and a corresponding WCS object, we can create an `~ndcube.NDCube` instance by doing:: >>> from ndcube import NDCube >>> my_cube = NDCube(data, input_wcs) The data array is stored in the ``my_cube.data`` attribute while the WCS object is stored in the ``my_cube.wcs`` attribute. However, when manipulating/slicing the data it is better to slice the object as a whole. (See section on :ref:`ndcube_slicing`.) So the ``.data`` attribute should only be used to access specific value(s) in the data. Another thing to note is that as part of the initialization, the WCS object is converted from an `astropy.wcs.WCS` to an `ndcube.utils.wcs.WCS` object which has some additional features for tracking "missing axes", etc. (See section on :ref:`missing_axes`.) Thanks to the fact that `~ndcube.NDCube` is subclassed from `astropy.nddata.NDData`, you can also supply additional data to the `~ndcube.NDCube` instance. These include: metadata (`dict` or dict-like) located at `NDCube.meta`; a data mask (boolean `numpy.ndarray`) located at `NDCube.mask` marking, for example, reliable and unreliable pixels; an uncertainty array (`numpy.ndarray`) located at `NDCube.uncertainty` describing the uncertainty of each data array value; and a unit (`astropy.units.Unit` or unit `str`). For example:: >>> mask = np.zeros_like(my_cube.data, dtype=bool) >>> meta = {"Description": "This is example NDCube metadata."} >>> my_cube = NDCube(data, input_wcs, uncertainty=np.sqrt(data), ... mask=mask, meta=meta, unit=None) INFO: uncertainty should have attribute uncertainty_type. [astropy.nddata.nddata] N.B. The above message is due to the fact that an uncertainty supplied to `astropy.nddata.NDData` is recommended to have an ``uncertainty_type`` attribute giving a string describing the type of uncertainty. However, this is not required.
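If you would rather avoid the message altogether, one option is to wrap the array in one of astropy's uncertainty classes, which carry the recommended ``uncertainty_type`` attribute. Below is a minimal sketch using `~astropy.nddata.StdDevUncertainty`, assuming the uncertainties represent standard deviations:: >>> from astropy.nddata import StdDevUncertainty >>> my_cube = NDCube(data, input_wcs, uncertainty=StdDevUncertainty(np.sqrt(data)), ... mask=mask, meta=meta, unit=None)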
Dimensions ---------- `~ndcube.NDCube` has useful properties for inspecting its data shape and axis types, `~ndcube.NDCube.dimensions` and `~ndcube.NDCube.world_axis_physical_types`:: >>> my_cube.dimensions >>> my_cube.world_axis_physical_types ('custom:pos.helioprojective.lon', 'custom:pos.helioprojective.lat', 'em.wl') `~ndcube.NDCube.dimensions` returns an `~astropy.units.Quantity` of pixel units giving the length of each dimension in the `~ndcube.NDCube` while `~ndcube.NDCube.world_axis_physical_types` returns an iterable of strings denoting the type of physical property represented by each axis. The axis names are in accordance with the International Virtual Observatory Alliance (IVOA) UCD1+ controlled vocabulary. Here the shape and axis types are given in data order, not WCS order. .. _ndcube_slicing: Slicing ------- Arguably NDCube's most powerful capability is its slicing. Slicing an `~ndcube.NDCube` instance using the standard slicing notation allows users to access sub-regions of their data while simultaneously slicing not only the other array attributes (e.g. uncertainty, mask, etc.) but also the WCS object. This ensures that even though the data array has changed size and shape, each array element will still correspond to the same real world coordinates as they did before. An example of how to slice a 3-D `~ndcube.NDCube` object is:: >>> my_cube_roi = my_cube[3:5, 10:100, 30:37] Slicing can also reduce the dimension of an `~ndcube.NDCube`, e.g.:: >>> my_2d_cube = my_cube[0, 10:100, 30:37] In addition to slicing by index, `~ndcube.NDCube` supports a basic version of slicing/indexing by real world coordinates via the `~ndcube.NDCube.crop_by_coords` method. This takes a list of `astropy.units.Quantity` instances representing the minimum real world coordinates of the region of interest in each dimension. The order of the coordinates must be the same as the order of the data axes. A second iterable of `~astropy.units.Quantity` must also be provided which gives the widths of the region of interest in each data axis:: >>> import astropy.units as u >>> my_cube_roi = my_cube.crop_by_coords([0.7*u.deg, 1.3e-5*u.deg, 1.04e-9*u.m], ... [0.6*u.deg, 1.*u.deg, 0.08e-9*u.m]) This method does not rebin or interpolate the data if the region of interest does not perfectly map onto the array's "pixel" grid. Instead it translates from real world to pixel coordinates and rounds to the nearest integer pixel before indexing/slicing the `~ndcube.NDCube` instance. Therefore it should be noted that slightly different inputs to this method can result in the same output. .. _missing_axes: Missing Axes ------------ Some WCS axis types are coupled. For example, the helioprojective latitude and longitude of the Sun as viewed by a camera on a satellite orbiting Earth do not map independently to the pixel grid. Instead, the longitude changes as we move vertically along the same x-position if that single x-position is aligned anywhere other than perfectly north-south along the Sun's central meridian. The same is true of the latitude for any y-pixel position not perfectly aligned with the Sun's equator. Therefore, both the latitude and longitude must be known to derive the pixel position along a single spatial axis and vice versa. However, there are occasions when a data array may only contain one spatial axis, e.g. data from a slit-spectrograph. In this case, simply extracting the corresponding latitude or longitude axis from the WCS object would cause the translations to break.
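This coupling can be illustrated with the ``input_wcs`` object defined earlier. The sketch below converts two pixel positions that share the same x-position but different y-positions (note that `astropy.wcs.WCS.all_pix2world` takes its arguments in WCS axis order, i.e. wavelength, latitude, longitude, the reverse of the data axes; the exact values depend on the projection, so the comparison is skipped by the doctests):: >>> # Same longitude (x) pixel, two different latitude (y) pixels. >>> wave_a, lat_a, lon_a = input_wcs.all_pix2world(0, 0, 0, 0) >>> wave_b, lat_b, lon_b = input_wcs.all_pix2world(0, 3, 0, 0) >>> # The longitudes differ slightly, even though the x-pixel is the same. >>> float(lon_a) == float(lon_b) # doctest: +SKIP False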
To deal with this scenario, `~ndcube.NDCube` supports "missing" WCS axes. An additional attribute is added to the WCS object (`NDCube.wcs.missing_axes`) which is a list of `bool` type indicating which WCS axes do not have a corresponding data axis. This allows translation information on coupled axes to persist even if the data axes do not. This feature also makes it possible for `~ndcube.NDCube` to seamlessly reduce the data dimensionality via slicing. In the majority of cases a user will not need to worry about this feature. But it is useful to be aware of, as many of the coordinate transformation functionalities of `~ndcube.NDCube` are only made possible by the missing axis feature. Extra Coordinates ----------------- In the case of some datasets, there may be additional translations between the array elements and real world coordinates that are not included in the WCS. Consider a 3-D data cube from a rastering slit-spectrograph instrument. The first axis corresponds to the x-position of the slit as it steps across a region of interest in a given pattern. The second corresponds to latitude along the slit. And the third axis corresponds to wavelength. However, the first axis also corresponds to time, as it takes time for the slit to move and then take another exposure. It would be very useful to have the measurement times also associated with the x-axis. However, the WCS may only handle one translation per axis. Fortunately, `~ndcube.NDCube` has a solution to this. Values at integer (pixel) steps along an axis can be stored within the object and accessed via the `~ndcube.NDCube.extra_coords` property. To attach extra coordinates to an `~ndcube.NDCube` instance, provide an iterable of tuples of the form (`str`, `int`, `~astropy.units.Quantity` or array-like) during instantiation. The 0th entry gives the name of the coordinate, the 1st entry gives the data axis to which the extra coordinate corresponds, and the 2nd entry gives the value of that coordinate at each pixel along the axis. So to add timestamps along the 0th axis of ``my_cube`` we do:: >>> from datetime import datetime, timedelta >>> # Define our timestamps.  Must be same length as data axis. >>> axis_length = int(my_cube.dimensions[0].value) >>> timestamps = [datetime(2000, 1, 1)+timedelta(minutes=i) ...               for i in range(axis_length)] >>> extra_coords_input = [("time", 0, timestamps)] >>> # Generate NDCube as above, except now set extra_coords kwarg. >>> my_cube = NDCube(data, input_wcs, uncertainty=np.sqrt(data), ...                  mask=mask, meta=meta, unit=None, ...                  extra_coords=extra_coords_input) INFO: uncertainty should have attribute uncertainty_type. [astropy.nddata.nddata] The `~ndcube.NDCube.extra_coords` property returns a dictionary where each key is a coordinate name entered by the user. The value of each key is itself another dictionary with keys ``'axis'`` and ``'value'`` giving the corresponding data axis number and coordinate value at each pixel as supplied by the user:: >>> my_cube.extra_coords # doctest: +SKIP {'time': {'axis': 0, 'value': [datetime.datetime(2000, 1, 1, 0, 0), datetime.datetime(2000, 1, 1, 0, 1), datetime.datetime(2000, 1, 1, 0, 2)]}} Just like the data array and the WCS object, the extra coordinates are sliced automatically when the `~ndcube.NDCube` instance is sliced.
So if we take the first slice of ``my_cube`` in the 0th axis, the extra time coordinate will only contain the value from that slice:: >>> my_cube[0].extra_coords # doctest: +SKIP {'time': {'axis': None, 'value': datetime.datetime(2000, 1, 1, 0, 0)}} Note that the ``axis`` value is now ``None`` because the dimensionality of the `~ndcube.NDCube` has been reduced via the slicing:: >>> my_cube[0].dimensions and so the ``time`` extra coordinate no longer corresponds to a data axis. This would not have been the case if we had done the slicing such that the length of the 0th axis was >1:: >>> my_cube[0:2].dimensions >>> my_cube[0:2].extra_coords # doctest: +SKIP {'time': {'value': [datetime.datetime(2000, 1, 1, 0, 0), datetime.datetime(2000, 1, 1, 0, 1)], 'axis': 0}} Plotting -------- To quickly and easily visualize N-D data, `~ndcube.NDCube` provides a simple-to-use, yet powerful plotting method, `~ndcube.NDCube.plot`, which produces a sensible visualization based on the dimensionality of the data. It is intended to be a useful quicklook tool and not a replacement for high quality plots or animations, e.g. for publications. The plot method can be called very simply, like so:: >>> my_cube.plot() # doctest: +SKIP The type of visualization returned depends on the dimensionality of the data within the `~ndcube.NDCube` object. For 1-D data a line plot is produced, similar to `matplotlib.pyplot.plot`. For 2-D data, an image is produced similar to that of `matplotlib.pyplot.imshow`, while for >2-D data, a `sunpy.visualization.imageanimator.ImageAnimatorWCS` object is returned. This displays a 2-D image with sliders for each additional dimension which allow the user to animate through the different values of each dimension and see the effect in the 2-D image. No args are required. The necessary information to generate the plot is derived from the data and metadata in the `~ndcube.NDCube` itself. Setting the x and y ranges of the plot can be done simply by indexing the `~ndcube.NDCube` object itself to the desired region of interest and then calling the plot method, e.g.:: >>> my_cube[0, 10:100, :].plot() # doctest: +SKIP In addition, some optional kwargs can be used to customize the plot. The ``axis_ranges`` kwarg can be used to set the axes ticklabels. See the `~sunpy.visualization.imageanimator.ImageAnimatorWCS` documentation for more detail. However, if this is not set, the axis ticklabels are automatically derived in real world coordinates from the WCS object within the `~ndcube.NDCube`. By default the final two data dimensions are used for the plot axes in 2-D or greater visualizations, but this can be set by the user using the ``image_axes`` kwarg:: >>> my_cube.plot(image_axes=[0,1]) # doctest: +SKIP where the first entry in the list gives the index of the data axis to go on the x-axis, and the second entry gives the index of the data axis to go on the y-axis. In addition, the units of the axes or the data can be set by the ``unit_x_axis``, ``unit_y_axis``, and ``unit`` kwargs. However, if not set, these are derived from the `~ndcube.NDCube` wcs and unit attributes. Coordinate Transformations -------------------------- The fundamental point of the WCS system is the ability to easily translate between pixel and real world coordinates. For this purpose, `~ndcube.NDCube` provides convenience wrappers for the better known astropy functions, `astropy.wcs.WCS.all_pix2world` and `astropy.wcs.WCS.all_world2pix`.
The wrappers are `~ndcube.NDCube.pixel_to_world`, `~ndcube.NDCube.world_to_pixel`, and `~ndcube.NDCube.axis_world_coords`.
It is highly recommended that when using `~ndcube.NDCube` these convenience wrappers are used rather than the original astropy functions, for a few reasons: they can track house-keeping data, are aware of "missing" WCS axes, are unit-aware, etc.
To use `~ndcube.NDCube.pixel_to_world`, simply input `~astropy.units.Quantity` objects with pixel units.
Each `~astropy.units.Quantity` corresponds to an axis, so the number of `~astropy.units.Quantity` objects should equal the number of data axes.
Also, the order of the quantities should correspond to the data axes' order, not the WCS order.
The nth element of each `~astropy.units.Quantity` describes the pixel coordinate in that axis.
For example, if we wanted to transform the pixel coordinates of the pixel (2, 3, 4) in ``my_cube`` we would do::

    >>> import astropy.units as u
    >>> real_world_coords = my_cube.pixel_to_world(2*u.pix, 3*u.pix, 4*u.pix)

`~ndcube.NDCube.pixel_to_world` returns a list of Quantities similar to those that were input, except that they are now in real world coordinates::

    >>> real_world_coords  # doctest: +SKIP
    [<Quantity 1.40006968 deg>, <Quantity 1.49986193 deg>, <Quantity 1.1e-09 m>]

The exact units used are defined within the `~ndcube.NDCube` instance's `~ndcube.utils.wcs.WCS` object.
To convert two pixels with pixel coordinates (2, 3, 4) and (5, 6, 7), we would call pixel_to_world like so::

    >>> real_world_coords = my_cube.pixel_to_world([2, 5]*u.pix, [3, 6]*u.pix, [4, 7]*u.pix)

As can be seen, since each `~astropy.units.Quantity` describes a different pixel coordinate of the same number of pixels, the lengths of each `~astropy.units.Quantity` must be the same.
Once again, the coordinates of the nth pixel are given by the nth element of each of the `~astropy.units.Quantity` objects returned.

Using `~ndcube.NDCube.world_to_pixel` to convert real world coordinates to pixel coordinates is exactly the same, but in reverse.
This time the input `~astropy.units.Quantity` objects must be in real world coordinates compatible with those defined in the `~ndcube.NDCube` instance's `~ndcube.utils.wcs.WCS` object.
The output is a list of `~astropy.units.Quantity` objects in pixel units::

    >>> pixel_coords = my_cube.world_to_pixel(
    ...     1.400069678 * u.deg, 1.49986193 * u.deg, 1.10000000e-09 * u.m)
    >>> pixel_coords  # doctest: +SKIP
    [<Quantity 2. pix>, <Quantity 3. pix>, <Quantity 4. pix>]

Note that both `~ndcube.NDCube.pixel_to_world` and `~ndcube.NDCube.world_to_pixel` can handle non-integer pixels.
Moreover, they can also handle pixels beyond the bounds of the `~ndcube.NDCube` and even negative pixels.
This is because the WCS translations should be valid anywhere in space, and not just within the field of view of the `~ndcube.NDCube`.
This capability has many useful applications, for example, in comparing observations from different instruments with overlapping fields of view.

There are times, however, when you only want to know the real world coordinates of the `~ndcube.NDCube` field of view.
To make this easy, `~ndcube.NDCube` has another coordinate transformation method, `~ndcube.NDCube.axis_world_coords`.
This method returns the real world coordinates for each pixel along a given data axis.
So in the case of ``my_cube``, if we wanted the wavelength axis we could call::

    >>> my_cube.axis_world_coords(2)  # doctest: +SKIP
    <Quantity [1.02e-09, 1.04e-09, 1.06e-09, 1.08e-09, 1.10e-09] m>

Note we set ``axes`` to ``2`` since ``axes`` is defined in data axis order.
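To see where these wavelength values come from, recall that for a simple linear FITS-WCS axis the world coordinate of zero-based pixel ``p`` is ``CRVAL + (p + 1 - CRPIX) * CDELT``.
The sketch below reproduces the wavelength axis by hand, assuming the example WCS used for ``my_cube`` (``CRVAL1=10``, ``CRPIX1=0``, ``CDELT1=0.2``, in Angstrom); it is purely illustrative and ignores the projection coupling that applies to the spatial axes::

    >>> # Wavelength axis header values from the example WCS, in Angstrom.
    >>> crval, crpix, cdelt = 10, 0, 0.2
    >>> wavelengths = [crval + (p + 1 - crpix) * cdelt for p in range(5)] * u.Angstrom
    >>> wavelengths.to(u.m)  # doctest: +SKIP
    <Quantity [1.02e-09, 1.04e-09, 1.06e-09, 1.08e-09, 1.10e-09] m>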
We can also define the axis using any unique substring from the axis names defined in `ndcube.NDCube.world_axis_physical_types`::

    >>> my_cube.world_axis_physical_types
    ('custom:pos.helioprojective.lon', 'custom:pos.helioprojective.lat', 'em.wl')
    >>> # Since 'wl' is unique to the wavelength axis name, let's use that.
    >>> my_cube.axis_world_coords('wl')  # doctest: +SKIP
    <Quantity [1.02e-09, 1.04e-09, 1.06e-09, 1.08e-09, 1.10e-09] m>

Notice how this returns the same result as when we set ``axes`` to the corresponding data axis number.

As discussed above, some WCS axes are not independent.
For those axes, `~ndcube.NDCube.axis_world_coords` returns a `~astropy.units.Quantity` with as many dimensions as there are dependent axes.
For example, helioprojective longitude and latitude are dependent.
Therefore if we ask for longitude, we will get back a 2D `~astropy.units.Quantity` whose shape is given by the lengths of the longitude and latitude axes.
For example::

    >>> longitude = my_cube.axis_world_coords('lon')
    >>> my_cube.dimensions
    <Quantity [3., 4., 5.] pix>
    >>> longitude.shape
    (3, 4)

It is also possible to request more than one axis's world coordinates by setting ``axes`` to an iterable of data axis numbers and/or axis type strings::

    >>> my_cube.axis_world_coords(2, 'lon')  # doctest: +SKIP
    (<Quantity [1.02e-09, 1.04e-09, 1.06e-09, 1.08e-09, 1.10e-09] m>, <Quantity [[...]] deg>)

Notice that the axes' coordinates have been returned in the same order in which they were requested.
Finally, if the user wants the world coordinates for all the axes, ``axes`` can be set to ``None``, which is in fact the default::

    >>> my_cube.axis_world_coords()  # doctest: +SKIP
    (<Quantity [[...]] deg>, <Quantity [[...]] deg>, <Quantity [1.02e-09, 1.04e-09, 1.06e-09, 1.08e-09, 1.10e-09] m>)

By default `~ndcube.NDCube.axis_world_coords` returns the coordinates at the center of each pixel.
However, the pixel edges can be obtained by setting the ``edges`` kwarg to ``True``.
For example::

    >>> my_cube.axis_world_coords(edges=True)  # doctest: +SKIP
    (<Quantity [[...]] deg>, <Quantity [[...]] deg>, <Quantity [1.01e-09, 1.03e-09, 1.05e-09, 1.07e-09, 1.09e-09, 1.11e-09] m>)

As stated previously, `~ndcube.NDCube` is only written to handle single arrays described by single WCS instances.
For cases where data is made up of multiple arrays, each described by different WCS translations, `ndcube` has another class, `~ndcube.NDCubeSequence`, which we will discuss in the next section.

././@PaxHeader0000000000000000000000000000003300000000000011451 xustar000000000000000027 mtime=1605787057.150476 ndcube-1.4.2/docs/ndcubesequence.rst0000644000175100001640000004746600000000000017732 0ustar00vstsdocker00000000000000.. _ndcubesequence:

==============
NDCubeSequence
==============

`~ndcube.NDCubeSequence` is a class for handling multiple `~ndcube.NDCube` objects as though they were one contiguous data set.
Another way of thinking about it is that `~ndcube.NDCubeSequence` provides the ability to manipulate a data set described by multiple separate WCS transformations.
Regarding implementation, an `~ndcube.NDCubeSequence` instance is effectively a list of `~ndcube.NDCube` instances with some helper methods attached.

Initialization
--------------

To initialize the most basic `~ndcube.NDCubeSequence` object, all you need is a list of `~ndcube.NDCube` instances.
So let us first define three 3-D NDCubes for slit-spectrograph data as we did in the NDCube section of this tutorial.
First we define the data arrays and WCS objects::

    >>> # Define data for cubes
    >>> import numpy as np
    >>> data0 = np.ones((3, 4, 5))
    >>> data1 = data0 * 2
    >>> data2 = data1 * 2
    >>> # Define WCS object for all cubes.
    >>> import astropy.wcs
    >>> wcs_input_dict = {
    ...     'CTYPE1': 'WAVE    ', 'CUNIT1': 'Angstrom', 'CDELT1': 0.2, 'CRPIX1': 0, 'CRVAL1': 10, 'NAXIS1': 5,
    ...     'CTYPE2': 'HPLT-TAN', 'CUNIT2': 'deg', 'CDELT2': 0.5, 'CRPIX2': 2, 'CRVAL2': 0.5, 'NAXIS2': 4,
    ...     'CTYPE3': 'HPLN-TAN', 'CUNIT3': 'deg', 'CDELT3': 0.4, 'CRPIX3': 2, 'CRVAL3': 1, 'NAXIS3': 3}
    >>> input_wcs = astropy.wcs.WCS(wcs_input_dict)

Let's also define an extra coordinate of time assigned to the 0th cube data axis, and another ``label`` coordinate assigned to the cubes as wholes.
(See the NDCube section of this guide for more detail.)
Let the slices along the 0th axis be separated by one minute, with the slices in each cube followed directly in time by those in the next::

    >>> from datetime import datetime, timedelta
    >>> timestamps0 = [datetime(2000, 1, 1)+timedelta(minutes=i) for i in range(data0.shape[0])]
    >>> timestamps1 = [timestamps0[-1]+timedelta(minutes=i+1) for i in range(data1.shape[0])]
    >>> timestamps2 = [timestamps1[-1]+timedelta(minutes=i+1) for i in range(data2.shape[0])]
    >>> extra_coords_input0 = [("time", 0, timestamps0), ("label", None, "hello")]
    >>> extra_coords_input1 = [("time", 0, timestamps1), ("label", None, "world")]
    >>> extra_coords_input2 = [("time", 0, timestamps2), ("label", None, "!")]

Now we can define our cubes::

    >>> from ndcube import NDCube
    >>> from ndcube import NDCubeSequence
    >>> # Define a mask such that all array elements are unmasked.
    >>> mask = np.empty(data0.shape, dtype=object)
    >>> mask[:, :, :] = False
    >>> cube_meta = {"Description": "This is example NDCube metadata."}
    >>> my_cube0 = NDCube(data0, input_wcs, uncertainty=np.sqrt(data0),
    ...                   mask=mask, meta=cube_meta, unit=None,
    ...                   extra_coords=extra_coords_input0)
    INFO: uncertainty should have attribute uncertainty_type. [astropy.nddata.nddata]
    >>> my_cube1 = NDCube(data1, input_wcs, uncertainty=np.sqrt(data1),
    ...                   mask=mask, meta=cube_meta, unit=None,
    ...                   extra_coords=extra_coords_input1)
    INFO: uncertainty should have attribute uncertainty_type. [astropy.nddata.nddata]
    >>> my_cube2 = NDCube(data2, input_wcs, uncertainty=np.sqrt(data2),
    ...                   mask=mask, meta=cube_meta, unit=None,
    ...                   extra_coords=extra_coords_input2)
    INFO: uncertainty should have attribute uncertainty_type. [astropy.nddata.nddata]

N.B. The above messages appear because an ``uncertainty`` passed to `astropy.nddata.NDData` is recommended to have an ``uncertainty_type`` attribute giving a string describing the type of uncertainty.
However, this is not required.
Also note that, for simplicity, we have used the same WCS translations in each `~ndcube.NDCube` instance above.
It would be more common for each `~ndcube.NDCube` instance to have a different WCS, and in that case the usefulness of `~ndcube.NDCubeSequence` is more pronounced.
Nonetheless, this case can still be used to adequately demonstrate the capabilities of `~ndcube.NDCubeSequence`.

Finally, creating an `~ndcube.NDCubeSequence` is simple::

    >>> my_sequence = NDCubeSequence([my_cube0, my_cube1, my_cube2])

While each `~ndcube.NDCube` in the `~ndcube.NDCubeSequence` can have its own meta, it is also possible to supply additional metadata upon initialization of the `~ndcube.NDCubeSequence`.
This metadata may be common to all sub-cubes or specific to the sequence rather than the sub-cubes.
It is input as a dictionary::

    >>> my_sequence_metadata = {"Description": "This is some sample NDCubeSequence metadata."}
    >>> my_sequence = NDCubeSequence([my_cube0, my_cube1, my_cube2],
    ...                              meta=my_sequence_metadata)

and stored in the ``my_sequence.meta`` attribute.
Meanwhile, the `~ndcube.NDCube` instances are stored in ``my_sequence.data``.
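As a quick check, assuming the ``my_sequence`` just defined, we can inspect both attributes directly (the ``meta`` output is shown for orientation only)::

    >>> my_sequence.meta  # doctest: +SKIP
    {'Description': 'This is some sample NDCubeSequence metadata.'}
    >>> len(my_sequence.data)
    3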
However, analogously to `~ndcube.NDCube`, it is strongly advised that the data be manipulated by slicing the `~ndcube.NDCubeSequence` rather than by manually delving into the ``.data`` attribute.
For more explanation, see the section on :ref:`sequence_slicing`.

Common Axis
-----------

It is possible (although not required) to set a common axis of the `~ndcube.NDCubeSequence`.
A common axis is defined as the axis of the sub-cubes parallel to the axis of the sequence.
For example, assume the 0th axis of the sub-cubes ``my_cube0``, ``my_cube1`` and ``my_cube2`` in the `~ndcube.NDCubeSequence` ``my_sequence`` represents time, as we have indicated by setting the ``time`` extra coordinate.
In this case, ``my_cube0`` represents observations taken from a period directly before ``my_cube1`` and ``my_cube2``, and the sub-cubes are ordered chronologically in the sequence.
Then moving along the 0th axis of one sub-cube and moving along the sequence axis from one cube to the next both represent movement in time.
The difference is simply the size of the steps.
Therefore it can be said that the 0th axis of the sub-cubes is common to the sequence.

To define a common axis, set the ``common_axis`` kwarg during initialization of the `~ndcube.NDCubeSequence` to the desired data axis number::

    >>> my_sequence = NDCubeSequence([my_cube0, my_cube1, my_cube2],
    ...                              meta=my_sequence_metadata, common_axis=0)

Defining a common axis enables the full range of the `~ndcube.NDCubeSequence` features to be utilized, including `ndcube.NDCubeSequence.plot`, `ndcube.NDCubeSequence.common_axis_extra_coords`, and `ndcube.NDCubeSequence.index_as_cube`.
See the following sections for more details on these features.

.. _dimensions:

Dimensions
----------

Analogous to `ndcube.NDCube.dimensions`, there is also a `ndcube.NDCubeSequence.dimensions` property for easily inspecting the shape of an `~ndcube.NDCubeSequence` instance::

    >>> my_sequence.dimensions
    (<Quantity 3. pix>, <Quantity 3. pix>, <Quantity 4. pix>, <Quantity 5. pix>)

Slightly differently to `ndcube.NDCube.dimensions`, `ndcube.NDCubeSequence.dimensions` returns a tuple of `astropy.units.Quantity` instances with pixel units, giving the length of each axis.
This is in contrast to the single `~astropy.units.Quantity` returned by `~ndcube.NDCube`.
This is because `~ndcube.NDCubeSequence` supports sub-cubes of different lengths along the common axis if it is set.
In that case, the corresponding quantity in the dimensions tuple will have a length greater than 1 and list the length of each sub-cube along the common axis.

Equivalent to `ndcube.NDCube.world_axis_physical_types`, `ndcube.NDCubeSequence.world_axis_physical_types` returns a tuple of the physical axis types.
The same IVOA UCD1+ controlled words are used for the cube axes as are used in `ndcube.NDCube.world_axis_physical_types`.
The sequence axis is given the label ``'meta.obs.sequence'`` as it is the IVOA UCD1+ controlled word that best describes it.
To call, simply do::

    >>> my_sequence.world_axis_physical_types
    ('meta.obs.sequence', 'custom:pos.helioprojective.lon', 'custom:pos.helioprojective.lat', 'em.wl')

.. _sequence_slicing:

Slicing
-------

As with `~ndcube.NDCube`, slicing an `~ndcube.NDCubeSequence` using the standard slicing API simultaneously slices the data arrays, WCS objects, masks, uncertainty arrays, etc. in each relevant sub-cube.
For example, say we have three NDCubes in an `~ndcube.NDCubeSequence`, each of shape ``(3, 4, 5)``.
Say we want to obtain a region of interest between the 1st and 2nd pixels (inclusive) in the 2nd dimension and the 1st and 3rd pixels (inclusive) in the 3rd dimension of the 0th slice along the 0th cube axis, in only the 1st (not 0th) and 2nd sub-cubes of the sequence.
This would be a cumbersome slicing operation if treating the sub-cubes independently.
(It would be made even worse without the power of `~ndcube.NDCube`, where the data arrays, WCS objects, masks, uncertainty arrays, etc. would all have to be sliced independently!)
However, with `~ndcube.NDCubeSequence` this becomes as simple as indexing a single array::

    >>> regions_of_interest_in_sequence = my_sequence[1:3, 0, 1:3, 1:4]
    >>> regions_of_interest_in_sequence.dimensions
    (<Quantity 2. pix>, <Quantity 2. pix>, <Quantity 3. pix>)
    >>> regions_of_interest_in_sequence.world_axis_physical_types
    ('meta.obs.sequence', 'custom:pos.helioprojective.lat', 'em.wl')

This returns a new `~ndcube.NDCubeSequence` with two 2-D NDCubes, one for each region of interest taken from the 0th slice along the 0th cube axis of the 1st and 2nd sub-cubes.
If our regions of interest only came from a single sub-cube - say the 0th and 1st slices along the 0th axis of the 1st sub-cube - an NDCube is returned::

    >>> roi_from_single_subcube = my_sequence[1, 0:2, 1:3, 1:4]
    >>> roi_from_single_subcube.dimensions
    <Quantity [2., 2., 3.] pix>
    >>> roi_from_single_subcube.world_axis_physical_types
    ('custom:pos.helioprojective.lon', 'custom:pos.helioprojective.lat', 'em.wl')

If a common axis has been defined for the `~ndcube.NDCubeSequence`, one can think of it as a contiguous data set with different sections along the common axis described by different WCS translations.
Therefore it would be useful to be able to index the sequence as though it were one single cube.
This can be achieved with the `ndcube.NDCubeSequence.index_as_cube` property.
In our above example, ``my_sequence`` has a shape of ``(<Quantity 3. pix>, <Quantity 3. pix>, <Quantity 4. pix>, <Quantity 5. pix>)`` and a common axis of ``0``.
Therefore we can think of ``my_sequence`` as having an effective cube-like shape of ``(<Quantity 9. pix>, <Quantity 4. pix>, <Quantity 5. pix>)``, where the first sub-cube extends along the 0th cube-like axis from 0 to 3, the second from 3 to 6 and the third from 6 to 9.
Say we want to extract the same region of interest as above, i.e. ``my_sequence[1, 0:2, 1:3, 1:4]``.
Then this can be achieved by entering::

    >>> roi_from_single_subcube = my_sequence.index_as_cube[3:5, 1:3, 1:4]
    >>> roi_from_single_subcube.dimensions
    <Quantity [2., 2., 3.] pix>
    >>> roi_from_single_subcube.world_axis_physical_types
    ('custom:pos.helioprojective.lon', 'custom:pos.helioprojective.lat', 'em.wl')

In this case the entire region came from a single sub-cube.
However, `~ndcube.NDCubeSequence.index_as_cube` also works when the region of interest spans multiple sub-cubes in the sequence.
Say we want the same region of interest in the 2nd and 3rd cube dimensions from the final slice along the 0th cube axis of the 0th sub-cube, the whole 1st sub-cube and the 0th slice of the 2nd sub-cube.
In cube-like indexing this corresponds to slices 2 to 7 along the 0th cube axis::

    >>> roi_across_subcubes = my_sequence.index_as_cube[2:7, 1:3, 1:4]
    >>> roi_across_subcubes.dimensions
    (<Quantity 3. pix>, <Quantity [1., 3., 1.] pix>, <Quantity 2. pix>, <Quantity 3. pix>)
    >>> roi_across_subcubes.world_axis_physical_types
    ('meta.obs.sequence', 'custom:pos.helioprojective.lon', 'custom:pos.helioprojective.lat', 'em.wl')

Notice that since the sub-cubes are now of different lengths along the common axis, the corresponding `~astropy.units.Quantity` gives the lengths of each cube individually.
See the section on :ref:`dimensions` for more detail.
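The bookkeeping that `~ndcube.NDCubeSequence.index_as_cube` performs can be pictured with a little arithmetic.
The sketch below (illustrative only, not the library's implementation) converts a cube-like index along the common axis into a (sequence index, sub-cube index) pair, given the sub-cubes' lengths along the common axis::

    >>> def cube_like_to_sequence_index(cube_like_index, common_axis_lengths):
    ...     # Walk through the sub-cube lengths until the cumulative
    ...     # total passes the requested cube-like index.
    ...     for seq_index, length in enumerate(common_axis_lengths):
    ...         if cube_like_index < length:
    ...             return seq_index, cube_like_index
    ...         cube_like_index -= length
    ...     raise IndexError("cube-like index out of range")
    >>> # Cube-like index 4 lands in the 1st slice of the 1st sub-cube.
    >>> cube_like_to_sequence_index(4, [3, 3, 3])
    (1, 1)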
Cube-like Dimensions
--------------------

To help with handling an `~ndcube.NDCubeSequence` with a common axis as if it were a single cube, there exist cube-like equivalents of the `~ndcube.NDCubeSequence.dimensions` and `~ndcube.NDCubeSequence.world_axis_physical_types` properties.
They are intuitively named `~ndcube.NDCubeSequence.cube_like_dimensions` and `~ndcube.NDCubeSequence.cube_like_world_axis_physical_types`.
These give the lengths and physical types of the axes as if the data were stored in a single `~ndcube.NDCube`.
So in the case of ``my_sequence``, with three sub-cubes, each with a length of 3 along the common axis, we get::

    >>> my_sequence.cube_like_dimensions
    <Quantity [9., 4., 5.] pix>
    >>> my_sequence.cube_like_world_axis_physical_types
    ('custom:pos.helioprojective.lon', 'custom:pos.helioprojective.lat', 'em.wl')

Note that `~ndcube.NDCubeSequence.cube_like_dimensions` returns a single `~astropy.units.Quantity` in pixel units, as if it were `ndcube.NDCube.dimensions`.
This is in contrast to `ndcube.NDCubeSequence.dimensions`, which returns a `tuple` of `~astropy.units.Quantity`.

Common Axis Extra Coordinates
-----------------------------

If a common axis is defined, it may be useful to view the extra coordinates along that common axis defined by each of the sub-cubes' `~ndcube.NDCube.extra_coords` as if the `~ndcube.NDCubeSequence` were one contiguous cube.
This can be done using the ``common_axis_extra_coords`` property::

    >>> my_sequence.common_axis_extra_coords
    {'time': array([datetime.datetime(2000, 1, 1, 0, 0),
                    datetime.datetime(2000, 1, 1, 0, 1),
                    datetime.datetime(2000, 1, 1, 0, 2),
                    datetime.datetime(2000, 1, 1, 0, 3),
                    datetime.datetime(2000, 1, 1, 0, 4),
                    datetime.datetime(2000, 1, 1, 0, 5),
                    datetime.datetime(2000, 1, 1, 0, 6),
                    datetime.datetime(2000, 1, 1, 0, 7),
                    datetime.datetime(2000, 1, 1, 0, 8)], dtype=object)}

This returns a dictionary where each key gives the name of a coordinate.
The value of each key is the values of that coordinate at each pixel along the common axis.
Since all these coordinates must be along the common axis, it is not necessary to supply axis information as it is with `ndcube.NDCube.extra_coords`, making `ndcube.NDCubeSequence.common_axis_extra_coords` simpler.
Because this property has a functional form and calculates the dictionary each time from the constituent sub-cubes' `ndcube.NDCube.extra_coords` attributes, `ndcube.NDCubeSequence.common_axis_extra_coords` is effectively sliced when the `~ndcube.NDCubeSequence` is sliced, e.g.::

    >>> my_sequence[1:3].common_axis_extra_coords
    {'time': array([datetime.datetime(2000, 1, 1, 0, 3),
                    datetime.datetime(2000, 1, 1, 0, 4),
                    datetime.datetime(2000, 1, 1, 0, 5),
                    datetime.datetime(2000, 1, 1, 0, 6),
                    datetime.datetime(2000, 1, 1, 0, 7),
                    datetime.datetime(2000, 1, 1, 0, 8)], dtype=object)}

Sequence Axis Extra Coordinates
-------------------------------

Analogous to `~ndcube.NDCubeSequence.common_axis_extra_coords`, it is also possible to access the extra coordinates that are not assigned to any `~ndcube.NDCube` data axis via the `ndcube.NDCubeSequence.sequence_axis_extra_coords` property.
Whereas `~ndcube.NDCubeSequence.common_axis_extra_coords` returns all the extra coords with an ``'axis'`` value equal to the common axis, `~ndcube.NDCubeSequence.sequence_axis_extra_coords` returns all extra coords with an ``'axis'`` value of ``None``.
Another way of thinking about this, when there is no common axis set, is that they are assigned to the sequence axis; hence the property's name.
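The distinction can be expressed compactly.
The following sketch (illustrative only, not how the library implements the property) gathers every extra coordinate whose ``'axis'`` is ``None`` from the sub-cubes::

    >>> {name: [cube.extra_coords[name]["value"] for cube in my_sequence.data]  # doctest: +SKIP
    ...  for name, coord in my_sequence.data[0].extra_coords.items()
    ...  if coord["axis"] is None}
    {'label': ['hello', 'world', '!']}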
For ``my_sequence``, the property itself returns::

    >>> my_sequence.sequence_axis_extra_coords
    {'label': array(['hello', 'world', '!'], dtype=object)}

Plotting
--------

Just like `~ndcube.NDCube`, `~ndcube.NDCubeSequence` provides simple but powerful plotting APIs to help users visualize their data.
Two plotting methods, `~ndcube.NDCubeSequence.plot` and `~ndcube.NDCubeSequence.plot_as_cube`, are provided, which correspond to the sequence and cube-like representations of the data, respectively.
These methods allow the sequence to be animated as though it were one contiguous `~ndcube.NDCube`.
Both methods have the same API and same kwargs as `ndcube.NDCube.plot`.
See the documentation for `ndcube.NDCube.plot` for more details.
The main substantive difference between them is how the axis inputs relate to the dimensionality of the data, i.e. they differ in the same way that the inputs to NDCubeSequence slicing and `~ndcube.NDCubeSequence.index_as_cube` differ.

Explode Along Axis
------------------

During analysis of some data - say of a stack of images - it may be necessary to make different fine-pointing adjustments to each image that aren't accounted for in the original WCS translations, e.g. due to satellite wobble.
If these changes are not describable with a single WCS object, it may be desirable to break up the N-D sub-cubes of an `~ndcube.NDCubeSequence` into a sequence of sub-cubes with dimension N-1.
This would enable a separate WCS object to be associated with each image and hence allow individual pointing adjustments.
Rather than manually dividing the datacubes up and deriving the corresponding WCS object for each exposure, `~ndcube.NDCubeSequence` provides a useful method, `~ndcube.NDCubeSequence.explode_along_axis`.
To call it, simply provide the number of the data cube axis along which you wish to break up the sub-cubes::

    >>> exploded_sequence = my_sequence.explode_along_axis(0)

Assuming we are using the same ``my_sequence`` as above, with dimensions ``(<Quantity 3. pix>, <Quantity 3. pix>, <Quantity 4. pix>, <Quantity 5. pix>)``, the ``exploded_sequence`` will be an `~ndcube.NDCubeSequence` of nine 2-D NDCubes, each with shape ``(<Quantity 4. pix>, <Quantity 5. pix>)``::

    >>> # Check old and new shapes of the sequence
    >>> my_sequence.dimensions
    (<Quantity 3. pix>, <Quantity 3. pix>, <Quantity 4. pix>, <Quantity 5. pix>)
    >>> exploded_sequence.dimensions
    (<Quantity 9. pix>, <Quantity 4. pix>, <Quantity 5. pix>)

Note that any cube axis can be input.
A common axis need not be defined.

Extracting Data Arrays
----------------------

It is possible that you may have procedures that are designed to operate on arrays instead of `~ndcube.NDCubeSequence` objects.
Therefore it may be useful to extract the data (or other array-like information such as ``uncertainty`` or ``mask``) in the `~ndcube.NDCubeSequence` into a single `~numpy.ndarray`.
A succinct way of doing this operation is to use Python's list comprehension features.
In the above examples we defined the ``my_sequence`` `~ndcube.NDCubeSequence` object::

    >>> # Print dimensions of my_sequence as a reminder
    >>> print(my_sequence.dimensions)
    (<Quantity 3. pix>, <Quantity 3. pix>, <Quantity 4. pix>, <Quantity 5. pix>)

In this section we will use this object to demonstrate extracting data arrays from `~ndcube.NDCubeSequence` objects.
For example, say we wanted to make a 4D array out of the data arrays within the `~ndcube.NDCube` instances of ``my_sequence``::

    >>> # Make a single 4D array of data in sequence.
    >>> data = np.stack([cube.data for cube in my_sequence.data])
    >>> print(data.shape)
    (3, 3, 4, 5)

If instead we want to define a 3D array where every `~ndcube.NDCube` in the `~ndcube.NDCubeSequence` is appended together, we can use `numpy.vstack`::

    >>> # Make a 3D array
    >>> data = np.vstack([cube.data for cube in my_sequence.data])
    >>> print(data.shape)
    (9, 4, 5)

Finally, we can also create 3D arrays by slicing the sub-cubes of the `~ndcube.NDCubeSequence`.
Here we take the 2nd slice along the 0th axis of each sub-cube and stack the results::

    >>> # Slice sequence to make 3D array
    >>> data = np.stack([cube[2].data for cube in my_sequence.data])
    >>> print(data.shape)
    (3, 4, 5)

././@PaxHeader0000000000000000000000000000003300000000000011451 xustar000000000000000027 mtime=1605787057.150476 ndcube-1.4.2/docs/rtd_requirements.txt0000644000175100001640000000015100000000000020330 0ustar00vstsdocker00000000000000# This file is used on ReadTheDocs to build the
# documentation and is separate from tox.ini and setup.cfg
././@PaxHeader0000000000000000000000000000003400000000000011452 xustar000000000000000028 mtime=1605787217.3871849 ndcube-1.4.2/docs/whatsnew/0000755000175100001640000000000000000000000016036 5ustar00vstsdocker00000000000000././@PaxHeader0000000000000000000000000000003300000000000011451 xustar000000000000000027 mtime=1605787057.150476 ndcube-1.4.2/docs/whatsnew/changelog.rst0000644000175100001640000000020100000000000020510 0ustar00vstsdocker00000000000000.. _changelog:

**************
Full Changelog
**************

.. include:: latest_changelog.txt
.. include:: ../../CHANGELOG.rst
././@PaxHeader0000000000000000000000000000003300000000000011451 xustar000000000000000027 mtime=1605787057.150476 ndcube-1.4.2/docs/whatsnew/index.rst0000644000175100001640000000013400000000000017675 0ustar00vstsdocker00000000000000***************
Release History
***************

.. toctree::
   :maxdepth: 1

   changelog
././@PaxHeader0000000000000000000000000000003400000000000011452 xustar000000000000000028 mtime=1605787217.3871849 ndcube-1.4.2/ndcube/0000755000175100001640000000000000000000000014506 5ustar00vstsdocker00000000000000././@PaxHeader0000000000000000000000000000003300000000000011451 xustar000000000000000027 mtime=1605787057.150476 ndcube-1.4.2/ndcube/__init__.py0000644000175100001640000000213500000000000016620 0ustar00vstsdocker00000000000000# Licensed under a 3-clause BSD style license - see LICENSE.rst
# Packages may add whatever they like to this file, but
# should keep this content at the top.
# ----------------------------------------------------------------------------
from ._sunpy_init import *  # noqa
# ----------------------------------------------------------------------------
# Enforce Python version check during package import.
# This is the same check as the one at the top of setup.py
import sys
from distutils.version import LooseVersion

__minimum_python_version__ = "3.6"

__all__ = ['NDCube', 'NDCubeSequence', "NDCollection"]


class UnsupportedPythonError(Exception):
    pass


if LooseVersion(sys.version) < LooseVersion(__minimum_python_version__):
    raise UnsupportedPythonError("ndcube does not support Python < {}"
                                 .format(__minimum_python_version__))

if not _SUNPY_SETUP_:  # noqa
    # For egg_info test builds to pass, put package imports here.
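    # NOTE: the imports below are deliberately deferred behind the
    # _SUNPY_SETUP_ guard so that setup-time tooling (e.g. egg_info) can
    # import the package for metadata without requiring the full runtime
    # dependency stack.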
from .ndcube import NDCube, NDCubeOrdered from .ndcube_sequence import NDCubeSequence from .ndcollection import NDCollection ././@PaxHeader0000000000000000000000000000003300000000000011451 xustar000000000000000027 mtime=1605787057.150476 ndcube-1.4.2/ndcube/_sunpy_init.py0000644000175100001640000000053400000000000017422 0ustar00vstsdocker00000000000000# Licensed under a 3-clause BSD style license - see LICENSE.rst __all__ = ['__version__'] # this indicates whether or not we are in the package's setup.py try: _SUNPY_SETUP_ except NameError: import builtins builtins._SUNPY_SETUP_ = False try: from .version import version as __version__ except ImportError: __version__ = '' ././@PaxHeader0000000000000000000000000000003400000000000011452 xustar000000000000000028 mtime=1605787217.3871849 ndcube-1.4.2/ndcube/data/0000755000175100001640000000000000000000000015417 5ustar00vstsdocker00000000000000././@PaxHeader0000000000000000000000000000003300000000000011451 xustar000000000000000027 mtime=1605787057.150476 ndcube-1.4.2/ndcube/data/README.rst0000644000175100001640000000036600000000000017113 0ustar00vstsdocker00000000000000Data directory ============== This directory contains data files included with the package source code distribution. Note that this is intended only for relatively small files - large files should be externally hosted and downloaded as needed. ././@PaxHeader0000000000000000000000000000003400000000000011452 xustar000000000000000028 mtime=1605787217.3871849 ndcube-1.4.2/ndcube/mixins/0000755000175100001640000000000000000000000016015 5ustar00vstsdocker00000000000000././@PaxHeader0000000000000000000000000000003300000000000011451 xustar000000000000000027 mtime=1605787057.150476 ndcube-1.4.2/ndcube/mixins/__init__.py0000644000175100001640000000012000000000000020117 0ustar00vstsdocker00000000000000from .ndslicing import NDCubeSlicingMixin from .plotting import NDCubePlotMixin ././@PaxHeader0000000000000000000000000000003300000000000011451 xustar000000000000000027 mtime=1605787057.150476 ndcube-1.4.2/ndcube/mixins/ndslicing.py0000644000175100001640000000577200000000000020354 0ustar00vstsdocker00000000000000import copy from astropy.nddata.mixins.ndslicing import NDSlicingMixin from ndcube import utils __all__ = ['NDCubeSlicingMixin'] class NDCubeSlicingMixin(NDSlicingMixin): # Inherit docstring from parent class __doc__ = NDSlicingMixin.__doc__ def _slice_wcs(self, item): """ Override parent class method so we disable the wcs slicing on `astropy.nddata.mixins.NDSlicingMixin`. """ return None def __getitem__(self, item): """ Override the parent class method to explicitly catch `None` indices. This method calls ``_slice`` and then constructs a new object using the kwargs returned by ``_slice``. """ if item is None or (isinstance(item, tuple) and None in item): raise IndexError("None indices not supported") return super().__getitem__(item) def _slice(self, item): """ Construct a set of keyword arguments to initialise a new (sliced) instance of the class. This method is called in `astropy.nddata.mixins.NDSlicingMixin.__getitem__`. This method extends the `~astropy.nddata.mixins.NDSlicingMixin` method to add support for ``missing_axes`` and ``extra_coords`` and overwrites the astropy handling of wcs slicing. 
""" kwargs = super()._slice(item) wcs, missing_axes = self._slice_wcs_missing_axes(item) kwargs['wcs'] = wcs kwargs['missing_axes'] = missing_axes kwargs['extra_coords'] = self._slice_extra_coords(item, missing_axes) return kwargs def _slice_wcs_missing_axes(self, item): # here missing axis is reversed as the item comes already in the reverse order # of the input return utils.wcs._wcs_slicer( self.wcs, copy.deepcopy(self.missing_axes[::-1]), item) def _slice_extra_coords(self, item, missing_axes): if self.extra_coords is None: new_extra_coords_dict = None else: old_extra_coords = self.extra_coords extra_coords_keys = list(old_extra_coords.keys()) new_extra_coords = copy.deepcopy(self._extra_coords_wcs_axis) for ck in extra_coords_keys: axis_ck = old_extra_coords[ck]["axis"] if isinstance(item, (slice, int)): if axis_ck == 0: new_extra_coords[ck]["value"] = new_extra_coords[ck]["value"][item] if isinstance(item, tuple): try: slice_item_extra_coords = item[axis_ck] new_extra_coords[ck]["value"] = \ new_extra_coords[ck]["value"][slice_item_extra_coords] except IndexError: pass except TypeError: pass new_extra_coords_dict = utils.cube.convert_extra_coords_dict_to_input_format( new_extra_coords, missing_axes) return new_extra_coords_dict ././@PaxHeader0000000000000000000000000000003300000000000011451 xustar000000000000000027 mtime=1605787057.150476 ndcube-1.4.2/ndcube/mixins/plotting.py0000644000175100001640000006720000000000000020234 0ustar00vstsdocker00000000000000import datetime from warnings import warn import numpy as np import matplotlib as mpl import matplotlib.pyplot as plt import astropy.units as u from astropy.visualization.wcsaxes import WCSAxes import sunpy.visualization.wcsaxes_compat as wcsaxes_compat try: from sunpy.visualization.animator import ImageAnimator, LineAnimator except ImportError: from sunpy.visualization.imageanimator import ImageAnimator, LineAnimator from ndcube import utils from ndcube.utils.cube import _get_extra_coord_edges from ndcube.mixins import sequence_plotting from ndcube.visualization.animator import ImageAnimatorWCS __all__ = ['NDCubePlotMixin'] INVALID_UNIT_SET_MESSAGE = "Can only set unit for axis if corresponding coordinates in " + \ "axes_coordinates are set to None, an astropy Quantity or the name of an extra coord that " + \ "is an astropy Quantity." class NDCubePlotMixin: """ Add plotting functionality to a NDCube class. """ def plot(self, axes=None, plot_axis_indices=None, axes_coordinates=None, axes_units=None, data_unit=None, **kwargs): """ Plots an interactive visualization of this cube with a slider controlling the wavelength axis for data having dimensions greater than 2. Plots an x-y graph onto the current axes for 2D or 1D data. Keyword arguments are passed on to matplotlib. Parameters other than data and wcs are passed to ImageAnimatorWCS, which in turn passes them to imshow for data greater than 2D. Parameters ---------- plot_axis_indices: `list` The two axes that make the image. Default=[-1,-2]. This implies cube instance -1 dimension will be x-axis and -2 dimension will be y-axis. axes: `astropy.visualization.wcsaxes.core.WCSAxes` or None: The axes to plot onto. If None the current axes will be used. axes_unit: `list` of `astropy.units.Unit` data_unit: `astropy.unit.Unit` The data is changed to the unit given or the cube.unit if not given, for 1D plots. axes_coordinates: list of physical coordinates for array or None If None array indices will be used for all axes. 
If a list it should contain one element for each axis of the numpy array. For the image axes a [min, max] pair should be specified which will be passed to :func:`matplotlib.pyplot.imshow` as extent. For the slider axes a [min, max] pair can be specified or an array the same length as the axis which will provide all values for that slider. If None is specified for an axis then the array indices will be used for that axis. The physical coordinates expected by axes_coordinates should be an array of pixel_edges. A str entry in axes_coordinates signifies that an extra_coord will be used for the axis's coordinates. The str must be a valid name of an extra_coord that corresponds to the same axis to which it is applied in the plot. """ # If old API is used, convert to new API. plot_axis_indices, axes_coordinates, axes_units, data_unit, kwargs = _support_101_plot_API( plot_axis_indices, axes_coordinates, axes_units, data_unit, kwargs) # Check kwargs are in consistent formats and set default values if not done so by user. naxis = len(self.dimensions) plot_axis_indices, axes_coordinates, axes_units = sequence_plotting._prep_axes_kwargs( naxis, plot_axis_indices, axes_coordinates, axes_units) if naxis == 1: ax = self._plot_1D_cube(axes, axes_coordinates, axes_units, data_unit, **kwargs) else: if len(plot_axis_indices) == 1: ax = self._animate_cube_1D( plot_axis_index=plot_axis_indices[0], axes_coordinates=axes_coordinates, axes_units=axes_units, data_unit=data_unit, **kwargs) else: if naxis == 2: ax = self._plot_2D_cube(axes, plot_axis_indices, axes_coordinates, axes_units, data_unit, **kwargs) else: ax = self._plot_3D_cube( plot_axis_indices=plot_axis_indices, axes_coordinates=axes_coordinates, axes_units=axes_units, **kwargs) return ax def _plot_1D_cube(self, axes=None, axes_coordinates=None, axes_units=None, data_unit=None, **kwargs): """ Plots a graph. Keyword arguments are passed on to matplotlib. Parameters ---------- data_unit: `astropy.unit.Unit` The data is changed to the unit given or the cube.unit if not given. """ # Derive x-axis coordinates and unit from inputs. x_axis_coordinates, unit_x_axis = sequence_plotting._derive_1D_coordinates_and_units( axes_coordinates, axes_units) if x_axis_coordinates is None: # Default is to derive x coords and defaul xlabel from WCS object. xname = self.world_axis_physical_types[0] xdata = self.axis_world_coords() elif isinstance(x_axis_coordinates, str): # User has entered a str as x coords, get that extra coord. xname = x_axis_coordinates xdata = self.extra_coords[x_axis_coordinates]["value"] else: # Else user must have set the x-values manually. xname = "" xdata = x_axis_coordinates # If a unit has been set for the x-axis, try to convert x coords to that unit. if isinstance(xdata, u.Quantity): if unit_x_axis is None: unit_x_axis = xdata.unit xdata = xdata.value else: xdata = xdata.to(unit_x_axis).value else: if unit_x_axis is not None: raise TypeError(INVALID_UNIT_SET_MESSAGE) # Define default x axis label. default_xlabel = f"{xname} [{unit_x_axis}]" # Combine data and uncertainty with mask. xdata = np.ma.masked_array(xdata, self.mask) # Derive y-axis coordinates, uncertainty and unit from the NDCube's data. 
if self.unit is None: if data_unit is not None: raise TypeError("Can only set y-axis unit if self.unit is set to a " "compatible unit.") else: ydata = self.data if self.uncertainty is None: yerror = None else: yerror = self.uncertainty.array else: if data_unit is None: data_unit = self.unit ydata = self.data if self.uncertainty is None: yerror = None else: yerror = self.uncertainty.array else: ydata = (self.data * self.unit).to(data_unit).value if self.uncertainty is None: yerror = None else: yerror = (self.uncertainty.array * self.unit).to(data_unit).value # Combine data and uncertainty with mask. ydata = np.ma.masked_array(ydata, self.mask) if yerror is not None: yerror = np.ma.masked_array(yerror, self.mask) # Create plot fig, ax = sequence_plotting._make_1D_sequence_plot(xdata, ydata, yerror, data_unit, default_xlabel, kwargs) return ax def _plot_2D_cube(self, axes=None, plot_axis_indices=None, axes_coordinates=None, axes_units=None, data_unit=None, **kwargs): """ Plots a 2D image onto the current axes. Keyword arguments are passed on to matplotlib. Parameters ---------- axes: `astropy.visualization.wcsaxes.core.WCSAxes` or `None`: The axes to plot onto. If None the current axes will be used. plot_axis_indices: `list`. The first axis in WCS object will become the first axis of plot_axis_indices and second axis in WCS object will become the second axis of plot_axis_indices. Default: ['x', 'y'] """ # Set default values of kwargs if not set. if axes_coordinates is None: axes_coordinates = [None, None] if axes_units is None: axes_units = [None, None] # Set which cube dimensions are on the x an y axes. axis_data = ['x', 'x'] axis_data[plot_axis_indices[1]] = 'y' axis_data = axis_data[::-1] # Determine data to be plotted if data_unit is None: data = self.data else: # If user set data_unit, convert dat to desired unit if self.unit set. if self.unit is None: raise TypeError("Can only set data_unit if NDCube.unit is set.") else: data = (self.data * self.unit).to(data_unit).value # Combine data with mask data = np.ma.masked_array(data, self.mask) try: axes_coord_check = axes_coordinates == [None, None] except Exception: axes_coord_check = False if axes_coord_check and (isinstance(axes, WCSAxes) or axes is None): if axes is None: # Build slice list for WCS for initializing WCSAxes object. if self.wcs.naxis != 2: slice_list = [] index = 0 for bool_ in self.missing_axes: if not bool_: slice_list.append(axis_data[index]) index += 1 else: slice_list.append(1) if index != 2: raise ValueError("Dimensions of WCS and data don't match") axes = wcsaxes_compat.gca_wcs(self.wcs, slices=tuple(slice_list)) else: axes = wcsaxes_compat.gca_wcs(self.wcs) # Plot data axes.imshow(data, **kwargs) # Set axis labels x_wcs_axis = utils.cube.data_axis_to_wcs_axis(plot_axis_indices[0], self.missing_axes) axes.coords[x_wcs_axis].set_axislabel("{} [{}]".format( self.world_axis_physical_types[plot_axis_indices[0]], self.wcs.wcs.cunit[x_wcs_axis])) y_wcs_axis = utils.cube.data_axis_to_wcs_axis(plot_axis_indices[1], self.missing_axes) axes.coords[y_wcs_axis].set_axislabel("{} [{}]".format( self.world_axis_physical_types[plot_axis_indices[1]], self.wcs.wcs.cunit[y_wcs_axis])) else: # Else manually set axes x and y values based on user's input for axes_coordinates. new_axes_coordinates, new_axis_units, default_labels = \ self._derive_axes_coordinates(axes_coordinates, axes_units, data.shape) # Initialize axes object and set values along axis. 
            if axes is None:
                axes = plt.gca()
            # Since we can't assume the x-axis will be uniform, create NonUniformImage
            # axes and add it to the axes object.
            if plot_axis_indices[0] < plot_axis_indices[1]:
                data = data.transpose()
            im_ax = mpl.image.NonUniformImage(
                axes, extent=(new_axes_coordinates[plot_axis_indices[0]][0],
                              new_axes_coordinates[plot_axis_indices[0]][-1],
                              new_axes_coordinates[plot_axis_indices[1]][0],
                              new_axes_coordinates[plot_axis_indices[1]][-1]),
                **kwargs)
            im_ax.set_data(new_axes_coordinates[plot_axis_indices[0]],
                           new_axes_coordinates[plot_axis_indices[1]], data)
            axes.add_image(im_ax)
            # Set the limits, labels, etc. of the axes.
            xlim = kwargs.pop("xlim", (new_axes_coordinates[plot_axis_indices[0]][0],
                                       new_axes_coordinates[plot_axis_indices[0]][-1]))
            axes.set_xlim(xlim)
            ylim = kwargs.pop("ylim", (new_axes_coordinates[plot_axis_indices[1]][0],
                                       new_axes_coordinates[plot_axis_indices[1]][-1]))
            axes.set_ylim(ylim)
            xlabel = kwargs.pop("xlabel", default_labels[plot_axis_indices[0]])
            ylabel = kwargs.pop("ylabel", default_labels[plot_axis_indices[1]])
            axes.set_xlabel(xlabel)
            axes.set_ylabel(ylabel)
        return axes

    def _plot_3D_cube(self, plot_axis_indices=None, axes_coordinates=None,
                      axes_units=None, data_unit=None, **kwargs):
        """
        Plots an interactive visualization of this cube, using sliders to move
        through the non-plot axes while two chosen axes are shown in the image.

        Parameters other than data and wcs are passed to ImageAnimatorWCS, which
        in turn passes them to imshow.

        Parameters
        ----------
        plot_axis_indices: `list`
            The two axes that make the image.
            Like [-1,-2] this implies cube instance -1 dimension
            will be x-axis and -2 dimension will be y-axis.

        axes_units: `list` of `astropy.units.Unit`

        axes_coordinates: `list` of physical coordinates for array or None
            If None array indices will be used for all axes.
            If a list it should contain one element for each axis of the numpy array.
            For the image axes a [min, max] pair should be specified which will be
            passed to :func:`matplotlib.pyplot.imshow` as extent.
            For the slider axes a [min, max] pair can be specified or an array the
            same length as the axis which will provide all values for that slider.
            If None is specified for an axis then the array indices will be used
            for that axis.
            The physical coordinates expected by axes_coordinates should be an array
            of pixel_edges.
            A str entry in axes_coordinates signifies that an extra_coord will be used
            for the axis's coordinates.
            The str must be a valid name of an extra_coord that corresponds to the
            same axis to which it is applied in the plot.
        """
        # For convenience in inserting dummy variables later, ensure
        # plot_axis_indices are all positive.
        plot_axis_indices = [i if i >= 0 else self.data.ndim + i
                             for i in plot_axis_indices]
        # If axes kwargs not set by user, set them as list of Nones for
        # each axis for consistent behaviour.
        if axes_coordinates is None:
            axes_coordinates = [None] * self.data.ndim
        if axes_units is None:
            axes_units = [None] * self.data.ndim
        # If data_unit set, convert data to that unit
        if data_unit is None:
            data = self.data
        else:
            data = (self.data * self.unit).to(data_unit).value
        # Combine data values with mask.
        data = np.ma.masked_array(data, self.mask)
        # If axes_coordinates not provided generate an ImageAnimatorWCS plot
        # using NDCube's wcs object.
new_axes_coordinates, new_axes_units, default_labels = \ self._derive_axes_coordinates(axes_coordinates, axes_units, data.shape, edges=True) if (axes_coordinates[plot_axis_indices[0]] is None and axes_coordinates[plot_axis_indices[1]] is None): # If there are missing axes in WCS object, add corresponding dummy axes to data. if data.ndim < self.wcs.naxis: new_shape = list(data.shape) for i in np.arange(self.wcs.naxis)[self.missing_axes[::-1]]: new_shape.insert(i, 1) # Also insert dummy coordinates and units. new_axes_coordinates.insert(i, None) new_axes_units.insert(i, None) # Iterate plot_axis_indices if neccessary for j, pai in enumerate(plot_axis_indices): if pai >= i: plot_axis_indices[j] = plot_axis_indices[j] + 1 # Reshape data data = data.reshape(new_shape) # Generate plot ax = ImageAnimatorWCS(data, wcs=self.wcs, image_axes=plot_axis_indices, unit_x_axis=new_axes_units[plot_axis_indices[0]], unit_y_axis=new_axes_units[plot_axis_indices[1]], axis_ranges=new_axes_coordinates, **kwargs) # Set the labels of the plot ax.axes.coords[0].set_axislabel( self.wcs.world_axis_physical_types[plot_axis_indices[0]]) ax.axes.coords[1].set_axislabel( self.wcs.world_axis_physical_types[plot_axis_indices[1]]) # If one of the plot axes is set manually, produce a basic ImageAnimator object. else: # If axis labels not set by user add to kwargs. ax = ImageAnimator(data, image_axes=plot_axis_indices, axis_ranges=new_axes_coordinates, **kwargs) # Add the labels of the plot ax.axes.set_xlabel(default_labels[plot_axis_indices[0]]) ax.axes.set_ylabel(default_labels[plot_axis_indices[1]]) return ax def _animate_cube_1D(self, plot_axis_index=-1, axes_coordinates=None, axes_units=None, data_unit=None, **kwargs): """ Animates an axis of a cube as a line plot with sliders for other axes. """ if axes_coordinates is None: axes_coordinates = [None] * self.data.ndim if axes_units is None: axes_units = [None] * self.data.ndim # Get real world axis values along axis to be plotted and enter into axes_ranges kwarg. if axes_coordinates[plot_axis_index] is None: xname = self.world_axis_physical_types[plot_axis_index] xdata = self.axis_world_coords(plot_axis_index, edges=True) elif isinstance(axes_coordinates[plot_axis_index], str): xname = axes_coordinates[plot_axis_index] xdata = _get_extra_coord_edges(self.extra_coords[xname]["value"]) else: xname = "" xdata = axes_coordinates[plot_axis_index] # Change x data to desired units it set by user. if isinstance(xdata, u.Quantity): if axes_units[plot_axis_index] is None: unit_x_axis = xdata.unit else: unit_x_axis = axes_units[plot_axis_index] xdata = xdata.to(unit_x_axis).value else: if axes_units[plot_axis_index] is not None: raise TypeError(INVALID_UNIT_SET_MESSAGE) else: unit_x_axis = None # Put xdata back into axes_coordinates as a masked array. if len(xdata.shape) > 1: # Since LineAnimator currently only accepts 1-D arrays for the x-axis, collapse xdata # to single dimension by taking mean along non-plotting axes. index = utils.wcs.get_dependent_data_axes(self.wcs, plot_axis_index, self.missing_axes) reduce_axis = np.where(index == np.array(plot_axis_index))[0] index = np.delete(index, reduce_axis) # Reduce the data by taking mean xdata = np.mean(xdata, axis=tuple(index)) axes_coordinates[plot_axis_index] = xdata # Set default x label default_xlabel = f"{xname} [{unit_x_axis}]" # Derive y axis data if data_unit is None: data = self.data data_unit = self.unit else: if self.unit is None: raise TypeError("NDCube.unit is None. 
Must be an astropy.units.unit or " "valid unit string in order to set data_unit.") else: data = (self.data * self.unit).to(data_unit).value # If min or max of data is inf or nan, set ylim of plot manually. ylim = kwargs.pop("ylim", None) if ylim is None and not np.isfinite([data.min(), data.max()]).all(): if self.mask is not None: good_data = data[np.invert(self.mask)] else: good_data = data ylim = (np.nanmin(good_data), np.nanmax(good_data)) # Combine data with mask # data = np.ma.masked_array(data, self.mask) # Set default y label default_ylabel = f"Data [{unit_x_axis}]" # Initiate line animator object. ax = LineAnimator(data, plot_axis_index=plot_axis_index, axis_ranges=axes_coordinates, xlabel=default_xlabel, ylabel=f"Data [{data_unit}]", ylim=ylim, **kwargs) return ax def _derive_axes_coordinates(self, axes_coordinates, axes_units, data_shape, edges=False): new_axes_coordinates = [] new_axes_units = [] default_labels = [] default_label_text = "" for i, axis_coordinate in enumerate(axes_coordinates): # If axis coordinate is None, derive axis values from WCS. if axis_coordinate is None: # If the new_axis_coordinate is not independent, i.e. dimension is >2D # and not equal to dimension of data, then the new_axis_coordinate must # be reduced to a 1D ndarray by taking the mean along all non-plotting axes. new_axis_coordinate = self.axis_world_coords(i, edges=edges) axis_label_text = self.world_axis_physical_types[i] # If the shape of the data is not 1, or all the axes are not dependent if new_axis_coordinate.ndim != 1 and new_axis_coordinate.ndim != len(data_shape): dependent_axes = utils.wcs.get_dependent_data_axes(self.wcs, i, self.missing_axes) reduce_axis = np.where(dependent_axes == np.array([i]))[0] index = np.delete(np.arange(len(dependent_axes)), reduce_axis) # Reduce the data by taking mean new_axis_coordinate = np.mean(new_axis_coordinate, axis=tuple(index)) elif isinstance(axis_coordinate, str): # If axis coordinate is a string, derive axis values from # corresponding extra coord. # Calculate edge value if required new_axis_coordinate = _get_extra_coord_edges( self.extra_coords[axis_coordinate]["value"]) if edges \ else self.extra_coords[axis_coordinate]["value"] axis_label_text = axis_coordinate else: # Else user must have manually set the axis coordinates. new_axis_coordinate = axis_coordinate axis_label_text = default_label_text # If axis coordinate is a Quantity, convert to unit supplied by user. if isinstance(new_axis_coordinate, u.Quantity): if axes_units[i] is None: new_axis_unit = new_axis_coordinate.unit new_axis_coordinate = new_axis_coordinate.value else: new_axis_unit = axes_units[i] new_axis_coordinate = new_axis_coordinate.to(new_axis_unit).value elif isinstance(new_axis_coordinate[0], datetime.datetime): axis_label_text = "{}/sec since {}".format( axis_label_text, new_axis_coordinate[0]) new_axis_coordinate = np.array([(t - new_axis_coordinate[0]).total_seconds() for t in new_axis_coordinate]) new_axis_unit = u.s else: if axes_units[i] is None: new_axis_unit = None else: raise TypeError(INVALID_UNIT_SET_MESSAGE) # Derive default axis label if isinstance(new_axis_coordinate, datetime.datetime): if axis_label_text == default_label_text: default_label = "{}".format(new_axis_coordinate.strftime("%Y/%m/%d %H:%M")) else: default_label = "{} [{}]".format( axis_label_text, new_axis_coordinate.strftime("%Y/%m/%d %H:%M")) else: default_label = f"{axis_label_text} [{new_axis_unit}]" # Append new coordinates, units and labels to output list. 
new_axes_coordinates.append(new_axis_coordinate) new_axes_units.append(new_axis_unit) default_labels.append(default_label) return new_axes_coordinates, new_axes_units, default_labels def _support_101_plot_API(plot_axis_indices, axes_coordinates, axes_units, data_unit, kwargs): """ Check if user has used old API and convert it to new API. """ # Get old API variable values. image_axes = kwargs.pop("image_axes", None) axis_ranges = kwargs.pop("axis_ranges", None) unit_x_axis = kwargs.pop("unit_x_axis", None) unit_y_axis = kwargs.pop("unit_y_axis", None) unit = kwargs.pop("unit", None) # Check if conflicting new and old API values have been set. # If not, set new API using old API and raise deprecation warning. if image_axes is not None: variable_names = ("image_axes", "plot_axis_indices") _raise_101_API_deprecation_warning(*variable_names) if plot_axis_indices is None: plot_axis_indices = image_axes else: _raise_API_error(*variable_names) if axis_ranges is not None: variable_names = ("axis_ranges", "axes_coordinates") _raise_101_API_deprecation_warning(*variable_names) if axes_coordinates is None: axes_coordinates = axis_ranges else: _raise_API_error(*variable_names) if (unit_x_axis is not None or unit_y_axis is not None) and axes_units is not None: _raise_API_error("unit_x_axis and/or unit_y_axis", "axes_units") if axes_units is None: variable_names = ("unit_x_axis and unit_y_axis", "axes_units") if unit_x_axis is not None: _raise_101_API_deprecation_warning(*variable_names) if len(plot_axis_indices) == 1: axes_units = unit_x_axis elif len(plot_axis_indices) == 2: if unit_y_axis is None: axes_units = [unit_x_axis, None] else: axes_units = [unit_x_axis, unit_y_axis] else: raise ValueError("Length of image_axes must be less than 3.") else: if unit_y_axis is not None: _raise_101_API_deprecation_warning(*variable_names) axes_units = [None, unit_y_axis] if unit is not None: variable_names = ("unit", "data_unit") _raise_101_API_deprecation_warning(*variable_names) if data_unit is None: data_unit = unit else: _raise_API_error(*variable_names) # Return values of new API return plot_axis_indices, axes_coordinates, axes_units, data_unit, kwargs def _raise_API_error(old_name, new_name): raise ValueError( "Conflicting inputs: {} (old API) cannot be set if {} (new API) is set".format( old_name, new_name)) def _raise_101_API_deprecation_warning(old_name, new_name): warn( ("{} is deprecated and will not be supported in version 2.0." " It will be replaced by {}. See docstring.").format( old_name, new_name), DeprecationWarning) ././@PaxHeader0000000000000000000000000000003300000000000011451 xustar000000000000000027 mtime=1605787057.150476 ndcube-1.4.2/ndcube/mixins/sequence_plotting.py0000644000175100001640000022042500000000000022124 0ustar00vstsdocker00000000000000import copy import numbers import numpy as np import matplotlib as mpl import matplotlib.pyplot as plt import astropy.units as u try: from sunpy.visualization.animator import LineAnimator except ImportError: from sunpy.visualization.imageanimator import LineAnimator from ndcube import utils from ndcube.utils.cube import _get_extra_coord_edges from ndcube.visualization.animator import ImageAnimatorWCS __all__ = ['NDCubeSequencePlotMixin'] NON_COMPATIBLE_UNIT_MESSAGE = \ "All sequence sub-cubes' unit attribute are not compatible with data_unit set by user." AXES_UNIT_ERRONESLY_SET_MESSAGE = \ "axes_units element must be None unless corresponding axes_coordinate is None or a Quantity." 
class NDCubeSequencePlotMixin: def plot(self, axes=None, plot_axis_indices=None, axes_coordinates=None, axes_units=None, data_unit=None, **kwargs): """ Visualizes data in the NDCubeSequence with the sequence axis as a separate dimension. Based on the dimensionality of the sequence and value of plot_axis_indices kwarg, a Line/Image Animation/Plot is produced. Parameters ---------- axes: `astropy.visualization.wcsaxes.core.WCSAxes` or `None`. The axes to plot onto. If None the current axes will be used. plot_axis_indices: `int` or iterable of one or two `int`. If two axis indices are given, the sequence is visualized as an image or 2D animation, assuming the sequence has at least 2 dimensions. The dimension indicated by the 0th index is displayed on the x-axis while the dimension indicated by the 1st index is displayed on the y-axis. If only one axis index is given (either as an int or a list of one int), then a 1D line animation is produced with the indicated dimension on the x-axis and other dimensions represented by animations sliders. Default=[-1, -2]. If sequence only has one dimension, plot_axis_indices is ignored and a static 1D line plot is produced. axes_coordinates: `None` or `list` of `None` `astropy.units.Quantity` `numpy.ndarray` `str` Denotes physical coordinates for plot and slider axes. If None coordinates derived from the WCS objects will be used for all axes. If a list, its length should equal either the number sequence dimensions or the length of plot_axis_indices. If the length equals the number of sequence dimensions, each element describes the coordinates of the corresponding sequence dimension. If the length equals the length of plot_axis_indices, the 0th entry describes the coordinates of the x-axis while (if length is 2) the 1st entry describes the coordinates of the y-axis. Slider axes are implicitly set to None. If the number of sequence dimensions equals the length of plot_axis_indices, the latter convention takes precedence. The value of each entry should be either `None` (implies derive the coordinates from the WCS objects), an `astropy.units.Quantity` or a `numpy.ndarray` of coordinates for each pixel, or a `str` denoting a valid extra coordinate. The physical coordinates expected by axes_coordinates should be an array of pixel_edges. A str entry in axes_coordinates signifies that an extra_coord will be used for the axis's coordinates. The str must be a valid name of an extra_coord that corresponds to the same axis to which it is applied in the plot. axes_units: `None` or `list` of `None`, `astropy.units.Unit` and/or `str` If None units derived from the WCS objects will be used for all axes. If a list, its length should equal either the number sequence dimensions or the length of plot_axis_indices. If the length equals the number of sequence dimensions, each element gives the unit in which the coordinates along the corresponding sequence dimension should displayed whether they be a plot axes or a slider axes. If the length equals the length of plot_axis_indices, the 0th entry describes the unit in which the x-axis coordinates should be displayed while (if length is 2) the 1st entry describes the unit in which the y-axis should be displayed. Slider axes are implicitly set to None. If the number of sequence dimensions equals the length of plot_axis_indices, the latter convention takes precedence. The value of each entry should be either `None` (implies derive the unit from the WCS object of the 0th sub-cube), `astropy.units.Unit` or a valid unit `str`. 
data_unit: `astropy.unit.Unit` or valid unit `str` or None Unit in which data be displayed. If the length of plot_axis_indices is 2, a 2D image/animation is produced and data_unit determines the unit represented by the color table. If the length of plot_axis_indices is 1, a 1D plot/animation is produced and data_unit determines the unit in which the y-axis is displayed. Returns ------- ax: `matplotlib.axes.Axes`, `ndcube.mixins.sequence_plotting.ImageAnimatorNDCubeSequence` or `ndcube.mixins.sequence_plotting.ImageAnimatorCubeLikeNDCubeSequence` Axes or animation object depending on dimensionality of NDCubeSequence """ # Check kwargs are in consistent formats and set default values if not done so by user. naxis = len(self.dimensions) plot_axis_indices, axes_coordinates, axes_units = _prep_axes_kwargs( naxis, plot_axis_indices, axes_coordinates, axes_units) if naxis == 1: # Make 1D line plot. ax = self._plot_1D_sequence(axes_coordinates, axes_units, data_unit, **kwargs) else: if len(plot_axis_indices) == 1: # Since sequence has more than 1 dimension and number of plot axes is 1, # produce a 1D line animation. if axes_units is not None: unit_x_axis = axes_units[plot_axis_indices[0]] else: unit_x_axis = None ax = LineAnimatorNDCubeSequence(self, plot_axis_indices[0], axes_coordinates, unit_x_axis, data_unit, **kwargs) elif len(plot_axis_indices) == 2: if naxis == 2: # Since sequence has 2 dimensions and number of plot axes is 2, # produce a 2D image. ax = self._plot_2D_sequence(plot_axis_indices, axes_coordinates, axes_units, data_unit, **kwargs) else: # Since sequence has more than 2 dimensions and number of plot axes is 2, # produce a 2D animation. ax = ImageAnimatorNDCubeSequence( self, plot_axis_indices=plot_axis_indices, axes_coordinates=axes_coordinates, axes_units=axes_units, **kwargs) return ax def plot_as_cube(self, axes=None, plot_axis_indices=None, axes_coordinates=None, axes_units=None, data_unit=None, **kwargs): """ Visualizes data in the NDCubeSequence with the sequence axis folded into the common axis. Based on the cube-like dimensionality of the sequence and value of plot_axis_indices kwarg, a Line/Image Plot/Animation is produced. Parameters ---------- axes: `astropy.visualization.wcsaxes.core.WCSAxes` or None. The axes to plot onto. If None the current axes will be used. plot_axis_indices: `int` or iterable of one or two `int`. If two axis indices are given, the sequence is visualized as an image or 2D animation, assuming the sequence has at least 2 cube-like dimensions. The cube-like dimension indicated by the 0th index is displayed on the x-axis while the cube-like dimension indicated by the 1st index is displayed on the y-axis. If only one axis index is given (either as an int or a list of one int), then a 1D line animation is produced with the indicated cube-like dimension on the x-axis and other cube-like dimensions represented by animations sliders. Default=[-1, -2]. If sequence only has one cube-like dimension, plot_axis_indices is ignored and a static 1D line plot is produced. axes_coordinates: None or `list` of None, `astropy.units.Quantity`, `numpy.ndarray`, `str` Denotes physical coordinates for plot and slider axes. If None coordinates derived from the WCS objects will be used for all axes. If a list, its length should equal either the number cube-like dimensions or the length of plot_axis_indices. If the length equals the number of cube-like dimensions, each element describes the coordinates of the corresponding cube-like dimension. 
If the length equals the length of plot_axis_indices, the 0th entry describes the coordinates of the x-axis while (if length is 2) the 1st entry describes the coordinates of the y-axis. Slider axes are implicitly set to None. If the number of cube-like dimensions equals the length of plot_axis_indices, the latter convention takes precedence. The value of each entry should be either None (implies derive the coordinates from the WCS objects), an `astropy.units.Quantity` or a `numpy.ndarray` of coordinates for each pixel, or a `str` denoting a valid extra coordinate. The physical coordinates expected by axes_coordinates should be an array of pixel_edges. A str entry in axes_coordinates signifies that an extra_coord will be used for the axis's coordinates. The str must be a valid name of an extra_coord that corresponds to the same axis to which it is applied in the plot. axes_units: None or `list` of None, `astropy.units.Unit` and/or `str` If None units derived from the WCS objects will be used for all axes. If a list, its length should equal either the number cube-like dimensions or the length of plot_axis_indices. If the length equals the number of cube-like dimensions, each element gives the unit in which the coordinates along the corresponding cube-like dimension should displayed whether they be a plot axes or a slider axes. If the length equals the length of plot_axis_indices, the 0th entry describes the unit in which the x-axis coordinates should be displayed while (if length is 2) the 1st entry describes the unit in which the y-axis should be displayed. Slider axes are implicitly set to None. If the number of cube-like dimensions equals the length of plot_axis_indices, the latter convention takes precedence. The value of each entry should be either None (implies derive the unit from the WCS object of the 0th sub-cube), `astropy.units.Unit` or a valid unit `str`. data_unit: `astropy.unit.Unit` or valid unit `str` or None Unit in which data be displayed. If the length of plot_axis_indices is 2, a 2D image/animation is produced and data_unit determines the unit represented by the color table. If the length of plot_axis_indices is 1, a 1D plot/animation is produced and data_unit determines the unit in which the y-axis is displayed. Returns ------- ax: `matplotlib.axes.Axes`, `ndcube.mixins.sequence_plotting.ImageAnimatorNDCubeSequence` or `ndcube.mixins.sequence_plotting.ImageAnimatorCubeLikeNDCubeSequence` Axes or animation object depending on dimensionality of NDCubeSequence """ # Verify common axis is set. if self._common_axis is None: raise TypeError("Common axis must be set.") # Check kwargs are in consistent formats and set default values if not done so by user. naxis = len(self.cube_like_dimensions) plot_axis_indices, axes_coordinates, axes_units = _prep_axes_kwargs( naxis, plot_axis_indices, axes_coordinates, axes_units) # Produce plot/image/animation based on cube-like dimensions of sequence. if naxis == 1: # Since sequence has 1 cube-like dimension, produce a 1D line plot. ax = self._plot_2D_sequence_as_1Dline(axes_coordinates, axes_units, data_unit, **kwargs) else: if len(plot_axis_indices) == 1: # Since sequence has more than 1 cube-like dimension and # number of plot axes is 1, produce a 1D line animation. 
if axes_units is not None: unit_x_axis = axes_units[plot_axis_indices[0]] else: unit_x_axis = None ax = LineAnimatorCubeLikeNDCubeSequence(self, plot_axis_indices[0], axes_coordinates, unit_x_axis, data_unit=data_unit, **kwargs) elif len(plot_axis_indices) == 2: if naxis == 2: # Since sequence has 2 cube-like dimensions and # number of plot axes is 2, produce a 2D image. ax = self._plot_3D_sequence_as_2Dimage(axes, plot_axis_indices, axes_coordinates, axes_units, data_unit, **kwargs) else: # Since sequence has more than 2 cube-like dimensions and # number of plot axes is 2, produce a 2D animation. ax = ImageAnimatorCubeLikeNDCubeSequence( self, plot_axis_indices=plot_axis_indices, axes_coordinates=axes_coordinates, axes_units=axes_units, **kwargs) return ax def _plot_1D_sequence(self, axes_coordinates=None, axes_units=None, data_unit=None, **kwargs): """ Visualizes an NDCubeSequence of scalar NDCubes as a line plot. A scalar NDCube is one whose NDCube.data is a scalar rather than an array. Parameters ---------- axes_coordinates: `numpy.ndarray` `astropy.unit.Quantity` `str` `None` or length 1 `list` Denotes the physical coordinates of the x-axis. If list, must be of length 1 containing one object of one of the other allowed types. If None, coordinates are derived from the WCS objects. If an `astropy.units.Quantity` or a `numpy.ndarray` gives the coordinates for each pixel along the x-axis. If a `str`, denotes the extra coordinate to be used. The extra coordinate must correspond to the sequence axis. The physical coordinates expected by axes_coordinates should be an array of pixel_edges. A str entry in axes_coordinates signifies that an extra_coord will be used for the axis's coordinates. The str must be a valid name of an extra_coord that corresponds to the same axis to which it is applied in the plot. axes_units: `astropy.unit.Unit` or valid unit `str` or length 1 `list` of those types. Unit in which X-axis should be displayed. Must be compatible with the unit of the coordinate denoted by x_axis_range. Not used if x_axis_range is a `numpy.ndarray` or the designated extra coordinate is a `numpy.ndarray` data_unit: `astropy.units.unit` or valid unit `str` The units into which the y-axis should be displayed. The unit attribute of all the sub-cubes must be compatible to set this kwarg. """ # Derive x-axis coordinates and unit from inputs. x_axis_coordinates, unit_x_axis = _derive_1D_coordinates_and_units(axes_coordinates, axes_units) # Check that the unit attribute is a set in all cubes and derive unit_y_axis if not set. unit_y_axis = data_unit sequence_units, unit_y_axis = _determine_sequence_units(self.data, unit_y_axis) # If not all cubes have their unit set, create a data array from cube's data. if sequence_units is None: ydata = np.array([cube.data for cube in self.data]) else: # If all cubes have unit set, create a data quantity from cubes' data. ydata = u.Quantity([cube.data * sequence_units[i] for i, cube in enumerate(self.data)], unit=unit_y_axis).value # Determine uncertainties. sequence_uncertainty_nones = [] for i, cube in enumerate(self.data): if cube.uncertainty is None: sequence_uncertainty_nones.append(i) if sequence_uncertainty_nones == list(range(len(self.data))): # If all cube uncertainties are None, make yerror also None. yerror = None else: # Else determine uncertainties, giving 0 uncertainty for # cubes with uncertainty of None. if sequence_units is None: yerror = np.array([cube.uncertainty.array for cube in self.data]) yerror[sequence_uncertainty_nones] = 0. 
            else:
                # If all cubes have compatible units, ensure uncertainties are in the same unit.
                yerror = []
                for i, cube in enumerate(self.data):
                    if i in sequence_uncertainty_nones:
                        yerror.append(0. * sequence_units[i])
                    else:
                        yerror.append(cube.uncertainty.array * sequence_units[i])
                yerror = u.Quantity(yerror, unit=unit_y_axis).value
        # Define x-axis data.
        if x_axis_coordinates is None:
            # Since scalar NDCubes have no array/pixel indices, WCS translations don't work.
            # Therefore x-axis values will be unitless sequence indices unless supplied by user
            # or an extra coordinate is designated.
            xdata = np.arange(int(self.dimensions[0].value))
            xname = self.world_axis_physical_types[0]
        elif isinstance(x_axis_coordinates, str):
            xdata = self.sequence_axis_extra_coords[x_axis_coordinates]
            xname = x_axis_coordinates
        else:
            xdata = x_axis_coordinates
            xname = self.world_axis_physical_types[0]
        if isinstance(xdata, u.Quantity):
            if unit_x_axis is None:
                unit_x_axis = xdata.unit
            else:
                xdata = xdata.to(unit_x_axis)
        else:
            unit_x_axis = None
        default_xlabel = f"{xname} [{unit_x_axis}]"
        fig, ax = _make_1D_sequence_plot(xdata, ydata, yerror, unit_y_axis,
                                         default_xlabel, kwargs)
        return ax
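
    # A minimal, self-contained sketch of the uncertainty handling above, for a
    # hypothetical three-cube sequence whose middle cube lacks an uncertainty
    # (names and values are illustrative, not part of this module):
    #
    #     >>> import numpy as np
    #     >>> uncertainties = [0.1, None, 0.3]
    #     >>> np.array([0. if unc is None else unc for unc in uncertainties])
    #     array([0.1, 0. , 0.3])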
    def _plot_2D_sequence_as_1Dline(self, axes_coordinates=None,
                                    axes_units=None, data_unit=None, **kwargs):
        """
        Visualizes an NDCubeSequence of 1D NDCubes with a common axis as a line plot.

        Called if plot_as_cube=True.

        Parameters same as _plot_1D_sequence
        """
        # Derive x-axis coordinates and unit from inputs.
        x_axis_coordinates, unit_x_axis = _derive_1D_coordinates_and_units(axes_coordinates,
                                                                           axes_units)
        # Check that the unit attribute is set for all cubes and derive unit_y_axis if not set.
        unit_y_axis = data_unit
        sequence_units, unit_y_axis = _determine_sequence_units(self.data, unit_y_axis)
        # If not all cubes have their unit set, concatenate the cubes' data into a bare array.
        if sequence_units is None:
            ydata = np.concatenate([cube.data for cube in self.data])
        else:
            # If all cubes have unit set, create a data quantity from cubes' data.
            ydata = np.concatenate([(cube.data * sequence_units[i]).to(unit_y_axis).value
                                    for i, cube in enumerate(self.data)])
        # Determine uncertainties.
        # Check which cubes don't have uncertainties.
        sequence_uncertainty_nones = []
        for i, cube in enumerate(self.data):
            if cube.uncertainty is None:
                sequence_uncertainty_nones.append(i)
        if sequence_uncertainty_nones == list(range(len(self.data))):
            # If no sub-cubes have uncertainty, set overall yerror to None.
            yerror = None
        else:
            # Else determine uncertainties, giving 0 uncertainty for
            # cubes with uncertainty of None.
            yerror = []
            if sequence_units is None:
                for i, cube in enumerate(self.data):
                    if i in sequence_uncertainty_nones:
                        yerror.append(np.zeros(cube.data.shape))
                    else:
                        yerror.append(cube.uncertainty.array)
            else:
                for i, cube in enumerate(self.data):
                    if i in sequence_uncertainty_nones:
                        yerror.append((np.zeros(cube.data.shape) * sequence_units[i]).to(
                            unit_y_axis).value)
                    else:
                        yerror.append((cube.uncertainty.array * sequence_units[i]).to(
                            unit_y_axis).value)
            yerror = np.concatenate(yerror)
        # Define x-axis data.
        if x_axis_coordinates is None:
            if unit_x_axis is None:
                unit_x_axis = np.asarray(self[0].wcs.wcs.cunit)[
                    np.invert(self[0].missing_axes)][0]
            xdata = u.Quantity(np.concatenate([cube.axis_world_coords().to(unit_x_axis).value
                                               for cube in self.data]), unit=unit_x_axis)
            xname = self.cube_like_world_axis_physical_types[0]
        elif isinstance(x_axis_coordinates, str):
            xdata = self.common_axis_extra_coords[x_axis_coordinates]
            xname = x_axis_coordinates
        else:
            xdata = x_axis_coordinates
            xname = ""
        if isinstance(xdata, u.Quantity):
            if unit_x_axis is None:
                unit_x_axis = xdata.unit
            else:
                xdata = xdata.to(unit_x_axis)
        else:
            unit_x_axis = None
        default_xlabel = f"{xname} [{unit_x_axis}]"
        # For consistency, make xdata an array if a Quantity. This is deferred until
        # now because xdata's unit is needed in the lines above.
        if isinstance(xdata, u.Quantity):
            xdata = xdata.value
        # Plot data.
        fig, ax = _make_1D_sequence_plot(xdata, ydata, yerror, unit_y_axis,
                                         default_xlabel, kwargs)
        return ax

    def _plot_2D_sequence(self, plot_axis_indices=None, axes_coordinates=None,
                          axes_units=None, data_unit=None, **kwargs):
        """
        Visualizes an NDCubeSequence of 1D NDCubes as a 2D image.

        **kwargs are fed into matplotlib.image.NonUniformImage.

        Parameters same as self.plot()
        """
        # Set default values of kwargs if not set.
        if axes_coordinates is None:
            axes_coordinates = [None, None]
        if axes_units is None:
            axes_units = [None, None]
        # Convert plot_axis_indices to array for function operations.
        plot_axis_indices = np.asarray(plot_axis_indices)
        # Check that the unit attribute is set for all cubes and derive data_unit if not set.
        sequence_units, data_unit = _determine_sequence_units(self.data, data_unit)
        # If all cubes have unit set, create a data quantity from cubes' data.
        if sequence_units is not None:
            data = np.stack([(cube.data * sequence_units[i]).to(data_unit).value
                             for i, cube in enumerate(self.data)])
        else:
            data = np.stack([cube.data for i, cube in enumerate(self.data)])
        if plot_axis_indices[0] < plot_axis_indices[1]:
            # Transpose data if user-defined plot_axis_indices require it.
            data = data.transpose()
        # Determine index of above axes variables corresponding to sequence and cube axes.
        # Since the axes variables have been re-oriented before this function was called
        # so the 0th element corresponds to the sequence axis, and the 1st to the cube axis,
        # determining this is trivial.
        sequence_axis_index = 0
        cube_axis_index = 1
        # Derive the coordinates, unit, and default label of the cube axis.
cube_axis_unit = axes_units[cube_axis_index] if axes_coordinates[cube_axis_index] is None: if cube_axis_unit is None: cube_axis_unit = np.array(self[0].wcs.wcs.cunit)[ np.invert(self[0].missing_axes)][0] cube_axis_coords = self[0].axis_world_coords().to(cube_axis_unit).value cube_axis_name = self.world_axis_physical_types[1] else: if isinstance(axes_coordinates[cube_axis_index], str): cube_axis_coords = \ self[0].extra_coords[axes_coordinates[cube_axis_index]]["value"] cube_axis_name = axes_coordinates[cube_axis_index] else: cube_axis_coords = axes_coordinates[cube_axis_index] cube_axis_name = "" if isinstance(cube_axis_coords, u.Quantity): if cube_axis_unit is None: cube_axis_unit = cube_axis_coords.unit cube_axis_coords = cube_axis_coords.value else: cube_axis_coords = cube_axis_coords.to(cube_axis_unit).value else: if cube_axis_unit is not None: raise ValueError(AXES_UNIT_ERRONESLY_SET_MESSAGE) default_cube_axis_label = f"{cube_axis_name} [{cube_axis_unit}]" axes_coordinates[cube_axis_index] = cube_axis_coords axes_units[cube_axis_index] = cube_axis_unit # Derive the coordinates, unit, and default label of the sequence axis. sequence_axis_unit = axes_units[sequence_axis_index] if axes_coordinates[sequence_axis_index] is None: sequence_axis_coords = np.arange(len(self.data)) sequence_axis_name = self.world_axis_physical_types[0] elif isinstance(axes_coordinates[sequence_axis_index], str): sequence_axis_coords = \ self.sequence_axis_extra_coords[axes_coordinates[sequence_axis_index]] sequence_axis_name = axes_coordinates[sequence_axis_index] else: sequence_axis_coords = axes_coordinates[sequence_axis_index] sequence_axis_name = self.world_axis_physical_types[0] if isinstance(sequence_axis_coords, u.Quantity): if sequence_axis_unit is None: sequence_axis_unit = sequence_axis_coords.unit sequence_axis_coords = sequence_axis_coords.value else: sequence_axis_coords = sequence_axis_coords.to(sequence_axis_unit).value else: if sequence_axis_unit is not None: raise ValueError(AXES_UNIT_ERRONESLY_SET_MESSAGE) default_sequence_axis_label = f"{sequence_axis_name} [{sequence_axis_unit}]" axes_coordinates[sequence_axis_index] = sequence_axis_coords axes_units[sequence_axis_index] = sequence_axis_unit axes_labels = [None, None] axes_labels[cube_axis_index] = default_cube_axis_label axes_labels[sequence_axis_index] = default_sequence_axis_label # Plot image. # Create figure and axes objects. fig, ax = plt.subplots(1, 1) # Since we can't assume the x-axis will be uniform, create NonUniformImage # axes and add it to the axes object. im_ax = mpl.image.NonUniformImage(ax, extent=(axes_coordinates[plot_axis_indices[0]][0], axes_coordinates[plot_axis_indices[0]][-1], axes_coordinates[plot_axis_indices[1]][0], axes_coordinates[plot_axis_indices[1]][-1]), **kwargs) im_ax.set_data(axes_coordinates[plot_axis_indices[0]], axes_coordinates[plot_axis_indices[1]], data) ax.add_image(im_ax) # Set the limits, labels, etc. of the axes. ax.set_xlim((axes_coordinates[plot_axis_indices[0]][0], axes_coordinates[plot_axis_indices[0]][-1])) ax.set_ylim((axes_coordinates[plot_axis_indices[1]][0], axes_coordinates[plot_axis_indices[1]][-1])) ax.set_xlabel(axes_labels[plot_axis_indices[0]]) ax.set_ylabel(axes_labels[plot_axis_indices[1]]) return ax def _plot_3D_sequence_as_2Dimage(self, axes=None, plot_axis_indices=None, axes_coordinates=None, axes_units=None, data_unit=None, **kwargs): """ Visualizes an NDCubeSequence of 2D NDCubes with a common axis as a 2D image. Called if plot_as_cube=True. 
""" # Set default values of kwargs if not set. if axes_coordinates is None: axes_coordinates = [None, None] if axes_units is None: axes_units = [None, None] # Convert plot_axis_indices to array for function operations. plot_axis_indices = np.asarray(plot_axis_indices) # Check that the unit attribute is set of all cubes and derive unit_y_axis if not set. sequence_units, data_unit = _determine_sequence_units(self.data, data_unit) # If all cubes have unit set, create a data quantity from cube's data. if sequence_units is not None: data = np.concatenate([(cube.data * sequence_units[i]).to(data_unit).value for i, cube in enumerate(self.data)], axis=self._common_axis) else: data = np.concatenate([cube.data for cube in self.data], axis=self._common_axis) if plot_axis_indices[0] < plot_axis_indices[1]: data = data.transpose() # Determine index of common axis and other cube axis. common_axis_index = self._common_axis cube_axis_index = [0, 1] cube_axis_index.pop(common_axis_index) cube_axis_index = cube_axis_index[0] # Derive the coordinates, unit, and default label of the cube axis. cube_axis_unit = axes_units[cube_axis_index] if axes_coordinates[cube_axis_index] is None: if cube_axis_unit is None: cube_axis_unit = np.array(self[0].wcs.wcs.cunit)[ np.invert(self[0].missing_axes)][0] cube_axis_coords = \ self[0].axis_world_coords()[cube_axis_index].to(cube_axis_unit).value cube_axis_name = self.cube_like_world_axis_physical_types[1] else: if isinstance(axes_coordinates[cube_axis_index], str): cube_axis_coords = \ self[0].extra_coords[axes_coordinates[cube_axis_index]]["value"] cube_axis_name = axes_coordinates[cube_axis_index] else: cube_axis_coords = axes_coordinates[cube_axis_index] cube_axis_name = "" if isinstance(cube_axis_coords, u.Quantity): if cube_axis_unit is None: cube_axis_unit = cube_axis_coords.unit cube_axis_coords = cube_axis_coords.value else: cube_axis_coords = cube_axis_coords.to(cube_axis_unit).value else: if cube_axis_unit is not None: raise ValueError(AXES_UNIT_ERRONESLY_SET_MESSAGE) default_cube_axis_label = f"{cube_axis_name} [{cube_axis_unit}]" axes_coordinates[cube_axis_index] = cube_axis_coords axes_units[cube_axis_index] = cube_axis_unit # Derive the coordinates, unit, and default label of the common axis. common_axis_unit = axes_units[common_axis_index] if axes_coordinates[common_axis_index] is None: # Concatenate values along common axis for each cube. 
if common_axis_unit is None: wcs_common_axis_index = utils.cube.data_axis_to_wcs_axis( common_axis_index, self[0].missing_axes) common_axis_unit = np.array(self[0].wcs.wcs.cunit)[wcs_common_axis_index] common_axis_coords = u.Quantity(np.concatenate( [cube.axis_world_coords()[common_axis_index].to(common_axis_unit).value for cube in self.data]), unit=common_axis_unit) common_axis_name = self.cube_like_world_axis_physical_types[common_axis_index] elif isinstance(axes_coordinates[common_axis_index], str): common_axis_coords = \ self.common_axis_extra_coords[axes_coordinates[common_axis_index]] common_axis_name = axes_coordinates[common_axis_index] else: common_axis_coords = axes_coordinates[common_axis_index] common_axis_name = "" if isinstance(common_axis_coords, u.Quantity): if common_axis_unit is None: common_axis_unit = common_axis_coords.unit common_axis_coords = common_axis_coords.value else: common_axis_coords = common_axis_coords.to(common_axis_unit).value else: if common_axis_unit is not None: raise ValueError(AXES_UNIT_ERRONESLY_SET_MESSAGE) default_common_axis_label = f"{common_axis_name} [{common_axis_unit}]" axes_coordinates[common_axis_index] = common_axis_coords axes_units[common_axis_index] = common_axis_unit axes_labels = [None, None] axes_labels[cube_axis_index] = default_cube_axis_label axes_labels[common_axis_index] = default_common_axis_label # Plot image. # Create figure and axes objects. fig, ax = plt.subplots(1, 1) # Since we can't assume the x-axis will be uniform, create NonUniformImage # axes and add it to the axes object. im_ax = mpl.image.NonUniformImage( ax, extent=(axes_coordinates[plot_axis_indices[0]][0], axes_coordinates[plot_axis_indices[0]][-1], axes_coordinates[plot_axis_indices[1]][0], axes_coordinates[plot_axis_indices[1]][-1]), **kwargs) im_ax.set_data(axes_coordinates[plot_axis_indices[0]], axes_coordinates[plot_axis_indices[1]], data) ax.add_image(im_ax) # Set the limits, labels, etc. of the axes. ax.set_xlim((axes_coordinates[plot_axis_indices[0]][0], axes_coordinates[plot_axis_indices[0]][-1])) ax.set_ylim((axes_coordinates[plot_axis_indices[1]][0], axes_coordinates[plot_axis_indices[1]][-1])) ax.set_xlabel(axes_labels[plot_axis_indices[0]]) ax.set_ylabel(axes_labels[plot_axis_indices[1]]) return ax class ImageAnimatorNDCubeSequence(ImageAnimatorWCS): """ Animates N-dimensional data with the associated astropy WCS object. The following keyboard shortcuts are defined in the viewer: left': previous step on active slider right': next step on active slider top': change the active slider up one bottom': change the active slider down one 'p': play/pause active slider This viewer can have user defined buttons added by specifying the labels and functions called when those buttons are clicked as keyword arguments. Parameters ---------- seq: `ndcube.NDCubeSequence` The list of cubes. image_axes: `list` The two axes that make the image fig: `matplotlib.figure.Figure` Figure to use axis_ranges: list of physical coordinates for array or None If None array indices will be used for all axes. If a list it should contain one element for each axis of the numpy array. For the image axes a [min, max] pair should be specified which will be passed to :func:`matplotlib.pyplot.imshow` as extent. For the slider axes a [min, max] pair can be specified or an array the same length as the axis which will provide all values for that slider. If None is specified for an axis then the array indices will be used for that axis. 
The physical coordinates expected by axis_ranges should be an array of pixel_edges. interval: `int` Animation interval in ms colorbar: `bool` Plot colorbar button_labels: `list` List of strings to label buttons button_func: `list` List of functions to map to the buttons unit_x_axis: `astropy.units.Unit` The unit of x axis. unit_y_axis: `astropy.units.Unit` The unit of y axis. Extra keywords are passed to imshow. """ def __init__(self, seq, wcs=None, axes=None, plot_axis_indices=None, axes_coordinates=None, axes_units=None, data_unit=None, **kwargs): self.sequence = seq.data # Required by parent class. # Set default values of kwargs if not set. if wcs is None: wcs = seq[0].wcs if axes_coordinates is None: axes_coordinates = [None] * len(seq.dimensions) if axes_units is None: axes_units = [None] * len(seq.dimensions) # Determine units of each cube in sequence. sequence_units, data_unit = _determine_sequence_units(seq.data, data_unit) # If all cubes have unit set, create a data quantity from cube's data. if sequence_units is None: data_stack = np.stack([cube.data for i, cube in enumerate(seq.data)]) else: data_stack = np.stack([(cube.data * sequence_units[i]).to(data_unit).value for i, cube in enumerate(seq.data)]) self.cumul_cube_lengths = np.cumsum(np.ones(len(seq.data))) # Add dimensions of length 1 of concatenated data array # shape for an missing axes. if seq[0].wcs.naxis != len(seq.dimensions) - 1: new_shape = list(data_stack.shape) for i in np.arange(seq[0].wcs.naxis)[seq[0].missing_axes[::-1]]: new_shape.insert(i + 1, 1) # Also insert dummy coordinates and units. axes_coordinates.insert(i + 1, None) axes_units.insert(i + 1, None) data_stack = data_stack.reshape(new_shape) # Add dummy axis to WCS object to represent sequence axis. new_wcs = utils.wcs.append_sequence_axis_to_wcs(wcs) super().__init__( data_stack, wcs=new_wcs, image_axes=plot_axis_indices, axis_ranges=axes_coordinates, unit_x_axis=axes_units[plot_axis_indices[0]], unit_y_axis=axes_units[plot_axis_indices[1]], **kwargs) class ImageAnimatorCubeLikeNDCubeSequence(ImageAnimatorWCS): """ Animates N-dimensional data with the associated astropy WCS object. The following keyboard shortcuts are defined in the viewer: left': previous step on active slider right': next step on active slider top': change the active slider up one bottom': change the active slider down one 'p': play/pause active slider This viewer can have user defined buttons added by specifying the labels and functions called when those buttons are clicked as keyword arguments. Parameters ---------- seq: `ndcube.datacube.CubeSequence` The list of cubes. image_axes: `list` The two axes that make the image fig: `matplotlib.figure.Figure` Figure to use axis_ranges: list of physical coordinates for array or None If None array indices will be used for all axes. If a list it should contain one element for each axis of the numpy array. For the image axes a [min, max] pair should be specified which will be passed to :func:`matplotlib.pyplot.imshow` as extent. For the slider axes a [min, max] pair can be specified or an array the same length as the axis which will provide all values for that slider. If None is specified for an axis then the array indices will be used for that axis. The physical coordinates expected by axis_ranges should be an array of pixel_edges. 
interval: `int` Animation interval in ms colorbar: `bool` Plot colorbar button_labels: `list` List of strings to label buttons button_func: `list` List of functions to map to the buttons unit_x_axis: `astropy.units.Unit` The unit of x axis. unit_y_axis: `astropy.units.Unit` The unit of y axis. Extra keywords are passed to imshow. """ def __init__(self, seq, wcs=None, axes=None, plot_axis_indices=None, axes_coordinates=None, axes_units=None, data_unit=None, **kwargs): if seq._common_axis is None: raise TypeError("Common axis must be set to use this class. " "Use ImageAnimatorNDCubeSequence.") self.sequence = seq.data # Required by parent class. # Set default values of kwargs if not set. if wcs is None: wcs = seq[0].wcs if axes_coordinates is None: axes_coordinates = [None] * len(seq.cube_like_dimensions) if axes_units is None: axes_units = [None] * len(seq.cube_like_dimensions) # Determine units of each cube in sequence. sequence_units, data_unit = _determine_sequence_units(seq.data, data_unit) # If all cubes have unit set, create a data quantity from cube's data. if sequence_units is None: data_concat = np.concatenate([cube.data for cube in seq.data], axis=seq._common_axis) else: data_concat = np.concatenate( [(cube.data * sequence_units[i]).to(data_unit).value for i, cube in enumerate(seq.data)], axis=seq._common_axis) self.cumul_cube_lengths = np.cumsum(np.array( [c.dimensions[0].value for c in seq.data], dtype=int)) # Add dimensions of length 1 of concatenated data array # shape for an missing axes. if seq[0].wcs.naxis != len(seq._dimensions) - 1: new_shape = list(data_concat.shape) for i in np.arange(seq[0].wcs.naxis)[seq[0].missing_axes[::-1]]: new_shape.insert(i, 1) # Also insert dummy coordinates and units. axes_coordinates.insert(i, None) axes_units.insert(i, None) data_concat = data_concat.reshape(new_shape) super().__init__( data_concat, wcs=wcs, image_axes=plot_axis_indices, axis_ranges=axes_coordinates, unit_x_axis=axes_units[plot_axis_indices[0]], unit_y_axis=axes_units[plot_axis_indices[1]], **kwargs) def update_plot(self, val, im, slider): val = int(val) ax_ind = self.slider_axes[slider.slider_ind] ind = np.argmin(np.abs(self.axis_ranges[ax_ind] - val)) self.frame_slice[ax_ind] = ind list_slices_wcsaxes = list(self.slices_wcsaxes) sequence_slice = utils.sequence._convert_cube_like_index_to_sequence_slice( val, self.cumul_cube_lengths) sequence_index = sequence_slice.sequence_index cube_index = sequence_slice.common_axis_item list_slices_wcsaxes[self.wcs.naxis - ax_ind - 1] = cube_index self.slices_wcsaxes = list_slices_wcsaxes if val != slider.cval: self.axes.reset_wcs( wcs=self.sequence[sequence_index].wcs, slices=self.slices_wcsaxes) self._set_unit_in_axis(self.axes) im.set_array(self.data[self.frame_slice]) slider.cval = val class LineAnimatorNDCubeSequence(LineAnimator): """ Animates N-dimensional data with the associated astropy WCS object. The following keyboard shortcuts are defined in the viewer: left': previous step on active slider right': next step on active slider top': change the active slider up one bottom': change the active slider down one 'p': play/pause active slider This viewer can have user defined buttons added by specifying the labels and functions called when those buttons are clicked as keyword arguments. Parameters ---------- seq: `ndcube.datacube.CubeSequence` The list of cubes. 
image_axes: `list` The two axes that make the image fig: `matplotlib.figure.Figure` Figure to use axis_ranges: list of physical coordinates for array or None If None array indices will be used for all axes. If a list it should contain one element for each axis of the numpy array. For the image axes a [min, max] pair should be specified which will be passed to :func:`matplotlib.pyplot.imshow` as extent. For the slider axes a [min, max] pair can be specified or an array the same length as the axis which will provide all values for that slider. If None is specified for an axis then the array indices will be used for that axis. The physical coordinates expected by axis_ranges should be an array of pixel_edges. interval: `int` Animation interval in ms colorbar: `bool` Plot colorbar button_labels: `list` List of strings to label buttons button_func: `list` List of functions to map to the buttons unit_x_axis: `astropy.units.Unit` The unit of x axis. unit_y_axis: `astropy.units.Unit` The unit of y axis. Extra keywords are passed to imshow. """ def __init__(self, seq, plot_axis_index=None, axis_ranges=None, unit_x_axis=None, data_unit=None, xlabel=None, ylabel=None, xlim=None, ylim=None, **kwargs): if plot_axis_index is None: plot_axis_index = -1 # Combine data from cubes in sequence. If all cubes have a unit, # put data into data_unit. sequence_units, data_unit = _determine_sequence_units(seq.data, data_unit) if sequence_units is None: if data_unit is None: data_concat = np.stack([cube.data for i, cube in enumerate(seq.data)]) else: raise TypeError(NON_COMPATIBLE_UNIT_MESSAGE) else: data_concat = np.stack([(cube.data * sequence_units[i]).to(data_unit).value for i, cube in enumerate(seq.data)]) # If some cubes have a mask set, convert data to masked array. # If other cubes do not have a mask set, set all mask to False. # If no cubes have a mask, keep data as a simple array. cubes_with_mask = np.array([False if cube.mask is None else True for cube in seq.data]) if cubes_with_mask.any(): if cubes_with_mask.all(): mask_concat = np.stack([cube.mask for cube in seq.data]) else: masks = [] for i, cube in enumerate(seq.data): if cubes_with_mask[i]: masks.append(cube.mask) else: masks.append(np.zeros_like(cube.data, dtype=bool)) mask_concat = np.stack(masks) data_concat = np.ma.masked_array(data_concat, mask_concat) # Ensure plot_axis_index is represented in the positive convention. if plot_axis_index < 0: plot_axis_index = len(seq.dimensions) + plot_axis_index # Calculate the x-axis values if axis_ranges not supplied. if axis_ranges is None: axis_ranges = [None] * len(seq.dimensions) if plot_axis_index == 0: axis_ranges[plot_axis_index] = _get_extra_coord_edges( np.arange(len(seq.data)), axis=plot_axis_index) else: cube_plot_axis_index = plot_axis_index - 1 # Define unit of x-axis if not supplied by user. if unit_x_axis is None: wcs_plot_axis_index = utils.cube.data_axis_to_wcs_axis( cube_plot_axis_index, seq[0].missing_axes) unit_x_axis = np.asarray(seq[0].wcs.wcs.cunit)[wcs_plot_axis_index] # Get x-axis values from each cube and combine into a single # array for axis_ranges kwargs. x_axis_coords = _get_extra_coord_edges(_get_non_common_axis_x_axis_coords( seq.data, cube_plot_axis_index, unit_x_axis), axis=plot_axis_index) axis_ranges[plot_axis_index] = np.stack(x_axis_coords) # Set x-axis label. if xlabel is None: xlabel = "{} [{}]".format(seq.world_axis_physical_types[plot_axis_index], unit_x_axis) else: # If the axis range is being defined by an extra coordinate... 
            if isinstance(axis_ranges[plot_axis_index], str):
                axis_extra_coord = axis_ranges[plot_axis_index]
                if plot_axis_index == 0:
                    # If the sequence axis is the plot axis, use
                    # sequence_axis_extra_coords to get the extra coord values
                    # for whole sequence.
                    x_axis_coords = seq.sequence_axis_extra_coords[axis_extra_coord]
                    if isinstance(x_axis_coords, u.Quantity):
                        if unit_x_axis is None:
                            unit_x_axis = x_axis_coords.unit
                        else:
                            x_axis_coords = x_axis_coords.to(unit_x_axis)
                        x_axis_coords = x_axis_coords.value
                else:
                    # Else get extra coord values from each cube and
                    # combine into a single array for axis_ranges kwargs.
                    # First, confirm extra coord is of same type and corresponds
                    # to same axes in each cube.
                    extra_coord_type = np.empty(len(seq.data), dtype=object)
                    extra_coord_axes = np.empty(len(seq.data), dtype=object)
                    x_axis_coords = []
                    for i, cube in enumerate(seq.data):
                        cube_axis_extra_coord = cube.extra_coords[axis_extra_coord]
                        extra_coord_type[i] = type(cube_axis_extra_coord["value"])
                        extra_coord_axes[i] = cube_axis_extra_coord["axis"]
                        x_axis_coords.append(cube_axis_extra_coord["value"])
                    if (extra_coord_type == extra_coord_type[0]).all():
                        extra_coord_type = extra_coord_type[0]
                    else:
                        raise TypeError("Extra coord {} must be of same type for all NDCubes to "
                                        "use it to define a plot axis.".format(axis_extra_coord))
                    if (extra_coord_axes == extra_coord_axes[0]).all():
                        if isinstance(extra_coord_axes[0], numbers.Integral):
                            extra_coord_axes = [int(extra_coord_axes[0])]
                        else:
                            extra_coord_axes = sorted(extra_coord_axes[0])
                    else:
                        raise ValueError("Extra coord {} must correspond to same axes in each "
                                         "NDCube to use it to define a plot axis.".format(
                                             axis_extra_coord))
                    # If the extra coord is a quantity, convert to the correct unit.
                    if extra_coord_type is u.Quantity:
                        if unit_x_axis is None:
                            unit_x_axis = seq[0].extra_coords[axis_extra_coord]["value"].unit
                        x_axis_coords = [x_axis_value.to(unit_x_axis).value
                                         for x_axis_value in x_axis_coords]
                    # If extra coord is same for each cube, storing
                    # values as single 1D axis range will suffice.
                    if ((np.array(x_axis_coords) == x_axis_coords[0]).all() and
                            (len(extra_coord_axes) == 1)):
                        x_axis_coords = x_axis_coords[0]
                    else:
                        # Else if all axes are not dependent, create an array of x-axis
                        # coords for each cube that are the same shape as the data in the
                        # respective cubes where the x coords are replicated in the extra
                        # dimensions.  Then stack them together along the sequence axis so
                        # the final x-axis coord array is the same shape as the data array.
                        # This will be used in determining the correct x-axis coords for
                        # each frame of the animation.
                        if len(extra_coord_axes) != data_concat.ndim:
                            x_axis_coords_copy = copy.deepcopy(x_axis_coords)
                            x_axis_coords = []
                            for i, x_axis_cube_coords in enumerate(x_axis_coords_copy):
                                # For each cube in the sequence, use np.tile to replicate
                                # the x-axis coords through the higher dimensions.
                                # But first give extra dummy (length 1) dimensions to the
                                # x-axis coords array so its number of dimensions is the
                                # same as the cube's data array.
                                # First, create shape of pre-np.tiled x-coord array for the cube.
                                coords_reshape = np.array([1] * seq[i].data.ndim)
                                # Convert x_axis_cube_coords to a numpy array.
                                x_axis_cube_coords = np.array(x_axis_cube_coords)
                                coords_reshape[extra_coord_axes] = x_axis_cube_coords.shape
                                # Then reshape x-axis array to give it the dummy dimensions.
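                                # A concrete sketch of this reshape-and-tile step under
                                # assumed illustrative shapes (not taken from this module):
                                # a length-4 coord on axis 1 of a (3, 4)-shaped cube gains
                                # a dummy axis, then is tiled along the independent axis:
                                #     >>> import numpy as np
                                #     >>> c = np.arange(4).reshape((1, 4))
                                #     >>> np.tile(c, (3, 1)).shape
                                #     (3, 4)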
x_axis_cube_coords = x_axis_cube_coords.reshape( tuple(coords_reshape)) # Now the correct dummy dimensions are in place so the # number of dimensions in the x-axis coord array equals # the number of dimensions of the cube's data array, # replicating the coords through the higher dimensions # is simple using np.tile. tile_shape = np.array(seq[i].data.shape) tile_shape[extra_coord_axes] = 1 x_axis_cube_coords = np.tile(x_axis_cube_coords, tile_shape) # Append new dimension-ed x-axis coords array for this cube # sequence x-axis coords list. x_axis_coords.append(x_axis_cube_coords) # Stack the x-axis coords along a new axis for the sequence axis so # its the same shape as the data array. x_axis_coords = np.stack(x_axis_coords) # Set x-axis label. if xlabel is None: xlabel = f"{axis_extra_coord} [{unit_x_axis}]" # Re-enter x-axis values into axis_ranges axis_ranges[plot_axis_index] = _get_extra_coord_edges( x_axis_coords, axis=plot_axis_index) # Else coordinate must have been defined manually. else: if isinstance(axis_ranges[plot_axis_index], u.Quantity): if unit_x_axis is None: unit_x_axis = axis_ranges[plot_axis_index].unit axis_ranges[plot_axis_index] = axis_ranges[plot_axis_index].value else: axis_ranges[plot_axis_index] = \ axis_ranges[plot_axis_index].to(unit_x_axis).value else: if unit_x_axis is not None: raise TypeError(AXES_UNIT_ERRONESLY_SET_MESSAGE) if xlabel is None: xlabel = f" [{unit_x_axis}]" # Make label for y-axis. if ylabel is None: ylabel = f"Data [{data_unit}]" super().__init__( data_concat, plot_axis_index=plot_axis_index, axis_ranges=axis_ranges, xlabel=xlabel, ylabel=ylabel, xlim=xlim, ylim=ylim, **kwargs) class LineAnimatorCubeLikeNDCubeSequence(LineAnimator): """ Animates N-dimensional data with the associated astropy WCS object. The following keyboard shortcuts are defined in the viewer: left': previous step on active slider right': next step on active slider top': change the active slider up one bottom': change the active slider down one 'p': play/pause active slider This viewer can have user defined buttons added by specifying the labels and functions called when those buttons are clicked as keyword arguments. Parameters ---------- seq: `ndcube.datacube.CubeSequence` The list of cubes. image_axes: `list` The two axes that make the image fig: `matplotlib.figure.Figure` Figure to use axis_ranges: list of physical coordinates for array or None If None array indices will be used for all axes. If a list it should contain one element for each axis of the numpy array. For the image axes a [min, max] pair should be specified which will be passed to :func:`matplotlib.pyplot.imshow` as extent. For the slider axes a [min, max] pair can be specified or an array the same length as the axis which will provide all values for that slider. If None is specified for an axis then the array indices will be used for that axis. The physical coordinates expected by axis_ranges should be an array of pixel_edges. interval: `int` Animation interval in ms colorbar: `bool` Plot colorbar button_labels: `list` List of strings to label buttons button_func: `list` List of functions to map to the buttons unit_x_axis: `astropy.units.Unit` The unit of x axis. unit_y_axis: `astropy.units.Unit` The unit of y axis. Extra keywords are passed to imshow. """ def __init__(self, seq, plot_axis_index=None, axis_ranges=None, unit_x_axis=None, data_unit=None, xlabel=None, ylabel=None, xlim=None, ylim=None, **kwargs): if plot_axis_index is None: plot_axis_index = -1 # Combine data from cubes in sequence. 
If all cubes have a unit, # put data into data_unit. sequence_units, data_unit = _determine_sequence_units(seq.data, data_unit) if sequence_units is None: if data_unit is None: data_concat = np.concatenate([cube.data for i, cube in enumerate(seq.data)], axis=seq._common_axis) else: raise TypeError(NON_COMPATIBLE_UNIT_MESSAGE) else: data_concat = np.concatenate([(cube.data * sequence_units[i]).to(data_unit).value for i, cube in enumerate(seq.data)], axis=seq._common_axis) # If some cubes have a mask set, convert data to masked array. # If other cubes do not have a mask set, set all mask to False. # If no cubes have a mask, keep data as a simple array. cubes_with_mask = np.array([False if cube.mask is None else True for cube in seq.data]) if cubes_with_mask.any(): if cubes_with_mask.all(): mask_concat = np.concatenate( [cube.mask for cube in seq.data], axis=seq._common_axis) else: masks = [] for i, cube in enumerate(seq.data): if cubes_with_mask[i]: masks.append(cube.mask) else: masks.append(np.zeros_like(cube.data, dtype=bool)) mask_concat = np.concatenate(masks, axis=seq._common_axis) data_concat = np.ma.masked_array(data_concat, mask_concat) # Ensure plot_axis_index is represented in the positive convention. if plot_axis_index < 0: plot_axis_index = len(seq.cube_like_dimensions) + plot_axis_index # Calculate the x-axis values if axis_ranges not supplied. if axis_ranges is None: axis_ranges = [None] * len(seq.cube_like_dimensions) # Define unit of x-axis if not supplied by user. if unit_x_axis is None: wcs_plot_axis_index = utils.cube.data_axis_to_wcs_axis( plot_axis_index, seq[0].missing_axes) unit_x_axis = np.asarray( seq[0].wcs.wcs.cunit)[np.invert(seq[0].missing_axes)][wcs_plot_axis_index] if plot_axis_index == seq._common_axis: # Determine whether common axis is dependent. x_axis_cube_coords = np.concatenate( [cube.axis_world_coords(plot_axis_index).to(unit_x_axis).value for cube in seq.data], axis=plot_axis_index) dependent_axes = utils.wcs.get_dependent_data_axes( seq[0].wcs, plot_axis_index, seq[0].missing_axes) if len(dependent_axes) > 1: independent_axes = list(range(data_concat.ndim)) for i in list(dependent_axes)[::-1]: independent_axes.pop(i) # Expand dimensionality of x_axis_cube_coords using np.tile # Create dummy axes for non-dependent axes cube_like_shape = np.array([int(s.value) for s in seq.cube_like_dimensions]) dummy_reshape = copy.deepcopy(cube_like_shape) dummy_reshape[independent_axes] = 1 x_axis_cube_coords = x_axis_cube_coords.reshape(dummy_reshape) # Now get inverse of number of repeats to create full shaped array. # The repeats is the inverse of dummy_reshape. tile_shape = copy.deepcopy(cube_like_shape) tile_shape[np.array(dependent_axes)] = 1 x_axis_coords = _get_extra_coord_edges( np.tile(x_axis_cube_coords, tile_shape), axis=plot_axis_index) else: # Get x-axis values from each cube and combine into a single # array for axis_ranges kwargs. x_axis_coords = _get_non_common_axis_x_axis_coords(seq.data, plot_axis_index, unit_x_axis) axis_ranges[plot_axis_index] = _get_extra_coord_edges( np.concatenate(x_axis_coords, axis=seq._common_axis), axis=plot_axis_index) # Set axis labels and limits, etc. 
        if xlabel is None:
            xlabel = "{} [{}]".format(
                seq.cube_like_world_axis_physical_types[plot_axis_index], unit_x_axis)
        if ylabel is None:
            ylabel = f"Data [{data_unit}]"
        if axis_ranges is None:
            axis_ranges = [None] * data_concat.ndim
        super().__init__(
            data_concat, plot_axis_index=plot_axis_index, axis_ranges=axis_ranges,
            xlabel=xlabel, ylabel=ylabel, xlim=xlim, ylim=ylim, **kwargs)


def _get_non_common_axis_x_axis_coords(seq_data, plot_axis_index, unit_x_axis):
    """
    Get coords of an axis from NDCubes and combine into single array.
    """
    x_axis_coords = []
    for i, cube in enumerate(seq_data):
        # Get the x-axis coordinates for each cube.
        if unit_x_axis is None:
            x_axis_cube_coords = cube.axis_world_coords(plot_axis_index).value
        else:
            x_axis_cube_coords = cube.axis_world_coords(plot_axis_index).to(unit_x_axis).value
        # If the returned x-values have fewer dimensions than the cube,
        # repeat the x-values through the higher dimensions.
        if x_axis_cube_coords.shape != cube.data.shape:
            # Get sequence axes dependent and independent of plot_axis_index.
            dependent_axes = utils.wcs.get_dependent_data_axes(
                cube.wcs, plot_axis_index, cube.missing_axes)
            independent_axes = list(range(len(cube.dimensions)))
            for i in list(dependent_axes)[::-1]:
                independent_axes.pop(i)
            # Expand dimensionality of x_axis_cube_coords using np.tile.
            tile_shape = tuple(list(np.array(
                cube.data.shape)[independent_axes]) + [1] * len(dependent_axes))
            x_axis_cube_coords = np.tile(x_axis_cube_coords, tile_shape)
            # Since np.tile puts the original array's dimensions last,
            # reshape x_axis_cube_coords to the cube's shape.
            x_axis_cube_coords = x_axis_cube_coords.reshape(cube.data.shape)
        x_axis_coords.append(x_axis_cube_coords)
    return x_axis_coords


def _determine_sequence_units(cubesequence_data, unit=None):
    """
    Returns units of cubes in sequence and derives data unit if not set.

    If not all cubes have their unit attribute set and a unit is supplied by
    the user, an error is raised.

    Parameters
    ----------
    cubesequence_data: `list` of `ndcube.NDCube`
        Taken from NDCubeSequence.data attribute.

    unit: `astropy.units.Unit` or `None`
        If None, an appropriate unit is derived from first cube in sequence.

    Returns
    -------
    sequence_units: `list` of `astropy.units.Unit` or `None`
        Unit of each cube, or None if any cube's unit is not set.

    unit: `astropy.units.Unit`
        If input unit is not None, then the same as input.
        Otherwise it is the unit of the first cube in the sequence.
    """
    # Check that the unit attribute is set for all cubes.
    # If any cube's unit is not set, mark the whole sequence as unitless.
    sequence_units = []
    for i, cube in enumerate(cubesequence_data):
        if cube.unit is None:
            break
        else:
            sequence_units.append(cube.unit)
    if len(sequence_units) != len(cubesequence_data):
        sequence_units = None
    # If not all cubes have their unit set, a user-supplied unit cannot be honored.
    if sequence_units is None:
        if unit is not None:
            raise ValueError(NON_COMPATIBLE_UNIT_MESSAGE)
    else:
        if unit is None:
            unit = sequence_units[0]
    return sequence_units, unit
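
# A minimal sketch of `_determine_sequence_units` above, using hypothetical
# stand-in objects that only carry a ``unit`` attribute (illustrative only):
#
#     >>> import astropy.units as u
#     >>> from types import SimpleNamespace
#     >>> cubes = [SimpleNamespace(unit=u.J), SimpleNamespace(unit=u.erg)]
#     >>> _determine_sequence_units(cubes)
#     ([Unit("J"), Unit("erg")], Unit("J"))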
xlabel = kwargs.pop("xlabel", default_xlabel) ylabel = kwargs.pop("ylabel", f"Data [{unit_y_axis}]") title = kwargs.pop("title", "") xlim = kwargs.pop("xlim", None) ylim = kwargs.pop("ylim", None) # Plot data fig, ax = plt.subplots(1, 1) ax.errorbar(xdata, ydata, yerror, **kwargs) ax.set_xlabel(xlabel) ax.set_ylabel(ylabel) ax.set_title(title) ax.set_xlim(xlim) ax.set_ylim(ylim) return fig, ax def _prep_axes_kwargs(naxis, plot_axis_indices, axes_coordinates, axes_units): """ Checks input values are correct based on number of sequence dimensions and sets defaults. Parameters ---------- plot_axis_indices: As for NDCubeSequencePlotMixin.plot or NDCubeSequencePlotMixin.plot_as_cube axes_coordinates: As for NDCubeSequencePlotMixin.plot or NDCubeSequencePlotMixin.plot_as_cube axes_units: As for NDCubeSequencePlotMixin.plot or NDCubeSequencePlotMixin.plot_as_cube Returns ------- plot_axis_indices: None or `list` of `int` of length 1 or 2. axes_coordinates: `None` or `list` of `None` `astropy.units.Quantity` `numpy.ndarray` `str` Length of list equals number of sequence axes. The physical coordinates expected by axes_coordinates should be an array of pixel_edges. axes_units: None or `list` of `None` `astropy.units.Unit` or `str` Length of list equals number of sequence axes. """ # If plot_axis_indices, axes_coordinates, axes_units are not None and not lists, # convert to lists for consistent indexing behaviour. if (not isinstance(plot_axis_indices, list)) and (plot_axis_indices is not None): plot_axis_indices = [plot_axis_indices] if (not isinstance(axes_coordinates, list)) and (axes_coordinates is not None): axes_coordinates = [axes_coordinates] if (not isinstance(axes_units, list)) and (axes_units is not None): axes_units = [axes_units] # Set default value of plot_axis_indices if not set by user. if plot_axis_indices is None: plot_axis_indices = [-1, -2] else: # If number of sequence dimensions is greater than 1, # ensure length of plot_axis_indices is 1 or 2. # No need to check case where number of sequence dimensions is 1 # as plot_axis_indices is ignored in that case. if naxis > 1 and len(plot_axis_indices) not in [1, 2]: raise ValueError("plot_axis_indices can have at most length 2.") if axes_coordinates is not None: if naxis > 1: # If convention of axes_coordinates and axes_units being length of # plot_axis_index is being used, convert to convention where their # length equals sequence dimensions. Only do this if number of dimensions if # greater than 1 as the conventions are equivalent if there is only one dimension. if len(axes_coordinates) == len(plot_axis_indices): none_axes_coordinates = np.array([None] * naxis) none_axes_coordinates[plot_axis_indices] = axes_coordinates axes_coordinates = list(none_axes_coordinates) # Now axes_coordinates have been converted to a consistent convention, # ensure their length equals the number of sequence dimensions. if len(axes_coordinates) != naxis: raise ValueError(f"length of axes_coordinates must be {naxis}.") # Ensure all elements in axes_coordinates are of correct types. 
def _prep_axes_kwargs(naxis, plot_axis_indices, axes_coordinates, axes_units):
    """
    Checks input values are correct based on number of sequence dimensions and sets defaults.

    Parameters
    ----------
    plot_axis_indices: As for NDCubeSequencePlotMixin.plot or
        NDCubeSequencePlotMixin.plot_as_cube

    axes_coordinates: As for NDCubeSequencePlotMixin.plot or
        NDCubeSequencePlotMixin.plot_as_cube

    axes_units: As for NDCubeSequencePlotMixin.plot or
        NDCubeSequencePlotMixin.plot_as_cube

    Returns
    -------
    plot_axis_indices: None or `list` of `int` of length 1 or 2.

    axes_coordinates: `None` or `list` of `None`, `astropy.units.Quantity`,
        `numpy.ndarray`, or `str`
        Length of list equals number of sequence axes.
        The physical coordinates expected by axes_coordinates should be an array of
        pixel_edges.

    axes_units: None or `list` of `None`, `astropy.units.Unit` or `str`
        Length of list equals number of sequence axes.
    """
    # If plot_axis_indices, axes_coordinates, axes_units are not None and not lists,
    # convert to lists for consistent indexing behaviour.
    if (not isinstance(plot_axis_indices, list)) and (plot_axis_indices is not None):
        plot_axis_indices = [plot_axis_indices]
    if (not isinstance(axes_coordinates, list)) and (axes_coordinates is not None):
        axes_coordinates = [axes_coordinates]
    if (not isinstance(axes_units, list)) and (axes_units is not None):
        axes_units = [axes_units]
    # Set default value of plot_axis_indices if not set by user.
    if plot_axis_indices is None:
        plot_axis_indices = [-1, -2]
    else:
        # If number of sequence dimensions is greater than 1,
        # ensure length of plot_axis_indices is 1 or 2.
        # No need to check case where number of sequence dimensions is 1
        # as plot_axis_indices is ignored in that case.
        if naxis > 1 and len(plot_axis_indices) not in [1, 2]:
            raise ValueError("plot_axis_indices can have at most length 2.")
    if axes_coordinates is not None:
        if naxis > 1:
            # If the convention of axes_coordinates and axes_units being the length of
            # plot_axis_indices is being used, convert to the convention where their
            # length equals the number of sequence dimensions.  Only do this if the number
            # of dimensions is greater than 1 as the conventions are equivalent if there
            # is only one dimension.
            if len(axes_coordinates) == len(plot_axis_indices):
                none_axes_coordinates = np.array([None] * naxis)
                none_axes_coordinates[plot_axis_indices] = axes_coordinates
                axes_coordinates = list(none_axes_coordinates)
        # Now axes_coordinates have been converted to a consistent convention,
        # ensure their length equals the number of sequence dimensions.
        if len(axes_coordinates) != naxis:
            raise ValueError(f"length of axes_coordinates must be {naxis}.")
        # Ensure all elements in axes_coordinates are of correct types.
        ax_coord_types = (u.Quantity, np.ndarray, str)
        for axis_coordinate in axes_coordinates:
            if axis_coordinate is not None and not isinstance(axis_coordinate, ax_coord_types):
                raise TypeError("axes_coordinates must be one of {} or list of those.".format(
                    [None] + list(ax_coord_types)))
    if axes_units is not None:
        if naxis > 1:
            if len(axes_units) == len(plot_axis_indices):
                none_axes_units = np.array([None] * naxis)
                none_axes_units[plot_axis_indices] = axes_units
                axes_units = list(none_axes_units)
        # Now axes_units have been converted to a consistent convention,
        # ensure their length equals the number of sequence dimensions.
        if len(axes_units) != naxis:
            raise ValueError(f"length of axes_units must be {naxis}.")
        # Ensure all elements in axes_units are of correct types.
        ax_unit_types = (u.UnitBase, str)
        for axis_unit in axes_units:
            if axis_unit is not None and not isinstance(axis_unit, ax_unit_types):
                raise TypeError("axes_units must be one of {0} or list of {0}.".format(
                    ax_unit_types))
    return plot_axis_indices, axes_coordinates, axes_units


def _derive_1D_coordinates_and_units(axes_coordinates, axes_units):
    if axes_coordinates is None:
        x_axis_coordinates = axes_coordinates
    else:
        if not isinstance(axes_coordinates, list):
            axes_coordinates = [axes_coordinates]
        x_axis_coordinates = axes_coordinates[0]
    if axes_units is None:
        unit_x_axis = axes_units
    else:
        if not isinstance(axes_units, list):
            axes_units = [axes_units]
        unit_x_axis = axes_units[0]
    return x_axis_coordinates, unit_x_axis


# ndcube-1.4.2/ndcube/ndcollection.py

import collections.abc
import copy
import textwrap

import numpy as np

from ndcube import NDCube, NDCubeSequence
from ndcube.utils.cube import convert_extra_coords_dict_to_input_format
import ndcube.utils.collection as collection_utils

__all__ = ["NDCollection"]


class NDCollection(dict):
    def __init__(self, key_data_pairs, aligned_axes=None, meta=None, **kwargs):
        """
        A class for holding and manipulating a collection of aligned NDCube or
        NDCubeSequences.

        Parameters
        ----------
        key_data_pairs: sequence of `tuple` of (`str`, `~ndcube.NDCube` or `~ndcube.NDCubeSequence`)
            The names and data cubes/sequences to be held in the collection.

        aligned_axes: `tuple` of `int`, `tuple` of `tuple` of `int`, 'all', or None, optional
            Axes of each cube/sequence that are aligned in numpy order.
            If elements are int, then the same axis numbers in all cubes/sequences are aligned.
            If elements are tuples of ints, then there must be one tuple for every
            cube/sequence.  Each element of each tuple gives the axes of each
            cube/sequence that are aligned.
            If 'all', all axes are aligned in natural order, i.e. the 0th axes of all
            cubes are aligned, as are the 1st, and so on.
            Default=None

        meta: `dict`, optional
            General metadata for the overall collection.

        Example
        -------
        Say the collection holds two NDCubes, each of 3 dimensions.
        ``aligned_axes = (1, 2)`` means that axis 1 (0-based counting) of cube0 is
        aligned with axis 1 of cube1, and axis 2 of cube0 is aligned with axis 2 of
        cube1.  However, if ``aligned_axes = ((0, 1), (2, 1))`` then the first tuple
        corresponds to cube0 and the second to cube1.  This is interpreted as:
        axis 0 of cube0 is aligned with axis 2 of cube1 while axis 1 of cube0 is
        aligned with axis 1 of cube1.
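
        A minimal construction sketch (``cube0`` and ``cube1`` are assumed to be
        pre-existing 3D `~ndcube.NDCube` instances; illustrative only)::

            >>> from ndcube import NDCollection  # doctest: +SKIP
            >>> collection = NDCollection(
            ...     [("cube0", cube0), ("cube1", cube1)], aligned_axes=(1, 2))  # doctest: +SKIP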
        """
        # Enter data and metadata into object.
        super().__init__(key_data_pairs)
        self.meta = meta
        # Convert aligned axes to required format.
        sanitize_inputs = kwargs.pop("sanitize_inputs", True)
        if aligned_axes is not None:
            keys, data = zip(*key_data_pairs)
            # Sanitize aligned axes unless hidden kwarg indicates not to.
            if sanitize_inputs:
                aligned_axes = collection_utils._sanitize_aligned_axes(keys, data, aligned_axes)
            else:
                aligned_axes = dict(zip(keys, aligned_axes))
        if kwargs:
            raise TypeError(
                f"__init__() got an unexpected keyword argument: '{list(kwargs.keys())[0]}'")
        # Attach aligned axes to object.
        self.aligned_axes = aligned_axes
        if self.aligned_axes is None:
            self.n_aligned_axes = 0
        else:
            self.n_aligned_axes = len(self.aligned_axes[keys[0]])

    @property
    def _first_key(self):
        return list(self.keys())[0]

    def __str__(self):
        return (textwrap.dedent(f"""\
            NDCollection
            ------------
            Cube keys: {tuple(self.keys())}
            Number of Cubes: {len(self)}
            Aligned dimensions: {self.aligned_dimensions}
            Aligned world physical axis types: {self.aligned_world_axis_physical_types}"""))

    def __repr__(self):
        return f"{object.__repr__(self)}\n{str(self)}"

    @property
    def aligned_dimensions(self):
        """
        Returns the lengths of all aligned axes.

        If there are no aligned axes, returns None.
        """
        if self.aligned_axes is not None:
            return np.asanyarray(self[self._first_key].dimensions, dtype=object)[
                np.array(self.aligned_axes[self._first_key])]

    @property
    def aligned_world_axis_physical_types(self):
        """
        Returns the physical types of the aligned axes of an ND object in the collection.

        If there are no aligned axes, returns None.
        """
        if self.aligned_axes is not None:
            axis_types = np.array(self[self._first_key].world_axis_physical_types)
            return tuple(axis_types[np.array(self.aligned_axes[self._first_key])])

    def __getitem__(self, item):
        # There are two ways to slice:
        # by key or sequence of keys, i.e. slice out given cubes in the collection, or
        # by typical python numeric slicing API,
        # i.e. slice each component cube along the aligned axes.

        # If item is a single string, slicing is simple.
        if isinstance(item, str):
            return super().__getitem__(item)
        # If item is not a single string...
        else:
            # If item is a sequence, ensure strings and numeric items are not mixed.
            item_is_strings = False
            if isinstance(item, collections.abc.Sequence):
                item_strings = [isinstance(item_, str) for item_ in item]
                item_is_strings = all(item_strings)
                # Ensure strings are not mixed with slices.
                if (not item_is_strings) and (not all(np.invert(item_strings))):
                    raise TypeError("Cannot mix keys and non-keys when indexing instance.")
            # If sequence is all strings, extract the cubes corresponding to the string keys.
            if item_is_strings:
                new_data = [self[_item] for _item in item]
                new_keys = item
                new_aligned_axes = tuple([self.aligned_axes[item_] for item_ in item])
            # Else, the item is assumed to be a typical slicing item.
            # Slice each cube in collection using information in this item.
            # However, this can only be done if there are aligned axes.
            else:
                if self.aligned_axes is None:
                    raise IndexError("Cannot slice unless collection has aligned axes.")
                # Derive item to be applied to each cube in collection and
                # whether any aligned axes are dropped by the slicing.
                collection_items, new_aligned_axes = self._generate_collection_getitems(item)
                # Apply those slice items to each cube in collection.
                new_data = [self[key][tuple(cube_item)]
                            for key, cube_item in zip(self, collection_items)]
                # Since item is not strings, no cube in collection is dropped.
                # Therefore the collection keys remain unchanged.
                new_keys = list(self.keys())
            return self.__class__(list(zip(new_keys, new_data)),
                                  aligned_axes=new_aligned_axes, meta=self.meta,
                                  sanitize_inputs=False)
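
    # A minimal slicing sketch, assuming ``collection`` holds two 3D cubes with
    # aligned_axes=(1, 2) as in the docstring example above (illustrative only):
    #
    #     >>> collection["cube0"]   # extract one cube by key  # doctest: +SKIP
    #     >>> collection[0]         # index all cubes along the first aligned axis  # doctest: +SKIP
    #     >>> collection[0:2, 3]    # slice then index the two aligned axes  # doctest: +SKIP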
                # Since the item is not a sequence of strings, no cube in the
                # collection is dropped. Therefore the collection keys remain
                # unchanged.
                new_keys = list(self.keys())

        return self.__class__(list(zip(new_keys, new_data)),
                              aligned_axes=new_aligned_axes, meta=self.meta,
                              sanitize_inputs=False)

    def _generate_collection_getitems(self, item):
        # There are 3 supported cases of the slice item: int, slice, tuple of ints and/or slices.
        # Compile appropriate slice items for each cube in the collection and
        # drop any aligned axes that are sliced out.

        # First, define empty lists of slice items to be applied to each cube in collection.
        collection_items = [[slice(None)] * len(self[key].dimensions) for key in self]
        # Define empty list to hold aligned axes dropped by the slicing.
        drop_aligned_axes_indices = []

        # Case 1: int
        # First aligned axis is dropped.
        if isinstance(item, int):
            drop_aligned_axes_indices = [0]
            # Insert item into each cube's slice item.
            for i, key in enumerate(self):
                collection_items[i][self.aligned_axes[key][0]] = item

        # Case 2: slice
        elif isinstance(item, slice):
            # Insert item into each cube's slice item.
            for i, key in enumerate(self):
                collection_items[i][self.aligned_axes[key][0]] = item
            # Note that slices of interval length 1 result in an axis of length 1.
            # The axis is not dropped.

        # Case 3: tuple of ints/slices
        # Search sub-items within tuple for ints or 1-interval slices.
        elif isinstance(item, tuple):
            # Ensure item is not longer than the number of aligned axes.
            if len(item) > self.n_aligned_axes:
                raise IndexError("Too many indices")
            for i, axis_item in enumerate(item):
                if isinstance(axis_item, int):
                    drop_aligned_axes_indices.append(i)
                for j, key in enumerate(self):
                    collection_items[j][self.aligned_axes[key][i]] = axis_item

        else:
            raise TypeError(f"Unsupported slicing type: {item}")

        # Use indices of dropped axes determined above to update aligned_axes
        # by removing any that have been dropped.
        drop_aligned_axes_indices = np.array(drop_aligned_axes_indices)
        new_aligned_axes = collection_utils._update_aligned_axes(
            drop_aligned_axes_indices, self.aligned_axes, self._first_key)

        return collection_items, new_aligned_axes

    def copy(self):
        return self.__class__(self.items(), tuple(self.aligned_axes.values()),
                              meta=self.meta, sanitize_inputs=False)

    def setdefault(self):
        raise NotImplementedError("NDCollection does not support setdefault.")

    def popitem(self):
        raise NotImplementedError("NDCollection does not support popitem.")

    def pop(self, key):
        # Extract desired cube from collection.
        popped_cube = super().pop(key)
        # Delete corresponding aligned axes.
        popped_aligned_axes = self.aligned_axes.pop(key)
        return popped_cube

    def update(self, *args):
        """
        Merges a new collection with the current one, replacing objects with
        common keys.

        Takes either a single input (`NDCollection`) or two inputs (a sequence
        of key/value pairs and the aligned axes associated with each key/value
        pair).
        """
        # If two inputs, inputs must be key_data_pairs and aligned_axes.
        if len(args) == 2:
            key_data_pairs = args[0]
            new_keys, new_data = zip(*key_data_pairs)
            new_aligned_axes = collection_utils._sanitize_aligned_axes(new_keys, new_data, args[1])
        else:
            # If one arg given, input must be NDCollection.
            collection = args[0]
            new_keys = list(collection.keys())
            new_data = list(collection.values())
            key_data_pairs = zip(new_keys, new_data)
            new_aligned_axes = collection.aligned_axes
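        # For example (sketch, hypothetical inputs mirroring the two call
        # forms): ``collection.update([("cube3", cube3)], (1, 2))`` adds a new
        # member with aligned axes (1, 2), while
        # ``collection.update(other_collection)`` merges in another collection.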
        # Check aligned axes of new inputs are compatible with those in self.
        # As they've already been sanitized, only one set of aligned axes need be checked.
        collection_utils.assert_aligned_axes_compatible(
            self[self._first_key].dimensions, new_data[0].dimensions,
            self.aligned_axes[self._first_key], new_aligned_axes[new_keys[0]])
        # Update collection.
        super().update(key_data_pairs)
        self.aligned_axes.update(new_aligned_axes)

    def __delitem__(self, key):
        super().__delitem__(key)
        self.aligned_axes.__delitem__(key)

    def __setitem__(self, key, value):
        raise NotImplementedError("NDCollection does not support __setitem__. "
                                  "Use NDCollection.update instead")


# ndcube-1.4.2/ndcube/ndcube.py

import abc
from collections import namedtuple
import numbers
import textwrap
import warnings

import astropy.nddata
import astropy.units as u
from astropy.utils.decorators import deprecated
import numpy as np

from ndcube.mixins import NDCubeSlicingMixin, NDCubePlotMixin
from ndcube.ndcube_sequence import NDCubeSequence
from ndcube import utils
from ndcube.utils import wcs as wcs_utils
from ndcube.utils.cube import _pixel_centers_or_edges, _get_dimension_for_pixel

__all__ = ['NDCubeABC', 'NDCubeBase', 'NDCube', 'NDCubeOrdered']


class NDCubeMetaClass(abc.ABCMeta):
    """
    A metaclass that combines `abc.ABCMeta`.
    """


class NDCubeABC(astropy.nddata.NDData, metaclass=NDCubeMetaClass):

    @abc.abstractmethod
    def pixel_to_world(self, *quantity_axis_list):
        """
        Convert a pixel coordinate to a data (world) coordinate by using
        `~astropy.wcs.WCS.all_pix2world`.

        Parameters
        ----------
        quantity_axis_list : iterable
            An iterable of `~astropy.units.Quantity` in pixel units (`pix`).
            Note that these quantities must be entered as separate arguments,
            not as one list.

        origin : `int`
            Origin of the top-left corner, i.e. count from 0 or 1.
            Normally, origin should be 0 when passing numpy indices, or 1 if
            passing values from a FITS header or map attributes.
            See `~astropy.wcs.WCS.wcs_pix2world` for more information.
            Default is 0.

        Returns
        -------
        coord : `list`
            A list of arrays containing the output coordinates in the reverse
            of the WCS axis order.
        """

    @abc.abstractmethod
    def world_to_pixel(self, *quantity_axis_list):
        """
        Convert a world coordinate to a data (pixel) coordinate by using
        `~astropy.wcs.WCS.all_world2pix`.

        Parameters
        ----------
        quantity_axis_list : iterable
            An iterable of `~astropy.units.Quantity`.
            Note that these quantities must be entered as separate arguments,
            not as one list.

        origin : `int`
            Origin of the top-left corner, i.e. count from 0 or 1.
            Normally, origin should be 0 when passing numpy indices, or 1 if
            passing values from a FITS header or map attributes.
            See `~astropy.wcs.WCS.wcs_world2pix` for more information.
            Default is 0.

        Returns
        -------
        coord : `list`
            A list of arrays containing the output coordinates in the reverse
            of the WCS axis order.
        """

    @abc.abstractproperty
    def dimensions(self):
        pass

    @abc.abstractproperty
    def world_axis_physical_types(self):
        pass

    @abc.abstractmethod
    def crop_by_coords(self, lower_corner, interval_widths=None, upper_corner=None,
                       units=None):
        """
        Crops an NDCube given minimum values and interval widths along axes.

        Parameters
        ----------
        lower_corner: iterable of `astropy.units.Quantity` or `float`
            The minimum desired values along each relevant axis after cropping
            described in physical units consistent with the NDCube's wcs
            object. The length of the iterable must equal the number of data
            dimensions and must have the same order as the data.
        interval_widths: iterable of `astropy.units.Quantity` or `float`
            The width of the region of interest in each dimension in physical
            units consistent with the NDCube's wcs object. The length of the
            iterable must equal the number of data dimensions and must have
            the same order as the data. This argument will be removed in
            version 2.0; please use the upper_corner argument instead.

        upper_corner: iterable of `astropy.units.Quantity` or `float`
            The maximum desired values along each relevant axis after cropping
            described in physical units consistent with the NDCube's wcs
            object. The length of the iterable must equal the number of data
            dimensions and must have the same order as the data.

        units: iterable of `astropy.units.quantity.Quantity`, optional
            If the inputs are set without units, the user must set the units
            inside this argument as `str`. The length of the iterable must
            equal the number of data dimensions and must have the same order
            as the data.

        Returns
        -------
        result: NDCube
        """


class NDCubeBase(NDCubeSlicingMixin, NDCubeABC):
    """
    Class representing N-dimensional cubes. Extra arguments are passed on to
    `~astropy.nddata.NDData`.

    Parameters
    ----------
    data: `numpy.ndarray`
        The array holding the actual data in this object.

    wcs: `ndcube.wcs.wcs.WCS`
        The WCS object containing the axes' information.

    uncertainty : any type, optional
        Uncertainty in the dataset. Should have an attribute uncertainty_type
        that defines what kind of uncertainty is stored, for example "std"
        for standard deviation or "var" for variance. A metaclass defining
        such an interface is NDUncertainty - but isn't mandatory. If the
        uncertainty has no such attribute the uncertainty is stored as
        UnknownUncertainty. Defaults to None.

    mask : any type, optional
        Mask for the dataset. Masks should follow the numpy convention that
        valid data points are marked by False and invalid ones with True.
        Defaults to None.

    meta : dict-like object, optional
        Additional meta information about the dataset. If no meta is provided
        an empty collections.OrderedDict is created. Default is None.

    unit : Unit-like or str, optional
        Unit for the dataset. Strings that can be converted to a Unit are
        allowed. Default is None.

    extra_coords : iterable of `tuple`, each with three entries
        (`str`, `int`, `astropy.units.quantity` or array-like)
        Gives the name, axis of data, and values of coordinates of a data axis
        not included in the WCS object.

    copy : bool, optional
        Indicates whether to save the arguments as copy. True copies every
        attribute before saving it while False tries to save every parameter
        as reference. Note however that it is not always possible to save the
        input as reference. Default is False.

    missing_axes : `list` of `bool`
        Designates which axes in the wcs object do not have a corresponding
        axis in the data. True means the axis is "missing", False means the
        axis corresponds to a data axis. Ordering corresponds to the axis
        ordering in the WCS object, i.e. the reverse of the data. For example,
        say the data's y-axis corresponds to latitude and its x-axis to
        wavelength. In order to convert the y-axis to latitude, the WCS must
        contain a "missing" longitude axis, as longitude and latitude are not
        separable.
    """
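    # A minimal construction sketch (hypothetical names; assumes a 3-D numpy
    # array ``data``, a matching 3-axis WCS ``wcs``, and an array
    # ``time_values`` whose length equals data.shape[0]):
    #
    #     >>> cube = NDCube(data, wcs,
    #     ...               extra_coords=[("time", 0, time_values)])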
    def __init__(self, data, wcs, uncertainty=None, mask=None, meta=None,
                 unit=None, extra_coords=None, copy=False, missing_axes=None,
                 **kwargs):
        if missing_axes is None:
            # Ensure the old missing_axis name is not being used.
            # If it is, raise a deprecation warning.
            missing_axes = kwargs.pop("missing_axis", None)
            if missing_axes is None:
                missing_axes = [False] * wcs.naxis
            else:
                warnings.warn(
                    "missing_axis has been deprecated by missing_axes and "
                    "will no longer be supported after ndcube 2.0.",
                    DeprecationWarning)
        self.missing_axes = missing_axes
        if data.ndim != wcs.naxis:
            count = 0
            for bool_ in self.missing_axes:
                if not bool_:
                    count += 1
            if count != data.ndim:
                raise ValueError("The number of data dimensions and number of "
                                 "wcs non-missing axes do not match.")
        # Format extra coords.
        if extra_coords:
            self._extra_coords_wcs_axis = \
                utils.cube._format_input_extra_coords_to_extra_coords_wcs_axis(
                    extra_coords, self.missing_axes, data.shape)
        else:
            self._extra_coords_wcs_axis = None
        # Initialize NDCube.
        super().__init__(data, wcs=wcs, uncertainty=uncertainty, mask=mask,
                         meta=meta, unit=unit, copy=copy, **kwargs)

    @property
    def dimensions(self):
        """
        Returns the shape of the data dimensions as an
        `~astropy.units.Quantity` in pixel units.
        """
        return u.Quantity(self.data.shape, unit=u.pix)

    @property
    @deprecated(since='1.4.1',
                message='NDCube.world_axis_physical_types will be removed in version 2.0. '
                        'Use NDCube.wcs.world_axis_physical_types or '
                        'NDCube.array_axis_physical_types instead.')
    def world_axis_physical_types(self):
        """
        Returns an iterable of strings describing the physical type for each
        world axis.

        The strings conform to the International Virtual Observatory Alliance
        standard, UCD1+ controlled Vocabulary. For a description of the
        standard and definitions of the different strings and string
        components, see http://www.ivoa.net/documents/latest/UCDlist.html.
        """
        ctype = list(self.wcs.wcs.ctype)
        axes_ctype = []
        for i, axis in enumerate(self.missing_axes):
            if not axis:
                # Find keys in wcs_ivoa_mapping dict that represent start of CTYPE.
                # Ensure CTYPE is capitalized.
                keys = list(filter(lambda key: ctype[i].upper().startswith(key),
                                   wcs_utils.wcs_ivoa_mapping))
                # Assuming CTYPE is supported by wcs_ivoa_mapping, use its
                # corresponding axis name.
                if len(keys) == 1:
                    axis_name = wcs_utils.wcs_ivoa_mapping.get(keys[0])
                # If CTYPE not supported, raise a warning and set the axis name to CTYPE.
                elif len(keys) == 0:
                    warnings.warn("CTYPE not recognized by ndcube. "
                                  "Please raise an issue at "
                                  "https://github.com/sunpy/ndcube/issues citing the "
                                  "unsupported CTYPE as we'll include it: "
                                  "CTYPE = {}".format(ctype[i]))
                    axis_name = "custom:{}".format(ctype[i])
                # If there are multiple valid keys, raise an error.
                else:
                    raise ValueError("Non-unique CTYPE key. Please raise an issue at "
                                     "https://github.com/sunpy/ndcube/issues citing the "
                                     "following CTYPE and non-unique keys: "
                                     "CTYPE = {}; keys = {}".format(ctype[i], keys))
                axes_ctype.append(axis_name)
        return tuple(axes_ctype[::-1])

    @property
    def array_axis_physical_types(self):
        """
        Returns the WCS physical types associated with each array axis.

        Returns an iterable of tuples where each tuple corresponds to an array
        axis and holds strings denoting the WCS physical types associated with
        that array axis. Since multiple physical types can be associated with
        one array axis, tuples can be of different lengths. Likewise, as a
        single physical type can correspond to multiple array axes, the same
        physical type string can appear in multiple tuples.
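        Example
        -------
        A hypothetical sketch: for a 2-D cube whose array axes are
        (helioprojective latitude, wavelength), this property could return
        ``[('custom:pos.helioprojective.lat', 'custom:pos.helioprojective.lon'),
        ('em.wl',)]``, since coupled celestial world axes are both associated
        with the single celestial array axis.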
""" axis_correlation_matrix, world_axis_physical_types = ( wcs_utils.reduced_correlation_matrix_and_world_physical_types( self.wcs.axis_correlation_matrix, self.wcs.world_axis_physical_types, self.missing_axes)) return [tuple(world_axis_physical_types[axis_correlation_matrix[:, i]]) for i in range(axis_correlation_matrix.shape[1])][::-1] @property def missing_axis(self): warnings.warn( ("The missing_axis list has been deprecated by missing_axes" " and will no longer be supported after ndcube 2.0."), DeprecationWarning) return self.missing_axes def pixel_to_world(self, *quantity_axis_list): # The docstring is defined in NDDataBase origin = 0 list_arg = [] indexed_not_as_one = [] result = [] quantity_index = 0 for i in range(len(self.missing_axes)): wcs_index = self.wcs.naxis - 1 - i # the cases where the wcs dimension was made 1 and the missing_axes is True if self.missing_axes[wcs_index]: list_arg.append(self.wcs.wcs.crpix[wcs_index] - 1 + origin) else: # else it is not the case where the dimension of wcs is 1. list_arg.append(quantity_axis_list[quantity_index].to(u.pix).value) quantity_index += 1 # appending all the indexes to be returned in the answer indexed_not_as_one.append(wcs_index) list_arguments = list_arg[::-1] pixel_to_world = self.wcs.all_pix2world(*list_arguments, origin) # collecting all the needed answer in this list. for index in indexed_not_as_one[::-1]: result.append(u.Quantity(pixel_to_world[index], unit=self.wcs.wcs.cunit[index])) return result[::-1] def world_to_pixel(self, *quantity_axis_list): # The docstring is defined in NDDataBase origin = 0 list_arg = [] indexed_not_as_one = [] result = [] quantity_index = 0 for i in range(len(self.missing_axes)): wcs_index = self.wcs.naxis - 1 - i # the cases where the wcs dimension was made 1 and the missing_axes is True if self.missing_axes[wcs_index]: list_arg.append(self.wcs.wcs.crval[wcs_index] + 1 - origin) else: # else it is not the case where the dimension of wcs is 1. list_arg.append( quantity_axis_list[quantity_index].to(self.wcs.wcs.cunit[wcs_index]).value) quantity_index += 1 # appending all the indexes to be returned in the answer indexed_not_as_one.append(wcs_index) list_arguments = list_arg[::-1] world_to_pixel = self.wcs.all_world2pix(*list_arguments, origin) # collecting all the needed answer in this list. for index in indexed_not_as_one[::-1]: result.append(u.Quantity(world_to_pixel[index], unit=u.pix)) return result[::-1] def axis_world_coords(self, *axes, edges=False): """ Returns WCS coordinate values of all pixels for all axes. Parameters ---------- axes: `int` or `str`, or multiple `int` or `str` Axis number in numpy ordering or unique substring of `~ndcube.NDCube.world_axis_physical_types` of axes for which real world coordinates are desired. axes=None implies all axes will be returned. edges: `bool` The edges argument helps in returning `pixel_edges` instead of `pixel_values`. Default value is False, which returns `pixel_values`. True return `pixel_edges` Returns ------- axes_coords: `list` of `astropy.units.Quantity` Real world coords for axes in order requested by user. Example ------- >>> NDCube.all_world_coords(('lat', 'lon')) # doctest: +SKIP >>> NDCube.all_world_coords(2) # doctest: +SKIP """ # Define the dimensions of the cube and the total number of axes. cube_dimensions = np.array(self.dimensions.value, dtype=int) n_dimensions = cube_dimensions.size world_axis_types = self.wcs.world_axis_physical_types[::-1] # Determine axis numbers of user supplied axes. 
        if axes == ():
            int_axes = np.arange(n_dimensions)
        else:
            if isinstance(axes, int):
                int_axes = np.array([axes])
            elif isinstance(axes, str):
                int_axes = np.array([
                    utils.cube.get_axis_number_from_axis_name(axes, world_axis_types)])
            else:
                int_axes = np.empty(len(axes), dtype=int)
                for i, axis in enumerate(axes):
                    if isinstance(axis, int):
                        if axis < 0:
                            int_axes[i] = n_dimensions + axis
                        else:
                            int_axes[i] = axis
                    elif isinstance(axis, str):
                        int_axes[i] = utils.cube.get_axis_number_from_axis_name(
                            axis, world_axis_types)
        # Ensure user has not entered the same axis twice.
        repeats = {x for x in int_axes if np.where(int_axes == x)[0].size > 1}
        if repeats:
            raise ValueError("The following axes were specified more than once: {}".format(
                ' '.join(map(str, repeats))))
        n_axes = len(int_axes)
        axes_coords = [None] * n_axes
        axes_translated = np.zeros_like(int_axes, dtype=bool)
        # Determine which axes are dependent on others.
        # Ensure the axes are in numerical order.
        dependent_axes = [list(utils.wcs.get_dependent_data_axes(self.wcs, axis,
                                                                 self.missing_axes))
                          for axis in int_axes]
        n_dependent_axes = [len(da) for da in dependent_axes]
        # Iterate through each axis and perform WCS translation.
        for i, axis in enumerate(int_axes):
            # If axis has already been translated, do not do so again.
            if not axes_translated[i]:
                if n_dependent_axes[i] == 1:
                    # Construct pixel quantities in each dimension letting
                    # other dimensions all have 0 pixel value.
                    # Replace the array in the quantity list corresponding to
                    # the current axis with an np.arange array.
                    quantity_list = [u.Quantity(np.zeros(_get_dimension_for_pixel(
                        cube_dimensions[dependent_axes[i]], edges)),
                        unit=u.pix)] * n_dimensions
                    quantity_list[axis] = u.Quantity(
                        _pixel_centers_or_edges(cube_dimensions[axis], edges), unit=u.pix)
                else:
                    # If the axis is dependent on another, perform
                    # translations on all dependent axes.
                    # Construct pixel quantities in each dimension letting
                    # other dimensions all have 0 pixel value.
                    # Construct orthogonal pixel index arrays for dependent axes.
                    quantity_list = [u.Quantity(np.zeros(tuple(
                        [_get_dimension_for_pixel(cube_dimensions[k], edges)
                         for k in dependent_axes[i]])), unit=u.pix)] * n_dimensions
                    dependent_pixel_quantities = np.meshgrid(
                        *[_pixel_centers_or_edges(cube_dimensions[k], edges) * u.pix
                          for k in dependent_axes[i]], indexing="ij")
                    for k, axis in enumerate(dependent_axes[i]):
                        quantity_list[axis] = dependent_pixel_quantities[k]
                # Perform wcs translation.
                dependent_axes_coords = self.pixel_to_world(*quantity_list)
                # Place world coords into output list.
                for dependent_axis in dependent_axes[i]:
                    if dependent_axis in int_axes:
                        # Due to the error check above we know a dependent
                        # axis can appear in int_axes at most once.
                        j = np.where(int_axes == dependent_axis)[0][0]
                        axes_coords[j] = dependent_axes_coords[dependent_axis]
                        # Mark axes that have now been translated.
                        axes_translated[j] = True
        if len(axes_coords) == 1:
            return axes_coords[0]
        else:
            return tuple(axes_coords)

    def axis_world_coords_values(self, *axes, edges=False):
        """
        Returns WCS coordinate values of all pixels for desired axes.

        Parameters
        ----------
        axes: `int` or `str`, or multiple `int` or `str`
            Axis number in numpy ordering or unique substring of
            `~ndcube.NDCube.wcs.world_axis_physical_types` of axes for which
            real world coordinates are desired. axes=None implies all axes
            will be returned.

        edges: `bool`
            If True, the coords at the edges of the pixels are returned rather
            than the coords at the center of the pixels.
            Note that there are n+1 edges for n pixels, which is reflected in
            the returned coords. Default=False, i.e. pixel centers are
            returned.

        Returns
        -------
        coord_values: `collections.namedtuple`
            Real world coords labeled with their real world physical types
            for the axes requested by the user. Returned in the same order as
            axis_names.

        Example
        -------
        >>> NDCube.axis_world_coords_values(('lat', 'lon'))  # doctest: +SKIP
        >>> NDCube.axis_world_coords_values(2)  # doctest: +SKIP
        """
        # Create meshgrid of all pixel coordinates.
        # If the user wants edges, set pixel values to pixel edges.
        # Else use pixel centers.
        wcs_shape = self.data.shape[::-1]
        # Insert length-1 axes for missing axes.
        for i in np.arange(len(self.missing_axes))[self.missing_axes]:
            wcs_shape = np.insert(wcs_shape, i, 1)
        if edges:
            wcs_shape = tuple(np.array(wcs_shape) + 1)
            pixel_inputs = np.meshgrid(*[np.arange(i) - 0.5 for i in wcs_shape],
                                       indexing='ij', sparse=True)
        else:
            pixel_inputs = np.meshgrid(*[np.arange(i) for i in wcs_shape],
                                       indexing='ij', sparse=True)
        # Get world coords for all axes and all pixels.
        axes_coords = list(self.wcs.pixel_to_world_values(*pixel_inputs))
        # Reduce duplication across independent dimensions for each coord,
        # transpose to make dimensions mimic numpy array order rather than
        # WCS order, and add units to coords.
        for i, axis_coord in enumerate(axes_coords):
            slices = np.array([slice(None)] * self.wcs.world_n_dim)
            slices[np.invert(self.wcs.axis_correlation_matrix[i])] = 0
            axes_coords[i] = axis_coord[tuple(slices)].T
            axes_coords[i] *= u.Unit(self.wcs.world_axis_units[i])
        world_axis_physical_types = self.wcs.world_axis_physical_types
        # If the user has supplied axes, extract only the
        # world coords that correspond to those axes.
        if axes:
            # Convert input axes to WCS world axis indices.
            world_indices = set()
            for axis in axes:
                if isinstance(axis, numbers.Integral):
                    # If axis is int, it is a numpy order array axis.
                    # Convert to pixel axis in WCS order.
                    axis = wcs_utils.convert_between_array_and_pixel_axes(
                        np.array([axis]), self.wcs.pixel_n_dim)[0]
                    # Get WCS world axis indices that correspond to the WCS
                    # pixel axis and add to the list of indices of WCS world
                    # axes whose coords will be returned.
                    world_indices.update(wcs_utils.pixel_axis_to_world_axes(
                        axis, self.wcs.axis_correlation_matrix))
                elif isinstance(axis, str):
                    # If axis is str, it is a physical type or substring of a
                    # physical type.
                    world_indices.update({wcs_utils.physical_type_to_world_axis(
                        axis, world_axis_physical_types)})
                else:
                    raise TypeError(f"Unrecognized axis type: {axis, type(axis)}. "
                                    "Must be of type (numbers.Integral, str)")
            # Use inferred world axes to extract the desired coord values
            # and corresponding physical types.
            world_indices = np.array(list(world_indices), dtype=int)
            axes_coords = np.array(axes_coords, dtype=object)[world_indices]
            world_axis_physical_types = tuple(np.array(world_axis_physical_types)[world_indices])
        # Return in array order.
        # First replace characters in physical types forbidden for namedtuple identifiers.
        identifiers = []
        for physical_type in world_axis_physical_types[::-1]:
            identifier = physical_type.replace(":", "_")
            identifier = identifier.replace(".", "_")
            identifier = identifier.replace("-", "__")
            identifiers.append(identifier)
        CoordValues = namedtuple("CoordValues", identifiers)
        return CoordValues(*axes_coords[::-1])

    @property
    def extra_coords(self):
        """
        Dictionary of extra coords where each key is the name of an extra
        coordinate supplied by the user during instantiation of the NDCube.
        The value of each key is itself a dictionary with the following keys:
            | 'axis': `int`
            |     The number of the data axis to which the extra coordinate corresponds.
            | 'value': `astropy.units.Quantity` or array-like
            |     The value of the extra coordinate at each pixel/array element along the
            |     corresponding axis (given by the 'axis' key, above). Note this means
            |     that the length of 'value' must be equal to the length of the data axis
            |     to which it corresponds.
        """
        if not self._extra_coords_wcs_axis:
            result = None
        else:
            result = {}
            for key in list(self._extra_coords_wcs_axis.keys()):
                result[key] = {
                    "axis": utils.cube.wcs_axis_to_data_axis(
                        self._extra_coords_wcs_axis[key]["wcs axis"], self.missing_axes),
                    "value": self._extra_coords_wcs_axis[key]["value"]}
        return result

    def crop_by_coords(self, lower_corner, interval_widths=None, upper_corner=None,
                       units=None):
        # The docstring is defined in NDCubeABC.
        n_dim = self.data.ndim
        # Raise a ValueError if the arguments do not have the same dimensions.
        # Calculate upper_corner from the input interval_widths.
        # This part of the code will be removed in version 2.0.
        if interval_widths:
            warnings.warn(
                "interval_widths will be removed from the API in version 2.0"
                ", please use upper_corner argument.")
            if upper_corner:
                raise ValueError("Only one of interval_widths or upper_corner "
                                 "can be set. Recommend using upper_corner as "
                                 "interval_widths is deprecated.")
            if (len(lower_corner) != len(interval_widths)) or (len(lower_corner) != n_dim):
                raise ValueError("lower_corner and interval_widths must have "
                                 "same number of elements as number of data "
                                 "dimensions.")
            upper_corner = [lower_corner[i] + interval_widths[i] for i in range(n_dim)]
        # Raise a ValueError if the arguments do not have the same dimensions.
        if (len(lower_corner) != len(upper_corner)) or (len(lower_corner) != n_dim):
            raise ValueError("lower_corner and upper_corner must have same "
                             "number of elements as number of data dimensions.")
        if units:
            # Raise a ValueError if units do not match the data dimensions.
            if len(units) != n_dim:
                raise ValueError('units must have same number of elements as '
                                 'number of data dimensions.')
            # If inputs are not Quantity objects, convert them to the specified units.
            lower_corner = [u.Quantity(lower_corner[i], unit=units[i])
                            for i in range(self.data.ndim)]
            upper_corner = [u.Quantity(upper_corner[i], unit=units[i])
                            for i in range(self.data.ndim)]
        else:
            if any([not isinstance(x, u.Quantity) for x in lower_corner + upper_corner]):
                raise TypeError("lower_corner and interval_widths/upper_corner must be "
                                "of type astropy.units.Quantity or the units kwarg "
                                "must be set.")
        # Get all corners of the region of interest.
        all_world_corners_grid = np.meshgrid(
            *[u.Quantity([lower_corner[i], upper_corner[i]], unit=lower_corner[i].unit).value
              for i in range(self.data.ndim)])
        all_world_corners = [all_world_corners_grid[i].flatten() * lower_corner[i].unit
                             for i in range(n_dim)]
        # Convert to pixel coordinates.
        all_pix_corners = self.world_to_pixel(*all_world_corners)
        # Derive slicing item with which to slice NDCube.
        # Be sure to round down the min pixel and round up + 1 the max pixel.
        item = tuple([slice(int(np.clip(axis_pixels.value.min(), 0, None)),
                            int(np.ceil(axis_pixels.value.max())) + 1)
                      for axis_pixels in all_pix_corners])
        return self[item]
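    # A minimal usage sketch (hypothetical 2-D cube whose axes are latitude
    # and wavelength; values and units are illustrative only):
    #
    #     >>> cropped = cube.crop_by_coords(
    #     ...     [0.5 * u.deg, 10.2 * u.Angstrom],
    #     ...     upper_corner=[1.0 * u.deg, 10.6 * u.Angstrom])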
    def crop_by_extra_coord(self, coord_name, min_coord_value, max_coord_value):
        """
        Crops an NDCube given minimum and maximum values along an extra coord.

        Parameters
        ----------
        coord_name: `str`
            Name of the extra coordinate by which to crop.

        min_coord_value: single-value `astropy.units.Quantity`
            The minimum desired value of the extra coord after cropping.
            Unit must be consistent with the extra coord on which cropping
            is based.

        max_coord_value: single-value `astropy.units.Quantity`
            The maximum desired value of the extra coord after cropping.
            Unit must be consistent with the extra coord on which cropping
            is based.

        Returns
        -------
        result: `ndcube.NDCube`
        """
        if not isinstance(coord_name, str):
            raise TypeError("The API for this function has changed. "
                            "Please give coord_name, min_coord_value, max_coord_value")
        extra_coord_dict = self.extra_coords[coord_name]
        if isinstance(extra_coord_dict["value"], u.Quantity):
            extra_coord_values = extra_coord_dict["value"]
        else:
            extra_coord_values = np.asarray(extra_coord_dict["value"])
        w = np.logical_and(extra_coord_values >= min_coord_value,
                           extra_coord_values < max_coord_value)
        w = np.arange(len(extra_coord_values))[w]
        item = [slice(None)] * len(self.dimensions)
        item[extra_coord_dict["axis"]] = slice(w[0], w[-1] + 1)
        return self[tuple(item)]

    def __str__(self):
        return textwrap.dedent(f"""\
                NDCube
                ---------------------
                {{wcs}}
                ---------------------
                Length of NDCube: {self.dimensions}
                Axis Types of NDCube: {self.world_axis_physical_types}""").format(
            wcs=str(self.wcs))

    def __repr__(self):
        return f"{object.__repr__(self)}\n{str(self)}"

    def explode_along_axis(self, axis):
        """
        Separates slices of an NDCube along a given cube axis into an
        NDCubeSequence of (N-1)DCubes.

        Parameters
        ----------
        axis : `int`
            The axis along which the data is to be changed.

        Returns
        -------
        result : `ndcube_sequence.NDCubeSequence`
        """
        # If axis is negative, calculate it from the length of the dimensions
        # of one cube.
        if axis < 0:
            axis = len(self.dimensions) + axis
        # To store the resultant cubes.
        result_cubes = []
        # All slices are initially initialised as slice(None, None, None).
        cube_slices = [slice(None, None, None)] * self.data.ndim
        # Slice the cube along the given axis.
        for i in range(self.data.shape[axis]):
            # Set the slice value to the index so that the slices are done correctly.
            cube_slices[axis] = i
            # Set the metadata of sliced cubes to None.
            item = tuple(cube_slices)
            sliced_cube = self[item]
            sliced_cube.meta = None
            # Append the sliced cubes to the result_cubes list.
            result_cubes.append(sliced_cube)
        # Create a new NDCubeSequence from result_cubes with common axis set to axis.
        return NDCubeSequence(result_cubes, common_axis=axis, meta=self.meta)


class NDCube(NDCubeBase, NDCubePlotMixin, astropy.nddata.NDArithmeticMixin):
    pass


class NDCubeOrdered(NDCube):
    """
    Class representing N-dimensional cubes with oriented WCS. Extra arguments
    are passed on to NDData's init.

    The order is TIME, SPECTRAL, SOLAR-X, SOLAR-Y and any other dimension.
    For example, in an x, y, t cube the order would be (t, x, y) and in a
    lambda, t, y cube the order will be (t, lambda, y).

    Parameters
    ----------
    data: `numpy.ndarray`
        The array holding the actual data in this object.

    wcs: `ndcube.wcs.wcs.WCS`
        The WCS object containing the axes' information. The axes' priorities
        are time, spectral, celestial. This means that, if present, each of
        these axes takes precedence over the others.

    uncertainty : any type, optional
        Uncertainty in the dataset. Should have an attribute uncertainty_type
        that defines what kind of uncertainty is stored, for example "std"
        for standard deviation or "var" for variance. A metaclass defining
        such an interface is NDUncertainty - but isn't mandatory.
        If the uncertainty has no such attribute the uncertainty is stored as
        UnknownUncertainty. Defaults to None.

    mask : any type, optional
        Mask for the dataset. Masks should follow the numpy convention that
        valid data points are marked by False and invalid ones with True.
        Defaults to None.

    meta : dict-like object, optional
        Additional meta information about the dataset. If no meta is provided
        an empty collections.OrderedDict is created. Default is None.

    unit : Unit-like or str, optional
        Unit for the dataset. Strings that can be converted to a Unit are
        allowed. Default is None.

    copy : bool, optional
        Indicates whether to save the arguments as copy. True copies every
        attribute before saving it while False tries to save every parameter
        as reference. Note however that it is not always possible to save the
        input as reference. Default is False.
    """

    def __init__(self, data, wcs, uncertainty=None, mask=None, meta=None,
                 unit=None, extra_coords=None, copy=False, missing_axes=None,
                 **kwargs):
        axtypes = list(wcs.wcs.ctype)[::-1]
        array_order = utils.cube.select_order(axtypes)
        result_data = data.transpose(array_order)
        result_wcs = utils.wcs.reindex_wcs(wcs, np.array(array_order))
        if uncertainty is not None:
            result_uncertainty = uncertainty.transpose(array_order)
        else:
            result_uncertainty = None
        if mask is not None:
            result_mask = mask.transpose(array_order)
        else:
            result_mask = None
        # Reorder extra coords if needed.
        if extra_coords:
            reordered_extra_coords = []
            for coord in extra_coords:
                coord_list = list(coord)
                coord_list[1] = array_order[coord_list[1]]
                reordered_extra_coords.append(tuple(coord_list))
        else:
            # Guard against extra_coords being None so the call below does
            # not raise a NameError.
            reordered_extra_coords = None

        super().__init__(result_data, result_wcs, uncertainty=result_uncertainty,
                         mask=result_mask, meta=meta, unit=unit,
                         extra_coords=reordered_extra_coords, copy=copy,
                         missing_axes=missing_axes, **kwargs)


# ndcube-1.4.2/ndcube/ndcube_sequence.py

import copy
import numbers
import textwrap

import astropy.units as u
from astropy.utils.decorators import deprecated
import numpy as np

from ndcube.mixins.sequence_plotting import NDCubeSequencePlotMixin
from ndcube import utils

__all__ = ['NDCubeSequence']


class NDCubeSequenceBase:
    """
    Class representing a list of cubes.

    Parameters
    ----------
    data_list : `list`
        List of cubes.

    meta : `dict` or None
        The header of the NDCubeSequence.

    common_axis: `int` or None
        The data axis which is common between the NDCubeSequence and the
        Cubes within. For example, if the Cubes are sequenced in
        chronological order and time is the zeroth axis of each Cube, then
        common_axis should be set to 0. This enables the option for the
        NDCubeSequence to be indexed as though it is one single Cube.
    """

    def __init__(self, data_list, meta=None, common_axis=None, **kwargs):
        self.data = data_list
        self.meta = meta
        if common_axis is not None:
            self._common_axis = int(common_axis)
        else:
            self._common_axis = common_axis
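    # A minimal construction sketch (hypothetical cubes; assumes three NDCubes
    # whose zeroth axis is time):
    #
    #     >>> seq = NDCubeSequence([cube0, cube1, cube2], common_axis=0)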
    @property
    def dimensions(self):
        return self._dimensions

    @property
    def _dimensions(self):
        dimensions = [len(self.data) * u.pix] + list(self.data[0].dimensions)
        if len(dimensions) > 1:
            # If there is a common axis, the lengths of the cubes along it may
            # not all be the same. Therefore if the lengths differ, represent
            # them as a tuple of all the values, else as an int.
            if self._common_axis is not None:
                common_axis_lengths = [cube.data.shape[self._common_axis]
                                       for cube in self.data]
                if len(np.unique(common_axis_lengths)) != 1:
                    common_axis_dimensions = [cube.dimensions[self._common_axis]
                                              for cube in self.data]
                    dimensions[self._common_axis + 1] = u.Quantity(
                        common_axis_dimensions, unit=common_axis_dimensions[0].unit)
        return tuple(dimensions)

    @property
    @deprecated(since='1.4.1',
                message='NDCubeSequence.world_axis_physical_types will be removed in ndcube 2.0'
                        '. Use NDCubeSequence.array_axis_physical_types instead.')
    def world_axis_physical_types(self):
        return tuple(["meta.obs.sequence"] + list(self.data[0].world_axis_physical_types))

    @property
    def array_axis_physical_types(self):
        return [("meta.obs.sequence",)] + self.data[0].array_axis_physical_types

    @property
    def cube_like_dimensions(self):
        if not isinstance(self._common_axis, int):
            raise TypeError("Common axis must be set.")
        dimensions = list(self._dimensions)
        cube_like_dimensions = list(self._dimensions[1:])
        if dimensions[self._common_axis + 1].isscalar:
            cube_like_dimensions[self._common_axis] = u.Quantity(
                dimensions[0].value * dimensions[self._common_axis + 1].value, unit=u.pix)
        else:
            cube_like_dimensions[self._common_axis] = sum(dimensions[self._common_axis + 1])
        # Combine into single Quantity.
        cube_like_dimensions = u.Quantity(cube_like_dimensions, unit=u.pix)
        return cube_like_dimensions

    @property
    @deprecated(since='1.4.1',
                message='NDCubeSequence.cube_like_world_axis_physical_types will be removed '
                        'in ndcube 2.0. '
                        'Use NDCubeSequence.cube_like_array_axis_physical_types instead.')
    def cube_like_world_axis_physical_types(self):
        return self.data[0].world_axis_physical_types

    @property
    def cube_like_array_axis_physical_types(self):
        if self._common_axis is None:
            raise ValueError("Common axis must be set.")
        return self.data[0].array_axis_physical_types

    def __getitem__(self, item):
        if isinstance(item, numbers.Integral):
            return self.data[item]
        # Create an empty sequence in which to place the sliced cubes.
        result = type(self)([], meta=self.meta, common_axis=self._common_axis)
        if isinstance(item, slice):
            result.data = self.data[item]
        else:
            if isinstance(item[0], numbers.Integral):
                result = self.data[item[0]][item[1:]]
            else:
                result.data = [cube[item[1:]] for cube in self.data[item[0]]]
            # Determine common axis after slicing.
            if self._common_axis is not None:
                drop_cube_axes = [isinstance(i, numbers.Integral) for i in item[1:]]
                if (len(drop_cube_axes) > self._common_axis
                        and drop_cube_axes[self._common_axis] is True):
                    result._common_axis = None
                else:
                    result._common_axis = \
                        self._common_axis - sum(drop_cube_axes[:self._common_axis])
        return result

    @property
    def index_as_cube(self):
        """
        Slices the NDCubeSequence instance as though it were a single cube.

        Example
        -------
        >>> # Say we have three Cubes, each with common_axis=0 (time) and shape=(3,3,3).
        >>> data_list = [cubeA, cubeB, cubeC]  # doctest: +SKIP
        >>> cs = NDCubeSequence(data_list, meta=None, common_axis=0)  # doctest: +SKIP
        >>> # Return the zeroth time slice of cubeB via normal NDCubeSequence indexing.
        >>> cs[1, :, 0, :]  # doctest: +SKIP
        >>> # Return the same slice using this property.
        >>> cs.index_as_cube[3:6, 0, :]  # doctest: +SKIP
        """
        if self._common_axis is None:
            raise ValueError("common_axis cannot be None")
        return _IndexAsCubeSlicer(self)

    @property
    def common_axis_extra_coords(self):
        if not isinstance(self._common_axis, int):
            raise ValueError("Common axis is not set.")
        # Get names and units of coords along the common axis.
        axis_coord_names, axis_coord_units = utils.sequence._get_axis_extra_coord_names_and_units(
            self.data, self._common_axis)
        # Compile dictionary of common axis extra coords.
        if axis_coord_names is not None:
            return utils.sequence._get_int_axis_extra_coords(
                self.data, axis_coord_names, axis_coord_units, self._common_axis)
        else:
            return None

    @property
    def sequence_axis_extra_coords(self):
        sequence_coord_names, sequence_coord_units = \
            utils.sequence._get_axis_extra_coord_names_and_units(self.data, None)
        if sequence_coord_names is not None:
            # Define empty dictionary which will hold the extra coord
            # values not assigned a cube data axis.
            sequence_extra_coords = {}
            # Define list of Nones signifying the unit of each coord. It will
            # be filled in by the for loop below.
            sequence_coord_units = [None] * len(sequence_coord_names)
            # Iterate through cubes and populate values of each extra coord
            # not assigned a cube data axis.
            cube_extra_coords = [cube.extra_coords for cube in self.data]
            for i, coord_key in enumerate(sequence_coord_names):
                coord_values = np.array([None] * len(self.data), dtype=object)
                for j, cube in enumerate(self.data):
                    # Construct list of coord values from each cube for given extra coord.
                    try:
                        coord_values[j] = cube_extra_coords[j][coord_key]["value"]
                        # Determine whether the extra coord is a quantity by
                        # checking whether any one value has a unit. As we are
                        # not assuming that all cubes have the same extra
                        # coords along the sequence axis, we keep checking as
                        # we move through the cubes until all cubes are
                        # checked or we have found a unit.
                        if (isinstance(cube_extra_coords[j][coord_key]["value"], u.Quantity)
                                and not sequence_coord_units[i]):
                            sequence_coord_units[i] = \
                                cube_extra_coords[j][coord_key]["value"].unit
                    except KeyError:
                        pass
                # If the extra coord is normally a Quantity, replace all
                # None occurrences in the coord value array with a NaN, and
                # convert coord_values from an array of Quantities to a
                # single Quantity of length equal to the number of cubes in
                # the sequence.
                w_none = np.where(coord_values == None)[0]  # NOQA
                if sequence_coord_units[i]:
                    # This branch is coded in an apparently roundabout way,
                    # necessitated because you can't put a NaN quantity into
                    # an array and keep its unit.
                    w_not_none = np.where(coord_values != None)[0]  # NOQA
                    coord_values = u.Quantity(list(coord_values[w_not_none]),
                                              unit=sequence_coord_units[i])
                    coord_values = list(coord_values.value)
                    for index in w_none:
                        coord_values.insert(index, np.nan)
                    coord_values = u.Quantity(coord_values,
                                              unit=sequence_coord_units[i]).flatten()
                else:
                    coord_values[w_none] = np.nan
                sequence_extra_coords[coord_key] = coord_values
        else:
            sequence_extra_coords = None
        return sequence_extra_coords

    def explode_along_axis(self, axis):
        """
        Separates slices of the NDCubes in the sequence along a given cube
        axis into (N-1)DCubes.

        Parameters
        ----------
        axis : `int`
            The axis along which the data is to be changed.
        """
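        # For example (sketch): calling ``seq.explode_along_axis(0)`` on a
        # sequence of three cubes of shape (4, 5) would return a sequence of
        # twelve cubes of shape (5,).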
        # If a common axis is set, the axis to explode along must match it.
        if self._common_axis is not None:
            if self._common_axis != axis:
                raise ValueError("axis and common_axis should be equal.")
        # If axis is negative, calculate it from the length of the dimensions
        # of one cube.
        if axis < 0:
            axis = len(self.dimensions[1::]) + axis
        # To store the resultant cubes.
        result_cubes = []
        # All slices are initially initialised as slice(None, None, None).
        result_cubes_slice = [slice(None, None, None)] * len(self[0].data.shape)
        # The range of the axis that needs to be sliced.
        range_of_axis = self[0].data.shape[axis]
        for ndcube in self.data:
            for index in range(range_of_axis):
                # Set the slice value to the index so that the slices are done correctly.
                result_cubes_slice[axis] = index
                # Append the sliced cubes to the result_cubes list.
                result_cubes.append(ndcube.__getitem__(tuple(result_cubes_slice)))
        # Create a new sequence from result_cubes, keeping the meta.
        return self._new_instance(result_cubes, meta=self.meta)

    def __str__(self):
        return (textwrap.dedent(f"""\
                NDCubeSequence
                ---------------------
                Length of NDCubeSequence: {self.dimensions[0]}
                Shape of 1st NDCube: {self.dimensions[1::]}
                Axis Types of 1st NDCube: {self.world_axis_physical_types[1:]}"""))

    def __repr__(self):
        return f"{object.__repr__(self)}\n{str(self)}"

    @classmethod
    def _new_instance(cls, data_list, meta=None, common_axis=None):
        """
        Instantiate a new instance of this class using given data.
        """
        return cls(data_list, meta=meta, common_axis=common_axis)


class NDCubeSequence(NDCubeSequenceBase, NDCubeSequencePlotMixin):
    pass


"""
Cube Sequence Helpers
"""


class _IndexAsCubeSlicer:
    """
    Helper class to make slicing in index_as_cube sliceable/indexable like a
    numpy array.

    Parameters
    ----------
    seq : `ndcube.NDCubeSequence`
        Object of NDCubeSequence.
    """

    def __init__(self, seq):
        self.seq = seq

    def __getitem__(self, item):
        return utils.sequence._index_sequence_as_cube(self.seq, item)


# ndcube-1.4.2/ndcube/tests/__init__.py

# Licensed under a 3-clause BSD style license - see LICENSE.rst
"""
This module contains package tests.
"""

# ndcube-1.4.2/ndcube/tests/helpers.py

"""
Helpers for testing ndcube.
""" import unittest import numpy as np from ndcube import NDCube, NDCubeSequence, utils __all__ = ['assert_extra_coords_equal', 'assert_metas_equal', 'assert_cubes_equal', 'assert_cubesequences_equal', 'assert_wcs_are_equal'] def assert_extra_coords_equal(test_input, extra_coords): assert test_input.keys() == extra_coords.keys() for key in list(test_input.keys()): assert test_input[key]['axis'] == extra_coords[key]['axis'] assert (test_input[key]['value'] == extra_coords[key]['value']).all() def assert_metas_equal(test_input, expected_output): if not (test_input is None and expected_output is None): assert test_input.keys() == expected_output.keys() for key in list(test_input.keys()): assert test_input[key] == expected_output[key] def assert_cubes_equal(test_input, expected_cube): unit_tester = unittest.TestCase() assert isinstance(test_input, type(expected_cube)) assert np.all(test_input.mask == expected_cube.mask) assert_wcs_are_equal(test_input.wcs, expected_cube.wcs) assert test_input.missing_axes == expected_cube.missing_axes if type(test_input.uncertainty) is not type(expected_cube.uncertainty): # noqa raise AssertionError("NDCube uncertainties not of same type: {0} != {1}".format( type(test_input.uncertainty), type(expected_cube.uncertainty))) if test_input.uncertainty is not None: assert test_input.uncertainty.array.shape == expected_cube.uncertainty.array.shape assert test_input.wcs.world_axis_physical_types == expected_cube.wcs.world_axis_physical_types assert all(test_input.dimensions.value == expected_cube.dimensions.value) assert test_input.dimensions.unit == expected_cube.dimensions.unit if type(test_input.extra_coords) is not type(expected_cube.extra_coords): # noqa raise AssertionError("NDCube extra_coords not of same type: {0} != {1}".format( type(test_input.extra_coords), type(expected_cube.extra_coords))) if test_input.extra_coords is not None: assert_extra_coords_equal(test_input.extra_coords, expected_cube.extra_coords) def assert_cubesequences_equal(test_input, expected_sequence): assert isinstance(test_input, type(expected_sequence)) assert_metas_equal(test_input.meta, expected_sequence.meta) assert test_input._common_axis == expected_sequence._common_axis for i, cube in enumerate(test_input.data): assert_cubes_equal(cube, expected_sequence.data[i]) def assert_wcs_are_equal(wcs1, wcs2): """ Assert function for testing two wcs object. Used in testing NDCube. """ assert list(wcs1.wcs.ctype) == list(wcs2.wcs.ctype) assert list(wcs1.wcs.crval) == list(wcs2.wcs.crval) assert list(wcs1.wcs.crpix) == list(wcs2.wcs.crpix) assert list(wcs1.wcs.cdelt) == list(wcs2.wcs.cdelt) assert list(wcs1.wcs.cunit) == list(wcs2.wcs.cunit) assert wcs1.wcs.naxis == wcs2.wcs.naxis def assert_collections_equal(collection1, collection2): assert collection1.keys() == collection2.keys() assert collection1.aligned_axes == collection2.aligned_axes for cube1, cube2 in zip(collection1.values(), collection2.values()): # Check cubes are same type. 
assert type(cube1) is type(cube2) if isinstance(cube1, NDCube): assert_cubes_equal(cube1, cube2) elif isinstance(cube1, NDCubeSequence): assert_cubesequences_equal(cube1, cube2) else: raise TypeError("Unsupported Type in NDCollection: {0}".format(type(cube1))) ././@PaxHeader0000000000000000000000000000003300000000000011451 xustar000000000000000027 mtime=1605787057.150476 ndcube-1.4.2/ndcube/tests/test_ndcollection.py0000644000175100001640000001416500000000000021745 0ustar00vstsdocker00000000000000import copy import pytest import numpy as np import astropy.wcs import astropy.units as u from ndcube import NDCube, NDCubeSequence, NDCollection from ndcube.tests import helpers # Define some mock data data0 = np.ones((3, 4, 5)) data1 = np.zeros((5, 3, 4)) data2 = data0 * 2 # Define WCS object for all cubes. wcs_input_dict = { 'CTYPE1': 'WAVE ', 'CUNIT1': 'Angstrom', 'CDELT1': 0.2, 'CRPIX1': 0, 'CRVAL1': 10, 'NAXIS1': 5, 'CTYPE2': 'HPLT-TAN', 'CUNIT2': 'deg', 'CDELT2': 0.5, 'CRPIX2': 2, 'CRVAL2': 0.5, 'NAXIS2': 4, 'CTYPE3': 'HPLN-TAN', 'CUNIT3': 'deg', 'CDELT3': 0.4, 'CRPIX3': 2, 'CRVAL3': 1, 'NAXIS3': 3} input_wcs = astropy.wcs.WCS(wcs_input_dict) wcs_input_dict1 = { 'CTYPE3': 'WAVE ', 'CUNIT3': 'Angstrom', 'CDELT3': 0.2, 'CRPIX3': 0, 'CRVAL3': 10, 'NAXIS3': 5, 'CTYPE1': 'HPLT-TAN', 'CUNIT1': 'deg', 'CDELT1': 0.5, 'CRPIX1': 2, 'CRVAL1': 0.5, 'NAXIS1': 4, 'CTYPE2': 'HPLN-TAN', 'CUNIT2': 'deg', 'CDELT2': 0.4, 'CRPIX2': 2, 'CRVAL2': 1, 'NAXIS2': 3} input_wcs1 = astropy.wcs.WCS(wcs_input_dict1) # Define cubes. cube0 = NDCube(data0, input_wcs) cube1 = NDCube(data1, input_wcs1) cube2 = NDCube(data2, input_wcs) # Define sequences. sequence02 = NDCubeSequence([cube0, cube2]) sequence20 = NDCubeSequence([cube2, cube0]) # Define collections aligned_axes = ((1, 2), (2, 0), (1, 2)) keys = ("cube0", "cube1", "cube2") cube_collection = NDCollection([("cube0", cube0), ("cube1", cube1), ("cube2", cube2)], aligned_axes) seq_collection = NDCollection([("seq0", sequence02), ("seq1", sequence20)], aligned_axes="all") @pytest.mark.parametrize("item,collection,expected", [ (0, cube_collection, NDCollection([("cube0", cube0[:, 0]), ("cube1", cube1[:, :, 0]), ("cube2", cube2[:, 0])], aligned_axes=((1,), (0,), (1,)))), (slice(1, 3), cube_collection, NDCollection( [("cube0", cube0[:, 1:3]), ("cube1", cube1[:, :, 1:3]), ("cube2", cube2[:, 1:3])], aligned_axes=aligned_axes)), (slice(-3, -1), cube_collection, NDCollection( [("cube0", cube0[:, -3:-1]), ("cube1", cube1[:, :, -3:-1]), ("cube2", cube2[:, -3:-1])], aligned_axes=aligned_axes)), ((slice(None), slice(1, 2)), cube_collection, NDCollection( [("cube0", cube0[:, :, 1:2]), ("cube1", cube1[1:2]), ("cube2", cube2[:, :, 1:2])], aligned_axes=aligned_axes)), ((slice(2, 4), slice(-3, -1)), cube_collection, NDCollection( [("cube0", cube0[:, 2:4, -3:-1]), ("cube1", cube1[-3:-1, :, 2:4]), ("cube2", cube2[:, 2:4, -3:-1])], aligned_axes=aligned_axes)), ((0, 0), cube_collection, NDCollection( [("cube0", cube0[:, 0, 0]), ("cube1", cube1[0, :, 0]), ("cube2", cube2[:, 0, 0])], aligned_axes=None)), (("cube0", "cube2"), cube_collection, NDCollection( [("cube0", cube0), ("cube2", cube2)], aligned_axes=(aligned_axes[0], aligned_axes[2]))), (0, seq_collection, NDCollection([("seq0", sequence02[0]), ("seq1", sequence20[0])], aligned_axes=((0, 1, 2), (0, 1, 2)))), ((slice(None), 1, slice(1, 3)), seq_collection, NDCollection([("seq0", sequence02[:, 1, 1:3]), ("seq1", sequence20[:, 1, 1:3])], aligned_axes=((0, 1, 2), (0, 1, 2)))) ]) def test_collection_slicing(item, collection, expected): 
helpers.assert_collections_equal(collection[item], expected) @pytest.mark.parametrize("item,collection,expected", [("cube1", cube_collection, cube1)]) def test_slice_cube_from_collection(item, collection, expected): helpers.assert_cubes_equal(collection[item], expected) def test_collection_copy(): helpers.assert_collections_equal(cube_collection.copy(), cube_collection) @pytest.mark.parametrize("collection,popped_key,expected_popped,expected_collection", [ (cube_collection, "cube0", cube0, NDCollection([("cube1", cube1), ("cube2", cube2)], aligned_axes=aligned_axes[1:]))]) def test_collection_pop(collection, popped_key, expected_popped, expected_collection): popped_collection = collection.copy() output = popped_collection.pop(popped_key) helpers.assert_cubes_equal(output, expected_popped) helpers.assert_collections_equal(popped_collection, expected_collection) @pytest.mark.parametrize("collection,key,expected", [ (cube_collection, "cube0", NDCollection([("cube1", cube1), ("cube2", cube2)], aligned_axes=aligned_axes[1:]))]) def test_del_collection(collection, key, expected): del_collection = collection.copy() del del_collection[key] helpers.assert_collections_equal(del_collection, expected) @pytest.mark.parametrize("collection,key,data,aligned_axes,expected", [ (cube_collection, "cube1", cube2, aligned_axes[2], NDCollection( [("cube0", cube0), ("cube1", cube2), ("cube2", cube2)], aligned_axes=((1, 2), (1, 2), (1, 2)))), (cube_collection, "cube3", cube2, aligned_axes[2], NDCollection( [("cube0", cube0), ("cube1", cube1), ("cube2", cube2), ("cube3", cube2)], aligned_axes=((1, 2), (2, 0), (1, 2), (1, 2))))]) def test_collection_update_key_data_pair_input(collection, key, data, aligned_axes, expected): updated_collection = collection.copy() updated_collection.update([(key, data)], aligned_axes) helpers.assert_collections_equal(updated_collection, expected) def test_collection_update_collecton_input(): orig_collection = NDCollection([("cube0", cube0), ("cube1", cube1)], aligned_axes[:2]) cube1_alt = NDCube(data1*2, input_wcs1) new_collection = NDCollection([("cube1", cube1_alt), ("cube2", cube2)], aligned_axes[1:]) orig_collection.update(new_collection) expected = NDCollection([("cube0", cube0), ("cube1", cube1_alt), ("cube2", cube2)], aligned_axes) helpers.assert_collections_equal(orig_collection, expected) @pytest.mark.parametrize("collection, expected_aligned_dimensions", [ (cube_collection, [4, 5]*u.pix), (seq_collection, np.array([2*u.pix, 3*u.pix, 4*u.pix, 5*u.pix], dtype=object))]) def test_aligned_dimensions(collection, expected_aligned_dimensions): assert all(collection.aligned_dimensions == expected_aligned_dimensions) ././@PaxHeader0000000000000000000000000000003400000000000011452 xustar000000000000000028 mtime=1605787057.1544762 ndcube-1.4.2/ndcube/tests/test_ndcube.py0000644000175100001640000012405000000000000020523 0ustar00vstsdocker00000000000000""" Tests for NDCube. """ from collections import OrderedDict import datetime import pytest import numpy as np import astropy.units as u from ndcube import NDCube, NDCubeOrdered from ndcube.utils.wcs import WCS, _wcs_slicer from ndcube.tests import helpers from ndcube.ndcube_sequence import NDCubeSequence # sample data for tests # TODO: use a fixture reading from a test file. file TBD. 
ht = {'CTYPE3': 'HPLT-TAN', 'CUNIT3': 'deg', 'CDELT3': 0.5, 'CRPIX3': 0, 'CRVAL3': 0, 'NAXIS3': 2, 'CTYPE2': 'WAVE ', 'CUNIT2': 'Angstrom', 'CDELT2': 0.2, 'CRPIX2': 0, 'CRVAL2': 0, 'NAXIS2': 3, 'CTYPE1': 'TIME ', 'CUNIT1': 'min', 'CDELT1': 0.4, 'CRPIX1': 0, 'CRVAL1': 0, 'NAXIS1': 4} wt = WCS(header=ht, naxis=3) data = np.array([[[1, 2, 3, 4], [2, 4, 5, 3], [0, -1, 2, 3]], [[2, 4, 5, 1], [10, 5, 2, 2], [10, 3, 3, 0]]]) hm = {'CTYPE1': 'WAVE ', 'CUNIT1': 'Angstrom', 'CDELT1': 0.2, 'CRPIX1': 0, 'CRVAL1': 10, 'NAXIS1': 4, 'CTYPE2': 'HPLT-TAN', 'CUNIT2': 'deg', 'CDELT2': 0.5, 'CRPIX2': 2, 'CRVAL2': 0.5, 'NAXIS2': 3, 'CTYPE3': 'HPLN-TAN', 'CUNIT3': 'deg', 'CDELT3': 0.4, 'CRPIX3': 2, 'CRVAL3': 1, 'NAXIS3': 2} wm = WCS(header=hm, naxis=3) h_disordered = { 'CTYPE1': 'TIME ', 'CUNIT1': 'min', 'CDELT1': 0.4, 'CRPIX1': 0, 'CRVAL1': 0, 'NAXIS1': 2, 'CTYPE2': 'WAVE ', 'CUNIT2': 'Angstrom', 'CDELT2': 0.2, 'CRPIX2': 0, 'CRVAL2': 10, 'NAXIS2': 4, 'CTYPE3': 'HPLT-TAN', 'CUNIT3': 'deg', 'CDELT3': 0.5, 'CRPIX3': 2, 'CRVAL3': 0.5, 'NAXIS3': 3, 'CTYPE4': 'HPLN-TAN', 'CUNIT4': 'deg', 'CDELT4': 0.4, 'CRPIX4': 2, 'CRVAL4': 1, 'NAXIS4': 2} w_disordered = WCS(header=h_disordered, naxis=4) data_disordered = np.zeros((2, 3, 4, 2)) data_disordered[:, :, :, 0] = data data_disordered[:, :, :, 1] = data h_ordered = { 'CTYPE1': 'HPLN-TAN', 'CUNIT1': 'deg', 'CDELT1': 0.4, 'CRPIX1': 2, 'CRVAL1': 1, 'NAXIS1': 2, 'CTYPE2': 'HPLT-TAN', 'CUNIT2': 'deg', 'CDELT2': 0.5, 'CRPIX2': 2, 'CRVAL2': 0.5, 'NAXIS2': 3, 'CTYPE3': 'WAVE ', 'CUNIT3': 'Angstrom', 'CDELT3': 0.2, 'CRPIX3': 0, 'CRVAL3': 10, 'NAXIS3': 4, 'CTYPE4': 'TIME ', 'CUNIT4': 'min', 'CDELT4': 0.4, 'CRPIX4': 0, 'CRVAL4': 0, 'NAXIS4': 2} w_ordered = WCS(header=h_ordered, naxis=4) data_ordered = np.zeros((2, 4, 3, 2)) data_ordered[0] = data.transpose() data_ordered[1] = data.transpose() h_rotated = {'CTYPE1': 'HPLN-TAN', 'CUNIT1': 'arcsec', 'CDELT1': 0.4, 'CRPIX1': 0, 'CRVAL1': 0, 'NAXIS1': 5, 'CTYPE2': 'HPLT-TAN', 'CUNIT2': 'arcsec', 'CDELT2': 0.5, 'CRPIX2': 0, 'CRVAL2': 0, 'NAXIS2': 5, 'CTYPE3': 'Time ', 'CUNIT3': 'seconds', 'CDELT3': 0.3, 'CRPIX3': 0, 'CRVAL3': 0, 'NAXIS3': 2, 'PC1_1': 0.714963912964, 'PC1_2': -0.699137151241, 'PC1_3': 0.0, 'PC2_1': 0.699137151241, 'PC2_2': 0.714963912964, 'PC2_3': 0.0, 'PC3_1': 0.0, 'PC3_2': 0.0, 'PC3_3': 1.0} w_rotated = WCS(header=h_rotated, naxis=3) data_rotated = np.array([[[1, 2, 3, 4, 6], [2, 4, 5, 3, 1], [0, -1, 2, 4, 2], [3, 5, 1, 2, 0]], [[2, 4, 5, 1, 3], [1, 5, 2, 2, 4], [2, 3, 4, 0, 5], [0, 1, 2, 3, 4]]]) mask_cubem = data > 0 mask_cube = data >= 0 uncertaintym = data uncertainty = np.sqrt(data) mask_disordered = data_disordered > 0 uncertainty_disordered = data_disordered mask_ordered = data_ordered > 0 uncertainty_ordered = data_ordered cubem = NDCube( data, wm, mask=mask_cubem, uncertainty=uncertaintym, extra_coords=[('time', 0, u.Quantity(range(data.shape[0]), unit=u.pix)), ('hello', 1, u.Quantity(range(data.shape[1]), unit=u.pix)), ('bye', 2, u.Quantity(range(data.shape[2]), unit=u.pix))]) cube_disordered_inputs = ( data_disordered, w_disordered, mask_disordered, uncertainty_disordered, [('spam', 0, u.Quantity(range(data_disordered.shape[0]), unit=u.pix)), ('hello', 1, u.Quantity(range(data_disordered.shape[1]), unit=u.pix)), ('bye', 2, u.Quantity(range(data_disordered.shape[2]), unit=u.pix))]) cube_disordered = NDCube(cube_disordered_inputs[0], cube_disordered_inputs[1], mask=cube_disordered_inputs[2], uncertainty=cube_disordered_inputs[3], extra_coords=cube_disordered_inputs[4]) cube_ordered = NDCubeOrdered( data_ordered, 
w_ordered, mask=mask_ordered, uncertainty=uncertainty_ordered, extra_coords=[('spam', 3, u.Quantity(range(data_disordered.shape[0]), unit=u.pix)), ('hello', 2, u.Quantity(range(data_disordered.shape[1]), unit=u.pix)), ('bye', 1, u.Quantity(range(data_disordered.shape[2]), unit=u.pix))]) cube = NDCube( data, wt, mask=mask_cube, uncertainty=uncertainty, missing_axes=[False, False, False, True], extra_coords=[('time', 0, u.Quantity(range(data.shape[0]), unit=u.pix)), ('hello', 1, u.Quantity(range(data.shape[1]), unit=u.pix)), ('bye', 2, u.Quantity(range(data.shape[2]), unit=u.pix))]) cubet = NDCube( data, wm, mask=mask_cubem, uncertainty=uncertaintym, extra_coords=[('time', 0, np.array([datetime.datetime(2000, 1, 1) + datetime.timedelta(minutes=i) for i in range(data.shape[0])])), ('hello', 1, u.Quantity(range(data.shape[1]), unit=u.pix)), ('bye', 2, u.Quantity(range(data.shape[2]), unit=u.pix))]) cube_rotated = NDCube( data_rotated, w_rotated, mask=mask_cube, uncertainty=uncertainty, missing_axes=[False, False, False], extra_coords=[('time', 0, u.Quantity(range(data_rotated.shape[0]), unit=u.pix)), ('hello', 1, u.Quantity(range(data_rotated.shape[1]), unit=u.pix)), ('bye', 2, u.Quantity(range(data_rotated.shape[2]), unit=u.pix))]) @pytest.mark.parametrize( "test_input,expected,mask,wcs,uncertainty,dimensions,world_axis_physical_types,extra_coords", [(cubem[:, 1], NDCube, mask_cubem[:, 1], _wcs_slicer(wm, [False, False, False], (slice(None, None, None), 1)), data[:, 1], u.Quantity((2, 4), unit=u.pix), ('custom:pos.helioprojective.lon', 'em.wl'), {'bye': {'axis': 1, 'value': u.Quantity(range(int(cubem.dimensions[2].value)), unit=u.pix)}, 'hello': {'axis': None, 'value': u.Quantity(1, unit=u.pix)}, 'time': {'axis': 0, 'value': u.Quantity(range(int(cubem.dimensions[0].value)), unit=u.pix)}} ), (cubem[:, 0:2], NDCube, mask_cubem[:, 0:2], _wcs_slicer(wm, [False, False, False], (slice(None, None, None), slice(0, 2, None))), data[:, 0:2], u.Quantity((2, 2, 4), unit=u.pix), ('custom:pos.helioprojective.lon', 'custom:pos.helioprojective.lat', 'em.wl'), {'bye': {'axis': 2, 'value': u.Quantity(range(int(cubem.dimensions[2].value)), unit=u.pix)}, 'hello': {'axis': 1, 'value': u.Quantity(range(2), unit=u.pix)}, 'time': {'axis': 0, 'value': u.Quantity(range(int(cubem.dimensions[0].value)), unit=u.pix)}} ), (cubem[:, :], NDCube, mask_cubem[:, :], _wcs_slicer(wm, [False, False, False], (slice(None, None, None), slice(None, None, None))), data[:, :], u.Quantity((2, 3, 4), unit=u.pix), ('custom:pos.helioprojective.lon', 'custom:pos.helioprojective.lat', 'em.wl'), {'time': {'axis': 0, 'value': u.Quantity(range(int(cubem.dimensions[0].value)), unit=u.pix)}, 'hello': {'axis': 1, 'value': u.Quantity(range(int(cubem.dimensions[1].value)), unit=u.pix)}, 'bye': {'axis': 2, 'value': u.Quantity(range(int(cubem.dimensions[2].value)), unit=u.pix)}} ), (cubem[1, 1], NDCube, mask_cubem[1, 1], _wcs_slicer(wm, [False, False, False], (1, 1)), data[1, 1], u.Quantity((4, ), unit=u.pix), tuple(['em.wl']), {'time': {'axis': None, 'value': u.Quantity(1, unit=u.pix)}, 'hello': {'axis': None, 'value': u.Quantity(1, unit=u.pix)}, 'bye': {'axis': 0, 'value': u.Quantity(range(int(cubem.dimensions[2].value)), unit=u.pix)}} ), (cubem[1, 0:2], NDCube, mask_cubem[1, 0:2], _wcs_slicer(wm, [False, False, False], (1, slice(0, 2, None))), data[1, 0:2], u.Quantity((2, 4), unit=u.pix), ('custom:pos.helioprojective.lat', 'em.wl'), {'time': {'axis': None, 'value': u.Quantity(1, unit=u.pix)}, 'hello': {'axis': 0, 'value': u.Quantity(range(2), 
unit=u.pix)}, 'bye': {'axis': 1, 'value': u.Quantity(range(int(cubem.dimensions[2].value)), unit=u.pix)}} ), (cubem[1, :], NDCube, mask_cubem[1, :], _wcs_slicer(wm, [False, False, False], (1, slice(None, None, None))), data[1, :], u.Quantity((3, 4), unit=u.pix), ('custom:pos.helioprojective.lat', 'em.wl'), {'time': {'axis': None, 'value': u.Quantity(1, unit=u.pix)}, 'hello': {'axis': 0, 'value': u.Quantity(range(int(cubem.dimensions[1].value)), unit=u.pix)}, 'bye': {'axis': 1, 'value': u.Quantity(range(int(cubem.dimensions[2].value)), unit=u.pix)}} ), (cube[:, 1], NDCube, mask_cube[:, 1], _wcs_slicer(wt, [True, False, False, False], (slice(None, None, None), 1)), uncertainty[:, 1], u.Quantity((2, 4), unit=u.pix), ('custom:pos.helioprojective.lat', 'time'), {'time': {'axis': 0, 'value': u.Quantity(range(int(cube.dimensions[0].value)), unit=u.pix)}, 'hello': {'axis': None, 'value': u.Quantity(1, unit=u.pix)}, 'bye': {'axis': 1, 'value': u.Quantity(range(int(cube.dimensions[2].value)), unit=u.pix)}} ), (cube[:, 0:2], NDCube, mask_cube[:, 0:2], _wcs_slicer(wt, [True, False, False, False], (slice(None, None, None), slice(0, 2, None))), uncertainty[:, 0:2], u.Quantity((2, 2, 4), unit=u.pix), ('custom:pos.helioprojective.lat', 'em.wl', 'time'), {'time': {'axis': 0, 'value': u.Quantity(range(int(cube.dimensions[0].value)), unit=u.pix)}, 'hello': {'axis': 1, 'value': u.Quantity(range(2), unit=u.pix)}, 'bye': {'axis': 2, 'value': u.Quantity(range(int(cube.dimensions[2].value)), unit=u.pix)}} ), (cube[:, :], NDCube, mask_cube[:, :], _wcs_slicer(wt, [True, False, False, False], (slice(None, None, None), slice(None, None, None))), uncertainty[:, :], u.Quantity((2, 3, 4), unit=u.pix), ('custom:pos.helioprojective.lat', 'em.wl', 'time'), {'time': {'axis': 0, 'value': u.Quantity(range(int(cube.dimensions[0].value)), unit=u.pix)}, 'hello': {'axis': 1, 'value': u.Quantity(range(int(cube.dimensions[1].value)), unit=u.pix)}, 'bye': {'axis': 2, 'value': u.Quantity(range(int(cube.dimensions[2].value)), unit=u.pix)}} ), (cube[1, 1], NDCube, mask_cube[1, 1], _wcs_slicer(wt, [True, False, False, False], (1, 1)), uncertainty[1, 1], u.Quantity((4, ), unit=u.pix), tuple(['time']), {'time': {'axis': None, 'value': u.Quantity(1, unit=u.pix)}, 'hello': {'axis': None, 'value': u.Quantity(1, unit=u.pix)}, 'bye': {'axis': 0, 'value': u.Quantity(range(int(cube.dimensions[2].value)), unit=u.pix)}} ), (cube[1, 0:2], NDCube, mask_cube[1, 0:2], _wcs_slicer(wt, [True, False, False, False], (1, slice(0, 2, None))), uncertainty[1, 0:2], u.Quantity((2, 4), unit=u.pix), ('em.wl', 'time'), {'time': {'axis': None, 'value': u.Quantity(1, unit=u.pix)}, 'hello': {'axis': 0, 'value': u.Quantity(range(2), unit=u.pix)}, 'bye': {'axis': 1, 'value': u.Quantity(range(int(cube.dimensions[2].value)), unit=u.pix)}} ), (cube[1, :], NDCube, mask_cube[1, :], _wcs_slicer(wt, [True, False, False, False], (1, slice(0, 2, None))), uncertainty[1, :], u.Quantity((3, 4), unit=u.pix), ('em.wl', 'time'), {'time': {'axis': None, 'value': u.Quantity(1, unit=u.pix)}, 'hello': {'axis': 0, 'value': u.Quantity(range(int(cube.dimensions[1].value)), unit=u.pix)}, 'bye': {'axis': 1, 'value': u.Quantity(range(int(cube.dimensions[2].value)), unit=u.pix)}} )]) def test_slicing_second_axis(test_input, expected, mask, wcs, uncertainty, dimensions, world_axis_physical_types, extra_coords): assert isinstance(test_input, expected) assert np.all(test_input.mask == mask) helpers.assert_wcs_are_equal(test_input.wcs, wcs[0]) assert test_input.missing_axes == wcs[1] assert 
test_input.uncertainty.array.shape == uncertainty.shape assert np.all(test_input.dimensions.value == dimensions.value) assert test_input.dimensions.unit == dimensions.unit assert test_input.world_axis_physical_types == world_axis_physical_types helpers.assert_extra_coords_equal(test_input.extra_coords, extra_coords) @pytest.mark.parametrize( "test_input,expected,mask,wcs,uncertainty,dimensions,world_axis_physical_types,extra_coords", [(cubem[1], NDCube, mask_cubem[1], _wcs_slicer(wm, [False, False, False], 1), data[1], u.Quantity((3, 4), unit=u.pix), ('custom:pos.helioprojective.lat', 'em.wl'), {'time': {'axis': None, 'value': u.Quantity(1, unit=u.pix)}, 'hello': {'axis': 0, 'value': u.Quantity(range(int(cubem.dimensions[1].value)), unit=u.pix)}, 'bye': {'axis': 1, 'value': u.Quantity(range(int(cubem.dimensions[2].value)), unit=u.pix)}} ), (cubem[0:2], NDCube, mask_cubem[0:2], _wcs_slicer(wm, [False, False, False], slice(0, 2, None)), data[0:2], u.Quantity((2, 3, 4), unit=u.pix), ('custom:pos.helioprojective.lon', 'custom:pos.helioprojective.lat', 'em.wl'), {'time': {'axis': 0, 'value': u.Quantity(range(2), unit=u.pix)}, 'hello': {'axis': 1, 'value': u.Quantity(range(int(cubem.dimensions[1].value)), unit=u.pix)}, 'bye': {'axis': 2, 'value': u.Quantity(range(int(cubem.dimensions[2].value)), unit=u.pix)}} ), (cubem[:], NDCube, mask_cubem[:], _wcs_slicer(wm, [False, False, False], slice(None, None, None)), data[:], u.Quantity((2, 3, 4), unit=u.pix), ('custom:pos.helioprojective.lon', 'custom:pos.helioprojective.lat', 'em.wl'), {'time': {'axis': 0, 'value': u.Quantity(range(int(cubem.dimensions[0].value)), unit=u.pix)}, 'hello': {'axis': 1, 'value': u.Quantity(range(int(cubem.dimensions[1].value)), unit=u.pix)}, 'bye': {'axis': 2, 'value': u.Quantity(range(int(cubem.dimensions[2].value)), unit=u.pix)}} ), (cube[1], NDCube, mask_cube[1], _wcs_slicer(wt, [True, False, False, False], 1), uncertainty[1], u.Quantity((3, 4), unit=u.pix), ('em.wl', 'time'), {'time': {'axis': None, 'value': u.Quantity(1, unit=u.pix)}, 'hello': {'axis': 0, 'value': u.Quantity(range(int(cube.dimensions[1].value)), unit=u.pix)}, 'bye': {'axis': 1, 'value': u.Quantity(range(int(cube.dimensions[2].value)), unit=u.pix)}} ), (cube[0:2], NDCube, mask_cube[0:2], _wcs_slicer(wt, [True, False, False, False], slice(0, 2, None)), uncertainty[0:2], u.Quantity((2, 3, 4), unit=u.pix), ('custom:pos.helioprojective.lat', 'em.wl', 'time'), {'time': {'axis': 0, 'value': u.Quantity(range(2), unit=u.pix)}, 'hello': {'axis': 1, 'value': u.Quantity(range(int(cube.dimensions[1].value)), unit=u.pix)}, 'bye': {'axis': 2, 'value': u.Quantity(range(int(cube.dimensions[2].value)), unit=u.pix)}} ), (cube[:], NDCube, mask_cube[:], _wcs_slicer(wt, [True, False, False, False], slice(None, None, None)), uncertainty[:], u.Quantity((2, 3, 4), unit=u.pix), ('custom:pos.helioprojective.lat', 'em.wl', 'time'), {'time': {'axis': 0, 'value': u.Quantity(range(int(cube.dimensions[0].value)), unit=u.pix)}, 'hello': {'axis': 1, 'value': u.Quantity(range(int(cube.dimensions[1].value)), unit=u.pix)}, 'bye': {'axis': 2, 'value': u.Quantity(range(int(cube.dimensions[2].value)), unit=u.pix)}} )]) def test_slicing_first_axis(test_input, expected, mask, wcs, uncertainty, dimensions, world_axis_physical_types, extra_coords): assert isinstance(test_input, expected) assert np.all(test_input.mask == mask) helpers.assert_wcs_are_equal(test_input.wcs, wcs[0]) assert test_input.missing_axes == wcs[1] assert test_input.uncertainty.array.shape == uncertainty.shape assert 
np.all(test_input.dimensions.value == dimensions.value) assert test_input.dimensions.unit == dimensions.unit assert test_input.world_axis_physical_types == world_axis_physical_types helpers.assert_extra_coords_equal(test_input.extra_coords, extra_coords) @pytest.mark.parametrize( "test_input,expected,mask,wcs,uncertainty,dimensions,world_axis_physical_types,extra_coords", [(cubem[:, :, 1], NDCube, mask_cubem[:, :, 1], _wcs_slicer(wm, [False, False, False], (slice(None, None, None), slice(None, None, None), 1)), data[:, :, 1], u.Quantity((2, 3), unit=u.pix), ('custom:pos.helioprojective.lon', 'custom:pos.helioprojective.lat'), {'time': {'axis': 0, 'value': u.Quantity(range(int(cubem.dimensions[0].value)), unit=u.pix)}, 'hello': {'axis': 1, 'value': u.Quantity(range(int(cube.dimensions[1].value)), unit=u.pix)}, 'bye': {'axis': None, 'value': u.Quantity(1, unit=u.pix)}} ), (cubem[:, :, 0:2], NDCube, mask_cubem[:, :, 0:2], _wcs_slicer(wm, [False, False, False], (slice(None, None, None), slice(None, None, None), slice(0, 2, None))), data[:, :, 0:2], u.Quantity((2, 3, 2), unit=u.pix), ('custom:pos.helioprojective.lon', 'custom:pos.helioprojective.lat', 'em.wl'), {'time': {'axis': 0, 'value': u.Quantity(range(int(cubem.dimensions[0].value)), unit=u.pix)}, 'hello': {'axis': 1, 'value': u.Quantity(range(int(cubem.dimensions[1].value)), unit=u.pix)}, 'bye': {'axis': 2, 'value': u.Quantity(range(2), unit=u.pix)}} ), (cubem[:, :, :], NDCube, mask_cubem[:, :, :], _wcs_slicer(wm, [False, False, False], (slice(None, None, None), slice(None, None, None), slice(None, None, None))), data[:, :, :], u.Quantity((2, 3, 4), unit=u.pix), ('custom:pos.helioprojective.lon', 'custom:pos.helioprojective.lat', 'em.wl'), {'time': {'axis': 0, 'value': u.Quantity(range(int(cubem.dimensions[0].value)), unit=u.pix)}, 'hello': {'axis': 1, 'value': u.Quantity(range(int(cubem.dimensions[1].value)), unit=u.pix)}, 'bye': {'axis': 2, 'value': u.Quantity(range(int(cubem.dimensions[2].value)), unit=u.pix)}} ), (cubem[:, 1, 1], NDCube, mask_cubem[:, 1, 1], _wcs_slicer(wm, [False, False, False], (slice(None, None, None), 1, 1)), data[:, 1, 1], u.Quantity((2, ), unit=u.pix), tuple(['custom:pos.helioprojective.lon']), {'time': {'axis': 0, 'value': u.Quantity(range(int(cubem.dimensions[0].value)), unit=u.pix)}, 'hello': {'axis': None, 'value': u.Quantity(1, unit=u.pix)}, 'bye': {'axis': None, 'value': u.Quantity(1, unit=u.pix)}} ), (cubem[:, 1, 0:2], NDCube, mask_cubem[:, 1, 0:2], _wcs_slicer(wm, [False, False, False], (slice(None, None, None), 1, slice(0, 2, None))), data[:, 1, 0:2], u.Quantity((2, 2), unit=u.pix), ('custom:pos.helioprojective.lon', 'em.wl'), {'time': {'axis': 0, 'value': u.Quantity(range(int(cubem.dimensions[0].value)), unit=u.pix)}, 'hello': {'axis': None, 'value': u.Quantity(1, unit=u.pix)}, 'bye': {'axis': 1, 'value': u.Quantity(range(2), unit=u.pix)}} ), (cubem[:, 1, :], NDCube, mask_cubem[:, 1, :], _wcs_slicer(wm, [False, False, False], (slice(None, None, None), 1, slice(None, None, None))), data[:, 1, :], u.Quantity((2, 4), unit=u.pix), ('custom:pos.helioprojective.lon', 'em.wl'), {'time': {'axis': 0, 'value': u.Quantity(range(int(cubem.dimensions[0].value)), unit=u.pix)}, 'hello': {'axis': None, 'value': u.Quantity(1, unit=u.pix)}, 'bye': {'axis': 1, 'value': u.Quantity(range(int(cubem.dimensions[2].value)), unit=u.pix)}} ), (cubem[1, :, 1], NDCube, mask_cubem[1, :, 1], _wcs_slicer(wm, [False, False, False], (1, slice(None, None, None), 1)), data[1, :, 1], u.Quantity((3, ), unit=u.pix), 
tuple(['custom:pos.helioprojective.lat']), {'time': {'axis': None, 'value': u.Quantity(1, unit=u.pix)}, 'hello': {'axis': 0, 'value': u.Quantity(range(int(cubem.dimensions[1].value)), unit=u.pix)}, 'bye': {'axis': None, 'value': u.Quantity(1, unit=u.pix)}} ), (cubem[1, :, 0:2], NDCube, mask_cubem[1, :, 0:2], _wcs_slicer(wm, [False, False, False], (1, slice(None, None, None), slice(0, 2, None))), data[1, :, 0:2], u.Quantity((3, 2), unit=u.pix), ('custom:pos.helioprojective.lat', 'em.wl'), {'time': {'axis': None, 'value': u.Quantity(1, unit=u.pix)}, 'hello': {'axis': 0, 'value': u.Quantity(range(int(cubem.dimensions[1].value)), unit=u.pix)}, 'bye': {'axis': 1, 'value': u.Quantity(range(2), unit=u.pix)}} ), (cubem[1, :, :], NDCube, mask_cubem[1, :, :], _wcs_slicer(wm, [False, False, False], (1, slice(None, None, None), slice(None, None, None))), data[1, :, :], u.Quantity((3, 4), unit=u.pix), ('custom:pos.helioprojective.lat', 'em.wl'), {'time': {'axis': None, 'value': u.Quantity(1, unit=u.pix)}, 'hello': {'axis': 0, 'value': u.Quantity(range(int(cubem.dimensions[1].value)), unit=u.pix)}, 'bye': {'axis': 1, 'value': u.Quantity(range(int(cubem.dimensions[2].value)), unit=u.pix)}} ), (cubem[1, 1, 1], NDCube, mask_cubem[1, 1, 1], _wcs_slicer(wm, [False, False, False], (1, 1, 1)), data[1, 1, 1], u.Quantity((), unit=u.pix), (), {'time': {'axis': None, 'value': u.Quantity(1, unit=u.pix)}, 'hello': {'axis': None, 'value': u.Quantity(1, unit=u.pix)}, 'bye': {'axis': None, 'value': u.Quantity(1, unit=u.pix)}} ), (cubem[1, 1, 0:2], NDCube, mask_cubem[1, 1, 0:2], _wcs_slicer(wm, [False, False, False], (1, 1, slice(0, 2, None))), data[1, 1, 0:2], u.Quantity((2, ), unit=u.pix), tuple(['em.wl']), {'time': {'axis': None, 'value': u.Quantity(1, unit=u.pix)}, 'hello': {'axis': None, 'value': u.Quantity(1, unit=u.pix)}, 'bye': {'axis': 0, 'value': u.Quantity(range(2), unit=u.pix)}} ), (cubem[1, 1, :], NDCube, mask_cubem[1, 1, :], _wcs_slicer(wm, [False, False, False], (1, 1, slice(None, None, None))), data[1, 1, :], u.Quantity((4, ), unit=u.pix), tuple(['em.wl']), {'time': {'axis': None, 'value': u.Quantity(1, unit=u.pix)}, 'hello': {'axis': None, 'value': u.Quantity(1, unit=u.pix)}, 'bye': {'axis': 0, 'value': u.Quantity(range(int(cubem.dimensions[2].value)), unit=u.pix)}} ), (cube[:, :, 1], NDCube, mask_cube[:, :, 1], _wcs_slicer(wt, [True, False, False, False], (slice(None, None, None), slice(None, None, None), 1)), uncertainty[:, :, 1], u.Quantity((2, 3), unit=u.pix), ('custom:pos.helioprojective.lat', 'em.wl'), {'time': {'axis': 0, 'value': u.Quantity(range(int(cube.dimensions[0].value)), unit=u.pix)}, 'hello': {'axis': 1, 'value': u.Quantity(range(int(cube.dimensions[1].value)), unit=u.pix)}, 'bye': {'axis': None, 'value': u.Quantity(1, unit=u.pix)}} ), (cube[:, :, 0:2], NDCube, mask_cube[:, :, 0:2], _wcs_slicer(wt, [True, False, False, False], (slice(None, None, None), slice(None, None, None), slice(0, 2, None))), uncertainty[:, :, 0:2], u.Quantity((2, 3, 2), unit=u.pix), ('custom:pos.helioprojective.lat', 'em.wl', 'time'), {'time': {'axis': 0, 'value': u.Quantity(range(int(cube.dimensions[0].value)), unit=u.pix)}, 'hello': {'axis': 1, 'value': u.Quantity(range(int(cube.dimensions[1].value)), unit=u.pix)}, 'bye': {'axis': 2, 'value': u.Quantity(range(2), unit=u.pix)}} ), (cube[:, :, :], NDCube, mask_cube[:, :, :], _wcs_slicer(wt, [True, False, False, False], (slice(None, None, None), slice(None, None, None), slice(None, None, None))), uncertainty[:, :, :], u.Quantity((2, 3, 4), unit=u.pix), 
('custom:pos.helioprojective.lat', 'em.wl', 'time'), {'time': {'axis': 0, 'value': u.Quantity(range(int(cube.dimensions[0].value)), unit=u.pix)}, 'hello': {'axis': 1, 'value': u.Quantity(range(int(cube.dimensions[1].value)), unit=u.pix)}, 'bye': {'axis': 2, 'value': u.Quantity(range(int(cube.dimensions[2].value)), unit=u.pix)}} ), (cube[:, 1, 1], NDCube, mask_cube[:, 1, 1], _wcs_slicer(wt, [True, False, False, False], (slice(None, None, None), 1, 1)), uncertainty[:, 1, 1], u.Quantity((2, ), unit=u.pix), tuple(['custom:pos.helioprojective.lat']), {'time': {'axis': 0, 'value': u.Quantity(range(int(cube.dimensions[0].value)), unit=u.pix)}, 'hello': {'axis': None, 'value': u.Quantity(1, unit=u.pix)}, 'bye': {'axis': None, 'value': u.Quantity(1, unit=u.pix)}} ), (cube[:, 1, 0:2], NDCube, mask_cube[:, 1, 0:2], _wcs_slicer(wt, [True, False, False, False], (slice(None, None, None), 1, slice(0, 2, None))), uncertainty[:, 1, 0:2], u.Quantity((2, 2), unit=u.pix), ('custom:pos.helioprojective.lat', 'time'), {'time': {'axis': 0, 'value': u.Quantity(range(int(cube.dimensions[0].value)), unit=u.pix)}, 'hello': {'axis': None, 'value': u.Quantity(1, unit=u.pix)}, 'bye': {'axis': 1, 'value': u.Quantity(range(2), unit=u.pix)}} ), (cube[:, 1, :], NDCube, mask_cube[:, 1, :], _wcs_slicer(wt, [True, False, False, False], (slice(None, None, None), 1, slice(None, None, None))), uncertainty[:, 1, :], u.Quantity((2, 4), unit=u.pix), ('custom:pos.helioprojective.lat', 'time'), {'time': {'axis': 0, 'value': u.Quantity(range(int(cube.dimensions[0].value)), unit=u.pix)}, 'hello': {'axis': None, 'value': u.Quantity(1, unit=u.pix)}, 'bye': {'axis': 1, 'value': u.Quantity(range(int(cube.dimensions[2].value)), unit=u.pix)}} ), (cube[1, :, 1], NDCube, mask_cube[1, :, 1], _wcs_slicer(wt, [True, False, False, False], (1, slice(None, None, None), 1)), uncertainty[1, :, 1], u.Quantity((3, ), unit=u.pix), tuple(['em.wl']), {'time': {'axis': None, 'value': u.Quantity(1, unit=u.pix)}, 'hello': {'axis': 0, 'value': u.Quantity(range(int(cube.dimensions[1].value)), unit=u.pix)}, 'bye': {'axis': None, 'value': u.Quantity(1, unit=u.pix)}} ), (cube[1, :, 0:2], NDCube, mask_cube[1, :, 0:2], _wcs_slicer(wt, [True, False, False, False], (1, slice(None, None, None), slice(0, 2, None))), uncertainty[1, :, 0:2], u.Quantity((3, 2), unit=u.pix), ('em.wl', 'time'), {'time': {'axis': None, 'value': u.Quantity(1, unit=u.pix)}, 'hello': {'axis': 0, 'value': u.Quantity(range(int(cube.dimensions[1].value)), unit=u.pix)}, 'bye': {'axis': 1, 'value': u.Quantity(range(2), unit=u.pix)}} ), (cube[1, :, :], NDCube, mask_cube[1, :, :], _wcs_slicer(wt, [True, False, False, False], (1, slice(None, None, None), slice(None, None, None))), uncertainty[1, :, :], u.Quantity((3, 4), unit=u.pix), ('em.wl', 'time'), {'time': {'axis': None, 'value': u.Quantity(1, unit=u.pix)}, 'hello': {'axis': 0, 'value': u.Quantity(range(int(cube.dimensions[1].value)), unit=u.pix)}, 'bye': {'axis': 1, 'value': u.Quantity(range(int(cube.dimensions[2].value)), unit=u.pix)}} ), (cube[1, 1, 1], NDCube, mask_cube[1, 1, 1], _wcs_slicer(wt, [True, False, False, False], (1, 1, 1)), uncertainty[1, 1, 1], u.Quantity((), unit=u.pix), (), {'time': {'axis': None, 'value': u.Quantity(1, unit=u.pix)}, 'hello': {'axis': None, 'value': u.Quantity(1, unit=u.pix)}, 'bye': {'axis': None, 'value': u.Quantity(1, unit=u.pix)}} ), (cube[1, 1, 0:2], NDCube, mask_cube[1, 1, 0:2], _wcs_slicer(wt, [True, False, False, False], (1, 1, slice(0, 2, None))), uncertainty[1, 1, 0:2], u.Quantity((2, ), unit=u.pix), 
tuple(['time']), {'time': {'axis': None, 'value': u.Quantity(1, unit=u.pix)}, 'hello': {'axis': None, 'value': u.Quantity(1, unit=u.pix)}, 'bye': {'axis': 0, 'value': u.Quantity(range(2), unit=u.pix)}} ), (cube[1, 1, :], NDCube, mask_cube[1, 1, :], _wcs_slicer(wt, [True, False, False, False], (1, 1, slice(0, 2, None))), uncertainty[1, 1, :], u.Quantity((4, ), unit=u.pix), tuple(['time']), {'time': {'axis': None, 'value': u.Quantity(1, unit=u.pix)}, 'hello': {'axis': None, 'value': u.Quantity(1, unit=u.pix)}, 'bye': {'axis': 0, 'value': u.Quantity(range(int(cube.dimensions[2].value)), unit=u.pix)}} )]) def test_slicing_third_axis(test_input, expected, mask, wcs, uncertainty, dimensions, world_axis_physical_types, extra_coords): assert isinstance(test_input, expected) assert np.all(test_input.mask == mask) helpers.assert_wcs_are_equal(test_input.wcs, wcs[0]) assert test_input.missing_axes == wcs[1] assert test_input.uncertainty.array.shape == uncertainty.shape assert np.all(test_input.dimensions.value == dimensions.value) assert test_input.dimensions.unit == dimensions.unit assert test_input.world_axis_physical_types == world_axis_physical_types helpers.assert_extra_coords_equal(test_input.extra_coords, extra_coords) @pytest.mark.parametrize("test_input", [(cubem)]) def test_slicing_error(test_input): with pytest.raises(IndexError): test_input[None] with pytest.raises(IndexError): test_input[0, None] with pytest.raises(NotImplementedError): test_input[[0, 1]] @pytest.mark.parametrize("test_input,expected", [ (cubem[1].pixel_to_world(*[ u.Quantity(np.arange(4), unit=u.pix), u.Quantity(np.arange(4), unit=u.pix) ])[0], wm.all_pix2world( u.Quantity(np.arange(4), unit=u.pix), u.Quantity(np.arange(4), unit=u.pix), wm.wcs.crpix[2] - 1, 0)[-2]), (cubem[1].pixel_to_world(*[ u.Quantity(np.arange(4), unit=u.pix), u.Quantity(np.arange(4), unit=u.pix) ])[1], wm.all_pix2world( u.Quantity(np.arange(4), unit=u.pix), u.Quantity(np.arange(4), unit=u.pix), wm.wcs.crpix[2] - 1, 0)[0]), (cubem[0:2].pixel_to_world(*[ u.Quantity(np.arange(4), unit=u.pix), u.Quantity(np.arange(4), unit=u.pix), u.Quantity(np.arange(4), unit=u.pix) ])[0], wm.all_pix2world( u.Quantity(np.arange(4), unit=u.pix), u.Quantity(np.arange(4), unit=u.pix), u.Quantity(np.arange(4), unit=u.pix), 0)[-1]), (cubem[0:2].pixel_to_world(*[ u.Quantity(np.arange(4), unit=u.pix), u.Quantity(np.arange(4), unit=u.pix), u.Quantity(np.arange(4), unit=u.pix) ])[1], wm.all_pix2world( u.Quantity(np.arange(4), unit=u.pix), u.Quantity(np.arange(4), unit=u.pix), u.Quantity(np.arange(4), unit=u.pix), 0)[1]), (cubem[0:2].pixel_to_world(*[ u.Quantity(np.arange(4), unit=u.pix), u.Quantity(np.arange(4), unit=u.pix), u.Quantity(np.arange(4), unit=u.pix) ])[2], wm.all_pix2world( u.Quantity(np.arange(4), unit=u.pix), u.Quantity(np.arange(4), unit=u.pix), u.Quantity(np.arange(4), unit=u.pix), 0)[0]), (cube[1].pixel_to_world(*[ u.Quantity(np.arange(4), unit=u.pix), u.Quantity(np.arange(4), unit=u.pix) ])[0], wt.all_pix2world( u.Quantity(np.arange(4), unit=u.pix), u.Quantity(np.arange(4), unit=u.pix), wt.wcs.crpix[2] - 1, wt.wcs.crpix[3] - 1, 0)[1]), (cube[1].pixel_to_world(*[ u.Quantity(np.arange(4), unit=u.pix), u.Quantity(np.arange(4), unit=u.pix) ])[1], wt.all_pix2world( u.Quantity(np.arange(4), unit=u.pix), u.Quantity(np.arange(4), unit=u.pix), wt.wcs.crpix[2] - 1, wt.wcs.crpix[3] - 1, 0)[0]), (cube[0:2].pixel_to_world(*[ u.Quantity(np.arange(4), unit=u.pix), u.Quantity(np.arange(4), unit=u.pix), u.Quantity(np.arange(4), unit=u.pix) ])[0], wt.all_pix2world( 
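    # Strategy for these pixel_to_world cases: each NDCube result is compared
    # against astropy's all_pix2world, with any sliced-away world axes pinned
    # at their reference pixel (crpix - 1 in the 0-based convention) and the
    # trailing 0 selecting a 0-based pixel origin.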
u.Quantity(np.arange(4), unit=u.pix), u.Quantity(np.arange(4), unit=u.pix), u.Quantity(np.arange(4), unit=u.pix), wt.wcs.crpix[3] - 1, 0)[2]), (cube[0:2].pixel_to_world(*[ u.Quantity(np.arange(4), unit=u.pix), u.Quantity(np.arange(4), unit=u.pix), u.Quantity(np.arange(4), unit=u.pix) ])[1], wt.all_pix2world( u.Quantity(np.arange(4), unit=u.pix), u.Quantity(np.arange(4), unit=u.pix), u.Quantity(np.arange(4), unit=u.pix), wt.wcs.crpix[3] - 1, 0)[1]), (cube[0:2].pixel_to_world(*[ u.Quantity(np.arange(4), unit=u.pix), u.Quantity(np.arange(4), unit=u.pix), u.Quantity(np.arange(4), unit=u.pix) ])[2], wt.all_pix2world( u.Quantity(np.arange(4), unit=u.pix), u.Quantity(np.arange(4), unit=u.pix), u.Quantity(np.arange(4), unit=u.pix), wt.wcs.crpix[3] - 1, 0)[0])]) def test_pixel_to_world(test_input, expected): assert np.all(test_input.value == expected) @pytest.mark.parametrize("test_input,expected", [ (cubem[1].world_to_pixel(*[ u.Quantity(np.arange(4), unit=u.deg), u.Quantity(np.arange(4), unit=u.m) ])[1], wm.all_world2pix( u.Quantity(np.arange(4), unit=u.deg), u.Quantity(np.arange(4), unit=u.m), wm.wcs.crpix[2] - 1, 0)[0]), (cubem[0:2].world_to_pixel(*[ u.Quantity(np.arange(4), unit=u.deg), u.Quantity(np.arange(4), unit=u.deg), u.Quantity(np.arange(4), unit=u.m) ])[0], wm.all_world2pix( u.Quantity(np.arange(4), unit=u.deg), u.Quantity(np.arange(4), unit=u.deg), u.Quantity(np.arange(4), unit=u.m), 0)[-1]), (cubem[0:2].world_to_pixel(*[ u.Quantity(np.arange(4), unit=u.deg), u.Quantity(np.arange(4), unit=u.deg), u.Quantity(np.arange(4), unit=u.m) ])[1], wm.all_world2pix( u.Quantity(np.arange(4), unit=u.deg), u.Quantity(np.arange(4), unit=u.deg), u.Quantity(np.arange(4), unit=u.m), 0)[1]), (cubem[0:2].world_to_pixel(*[ u.Quantity(np.arange(4), unit=u.deg), u.Quantity(np.arange(4), unit=u.deg), u.Quantity(np.arange(4), unit=u.m) ])[2], wm.all_world2pix( u.Quantity(np.arange(4), unit=u.deg), u.Quantity(np.arange(4), unit=u.deg), u.Quantity(np.arange(4), unit=u.m), 0)[0]), (cube[1].world_to_pixel(*[ u.Quantity(np.arange(4), unit=u.m), u.Quantity(np.arange(4), unit=u.min) ])[0], wt.all_world2pix( u.Quantity(np.arange(4), unit=u.m), u.Quantity(np.arange(4), unit=u.min), wt.wcs.crpix[2] - 1, wt.wcs.crpix[3] - 1, 0)[1]), (cube[1].world_to_pixel(*[ u.Quantity(np.arange(4), unit=u.m), u.Quantity(np.arange(4), unit=u.min) ])[1], wt.all_world2pix( u.Quantity(np.arange(4), unit=u.m), u.Quantity(np.arange(4), unit=u.min), wt.wcs.crpix[2] - 1, wt.wcs.crpix[3] - 1, 0)[0]), (cube[0:2].world_to_pixel(*[ u.Quantity(np.arange(4), unit=u.deg), u.Quantity(np.arange(4), unit=u.m), u.Quantity(np.arange(4), unit=u.min) ])[0], wt.all_world2pix( u.Quantity(np.arange(4), unit=u.deg), u.Quantity(np.arange(4), unit=u.m), u.Quantity(np.arange(4), unit=u.min), wt.wcs.crpix[3] - 1, 0)[2]), (cube[0:2].world_to_pixel(*[ u.Quantity(np.arange(4), unit=u.deg), u.Quantity(np.arange(4), unit=u.m), u.Quantity(np.arange(4), unit=u.min) ])[1], wt.all_world2pix( u.Quantity(np.arange(4), unit=u.deg), u.Quantity(np.arange(4), unit=u.m), u.Quantity(np.arange(4), unit=u.min), wt.wcs.crpix[3] - 1, 0)[1]), (cube[0:2].world_to_pixel(*[ u.Quantity(np.arange(4), unit=u.deg), u.Quantity(np.arange(4), unit=u.m), u.Quantity(np.arange(4), unit=u.min) ])[2], wt.all_world2pix( u.Quantity(np.arange(4), unit=u.deg), u.Quantity(np.arange(4), unit=u.m), u.Quantity(np.arange(4), unit=u.min), wt.wcs.crpix[3] - 1, 0)[0])]) def test_world_to_pixel(test_input, expected): assert np.allclose(test_input.value, expected) @pytest.mark.parametrize("test_input,expected", [ 
((cubem, [0.7 * u.deg, 1.3e-5 * u.deg, 1.02e-9 * u.m], [1 * u.deg, 1 * u.deg, 4.e-11 * u.m], None), cubem[:, :, :3]), ((cube_rotated, [0 * u.s, 1.5 * u.arcsec, 0 * u.arcsec], [1 * u.s, 1 * u.arcsec, 0.5 * u.arcsec], None), cube_rotated[:, :4, 1:5]), ((cubem, [0.7 * u.deg, 1.3e-5 * u.deg, 1.02e-9 * u.m], None, [1.7 * u.deg, 1 * u.deg, 1.06e-9 * u.m]), cubem[:, :, :3]), ((cube_rotated, [0 * u.s, 1.5 * u.arcsec, 0 * u.arcsec], None, [1 * u.s, 2.5 * u.arcsec, 0.5 * u.arcsec]), cube_rotated[:, :4, 1:5]), ((cube_rotated, [0, 1.5, 0], None, [1, 2.5, 0.5], ['s', 'arcsec', 'arcsec']), cube_rotated[:, :4, 1:5])]) def test_crop_by_coords(test_input, expected): helpers.assert_cubes_equal( test_input[0].crop_by_coords(*test_input[1:]), expected) @pytest.mark.parametrize("test_input", [ (ValueError, cubem, u.Quantity([0], unit=u.deg), u.Quantity([1.5, 2.], unit=u.deg), None), (ValueError, cubem, [1 * u.s], [1 * u.s], [1 * u.s]), (ValueError, cubem, u.Quantity([0], unit=u.deg), None, u.Quantity([1.5, 2.], unit=u.deg)), (ValueError, cubem, [1], None, [1], ['s', 'deg']), (TypeError, cubem, [1, 2, 3], None, [2, 3, 4])]) def test_crop_by_coords_error(test_input): with pytest.raises(test_input[0]): test_input[1].crop_by_coords(*test_input[2:]) @pytest.mark.parametrize( "test_input,expected", [((cube, "bye", 0 * u.pix, 1.5 * u.pix), cube[:, :, 0:2]), ((cubet, "bye", 0.5 * u.pix, 3.5 * u.pix), cubet[:, :, 1:4])]) def test_crop_by_extra_coord(test_input, expected): helpers.assert_cubes_equal( test_input[0].crop_by_extra_coord(*tuple(test_input[1:])), expected) @pytest.mark.parametrize("test_input,expected", [ (cube_disordered_inputs, cube_ordered)]) def test_ndcubeordered(test_input, expected): helpers.assert_cubes_equal( NDCubeOrdered(test_input[0], test_input[1], mask=test_input[2], uncertainty=test_input[3], extra_coords=test_input[4]), expected) @pytest.mark.parametrize("test_input,expected", [ ((cubem, [2]), (u.Quantity([1.02e-09, 1.04e-09, 1.06e-09, 1.08e-09], unit=u.m),)), ((cubem, ['em']), (u.Quantity([1.02e-09, 1.04e-09, 1.06e-09, 1.08e-09], unit=u.m),)) ]) def test_all_world_coords_with_input(test_input, expected): all_coords = test_input[0].axis_world_coords_values(*test_input[1]) for i in range(len(all_coords)): np.testing.assert_allclose(all_coords[i].value, expected[i].value) assert all_coords[i].unit == expected[i].unit @pytest.mark.parametrize("test_input,expected", [ ((cubem, [2]), (u.Quantity([1.01e-09, 1.03e-09, 1.05e-09, 1.07e-09, 1.09e-09], unit=u.m),)), ((cubem, ['em']), (u.Quantity([1.01e-09, 1.03e-09, 1.05e-09, 1.07e-09, 1.09e-09], unit=u.m),)) ]) def test_all_world_coord_values_with_input_and_kwargs(test_input, expected): all_coords = test_input[0].axis_world_coords_values(*test_input[1], **{"edges": True}) for i in range(len(all_coords)): np.testing.assert_allclose(all_coords[i].value, expected[i].value) assert all_coords[i].unit == expected[i].unit @pytest.mark.parametrize("test_input,expected", [ (cubem, (u.Quantity([[0.60002173, 0.59999127, 0.5999608], [1., 1., 1.]], unit=u.deg), u.Quantity([[1.26915033e-05, 4.99987815e-01, 9.99962939e-01], [1.26918126e-05, 5.00000000e-01, 9.99987308e-01]], unit=u.deg), u.Quantity([1.02e-09, 1.04e-09, 1.06e-09, 1.08e-09], unit=u.m))), ((cubem[:, :, 0]), (u.Quantity([[0.60002173, 0.59999127, 0.5999608], [1., 1., 1.]], unit=u.deg), u.Quantity([[1.26915033e-05, 4.99987815e-01, 9.99962939e-01], [1.26918126e-05, 5.00000000e-01, 9.99987308e-01]], unit=u.deg), u.Quantity([1.02e-09], unit=u.m))) ]) def test_axis_world_coords_values_without_input(test_input, 
                                              expected):
    all_coords = test_input.axis_world_coords_values()
    for i in range(len(all_coords)):
        np.testing.assert_allclose(all_coords[i].value, expected[i].value)
        assert all_coords[i].unit == expected[i].unit


@pytest.mark.parametrize("test_input,expected", [
    ((cubem, 0, 0),
     ((2 * u.pix, 3 * u.pix, 4 * u.pix), NDCubeSequence, dict, NDCube, OrderedDict)),
    ((cubem, 1, 0),
     ((3 * u.pix, 2 * u.pix, 4 * u.pix), NDCubeSequence, dict, NDCube, OrderedDict)),
    ((cubem, -2, 0),
     ((3 * u.pix, 2 * u.pix, 4 * u.pix), NDCubeSequence, dict, NDCube, OrderedDict))
])
def test_explode_along_axis(test_input, expected):
    inp_cube, inp_axis, inp_slice = test_input
    exp_dimensions, exp_type_seq, exp_meta_seq, exp_type_cube, exp_meta_cube = expected
    output = inp_cube.explode_along_axis(inp_axis)
    assert tuple(output.dimensions) == tuple(exp_dimensions)
    assert all(output[inp_slice].dimensions
               == u.Quantity((exp_dimensions[1], exp_dimensions[2]), unit='pix'))
    assert isinstance(output, exp_type_seq)
    assert isinstance(output[inp_slice], exp_type_cube)
    assert isinstance(output.meta, exp_meta_seq)
    assert isinstance(output[inp_slice].meta, exp_meta_cube)


def test_array_axis_physical_types():
    expected = [
        ('custom:pos.helioprojective.lon', 'custom:pos.helioprojective.lat'),
        ('custom:pos.helioprojective.lon', 'custom:pos.helioprojective.lat'),
        ('em.wl',),
        ('time',)]
    output = cube_disordered.array_axis_physical_types
    for i, expected_i in enumerate(expected):
        assert all([physical_type in expected_i for physical_type in output[i]])


# ndcube-1.4.2/ndcube/tests/test_ndcubesequence.py
from collections import namedtuple

import pytest
import datetime
import unittest

import sunpy.map
import numpy as np
import astropy.units as u

from ndcube import NDCube, NDCubeSequence
from ndcube.utils.wcs import WCS

# sample data for tests
# TODO: use a fixture reading from a test file. file TBD.
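# A minimal sketch of the fixture the TODO above envisions, assuming the
# sample cube would live in a FITS file; the path and fixture name are
# hypothetical placeholders, not part of the package.
@pytest.fixture
def sample_cube_from_file():
    from astropy.io import fits  # local import keeps the sketch self-contained
    with fits.open('data/sample_ndcube.fits') as hdulist:  # hypothetical path
        file_data = np.array(hdulist[0].data)  # copy before the file closes
        file_header = hdulist[0].header
    return NDCube(file_data, WCS(header=file_header, naxis=3))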
data = np.array([[[1, 2, 3, 4], [2, 4, 5, 3], [0, -1, 2, 3]], [[2, 4, 5, 1], [10, 5, 2, 2], [10, 3, 3, 0]]]) data2 = np.array([[[11, 22, 33, 44], [22, 44, 55, 33], [0, -1, 22, 33]], [[22, 44, 55, 11], [10, 55, 22, 22], [10, 33, 33, 0]]]) ht = {'CTYPE3': 'HPLT-TAN', 'CUNIT3': 'deg', 'CDELT3': 0.5, 'CRPIX3': 0, 'CRVAL3': 0, 'NAXIS3': 2, 'CTYPE2': 'WAVE ', 'CUNIT2': 'Angstrom', 'CDELT2': 0.2, 'CRPIX2': 0, 'CRVAL2': 0, 'NAXIS2': 3, 'CTYPE1': 'TIME ', 'CUNIT1': 'min', 'CDELT1': 0.4, 'CRPIX1': 0, 'CRVAL1': 0, 'NAXIS1': 4} hm = { 'CTYPE1': 'WAVE ', 'CUNIT1': 'Angstrom', 'CDELT1': 0.2, 'CRPIX1': 0, 'CRVAL1': 10, 'NAXIS1': 4, 'CTYPE2': 'HPLT-TAN', 'CUNIT2': 'deg', 'CDELT2': 0.5, 'CRPIX2': 2, 'CRVAL2': 0.5, 'NAXIS2': 3, 'CTYPE3': 'HPLN-TAN', 'CUNIT3': 'deg', 'CDELT3': 0.4, 'CRPIX3': 2, 'CRVAL3': 1, 'NAXIS3': 2, } wt = WCS(header=ht, naxis=3) wm = WCS(header=hm, naxis=3) cube1 = NDCube(data, wt, missing_axes=[False, False, False, True], extra_coords=[ ('pix', 0, u.Quantity(range(data.shape[0]), unit=u.pix)), ('distance', None, u.Quantity(0, unit=u.cm)), ('time', None, datetime.datetime(2000, 1, 1, 0, 0))]) cube2 = NDCube(data, wm, extra_coords=[ ('pix', 0, u.Quantity(np.arange(1, data.shape[0] + 1), unit=u.pix) + cube1.extra_coords['pix']['value'][-1]), ('distance', None, u.Quantity(1, unit=u.cm)), ('time', None, datetime.datetime(2000, 1, 1, 0, 1))]) cube3 = NDCube(data2, wt, missing_axes=[False, False, False, True], extra_coords=[ ('pix', 0, u.Quantity(np.arange(1, data2.shape[0] + 1), unit=u.pix) + cube2.extra_coords['pix']['value'][-1]), ('distance', None, u.Quantity(2, unit=u.cm)), ('time', None, datetime.datetime(2000, 1, 1, 0, 2))]) cube4 = NDCube(data2, wm, extra_coords=[ ('pix', 0, u.Quantity(np.arange(1, data2.shape[0] + 1), unit=u.pix) + cube3.extra_coords['pix']['value'][-1]), ('distance', None, u.Quantity(3, unit=u.cm)), ('time', None, datetime.datetime(2000, 1, 1, 0, 3))]) cube2_no_no = NDCube(data, wm, extra_coords=[ ('pix', 0, u.Quantity(np.arange(1, data.shape[0] + 1), unit=u.pix) + cube1.extra_coords['pix']['value'][-1]), ('time', None, datetime.datetime(2000, 1, 1, 0, 1))]) cube3_no_time = NDCube(data2, wt, missing_axes=[False, False, False, True], extra_coords=[ ('pix', 0, u.Quantity(np.arange(1, data2.shape[0] + 1), unit=u.pix) + cube2.extra_coords['pix']['value'][-1]), ('distance', None, u.Quantity(2, unit=u.cm))]) cube3_diff_compatible_unit = NDCube( data2, wt, missing_axes=[False, False, False, True], extra_coords=[ ('pix', 0, u.Quantity(np.arange(data2.shape[0]), unit=u.pix) + cube2.extra_coords['pix']['value'][-1]), ('distance', None, u.Quantity(2, unit=u.cm).to('m')), ('time', None, datetime.datetime(2000, 1, 1, 0, 2))]) cube3_diff_incompatible_unit = NDCube( data2, wt, missing_axes=[False, False, False, True], extra_coords=[ ('pix', 0, u.Quantity(np.arange(data2.shape[0]), unit=u.pix) + cube2.extra_coords['pix']['value'][-1]), ('distance', None, u.Quantity(2, unit=u.s)), ('time', None, datetime.datetime(2000, 1, 1, 0, 2))]) cube1_time_common = NDCube( data, wt, missing_axes=[False, False, False, True], extra_coords=[('time', 1, [datetime.datetime(2000, 1, 1) + datetime.timedelta(minutes=i) for i in range(data.shape[1])])]) cube2_time_common = NDCube(data, wm, extra_coords=[ ('time', 1, [cube1_time_common.extra_coords["time"]["value"][-1] + datetime.timedelta(minutes=i) for i in range(1, data.shape[1] + 1)])]) cube1_no_extra_coords = NDCube(data, wt, missing_axes=[False, False, False, True]) cube3_no_extra_coords = NDCube(data2, wt, missing_axes=[False, False, False, True]) 
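# The 'pix' extra coords above are deliberately chained: each cube's values
# start one pixel after the previous cube's last value, so along the common
# axis they join into a single 0-7 pix run (see test_common_axis_extra_coords
# below). A quick standalone sanity check of that chaining:
_chained_pix = np.concatenate([c.extra_coords['pix']['value'].value
                               for c in (cube1, cube2, cube3, cube4)])
assert np.array_equal(_chained_pix, np.arange(8))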
seq = NDCubeSequence([cube1, cube2, cube3, cube4], common_axis=0) seq_bad_common_axis = NDCubeSequence([cube1, cube2, cube3, cube4], common_axis=None) seq_time_common = NDCubeSequence([cube1_time_common, cube2_time_common], common_axis=1) seq1 = NDCubeSequence([cube1, cube2, cube3, cube4]) seq2 = NDCubeSequence([cube1, cube2_no_no, cube3_no_time, cube4]) seq3 = NDCubeSequence([cube1, cube2, cube3_diff_compatible_unit, cube4]) seq4 = NDCubeSequence([cube1, cube2, cube3_diff_incompatible_unit, cube4]) seq_no_extra_coords = NDCubeSequence([cube1_no_extra_coords, cube3_no_extra_coords], common_axis=0) nan_extra_coord = u.Quantity(range(4), unit=u.cm) nan_extra_coord.value[1] = np.nan nan_time_extra_coord = np.array([datetime.datetime(2000, 1, 1) + datetime.timedelta(minutes=i) for i in range(len(seq.data))]) nan_time_extra_coord[2] = np.nan @pytest.mark.parametrize("test_input,expected", [ (seq[0], NDCube), (seq[1], NDCube), (seq[2], NDCube), (seq[3], NDCube), (seq[0:1], NDCubeSequence), (seq[0:1, 0:2], NDCubeSequence), (seq[0:1, 0], NDCubeSequence), (seq[1:3], NDCubeSequence), (seq[0:2], NDCubeSequence), (seq[slice(0, 2)], NDCubeSequence), (seq[slice(0, 3)], NDCubeSequence), ]) def test_slice_first_index_sequence_type(test_input, expected): assert isinstance(test_input, expected) @pytest.mark.parametrize("test_input,expected", [ (seq[1:3], 2 * u.pix), (seq[0:2], 2 * u.pix), (seq[0::], 4 * u.pix), (seq[slice(0, 2)], 2 * u.pix), (seq[slice(0, 3)], 3 * u.pix), ]) def test_slice_first_index_sequence_dimensions(test_input, expected): assert test_input.dimensions[0] == expected @pytest.mark.parametrize("test_sequence, test_item, expected_common_axis", [ (seq_time_common, (slice(None), 0), 0), (seq_time_common, (slice(None), slice(0, 1)), 1), (seq_time_common, (slice(None), slice(None), 0), None), (seq_time_common, (slice(None), slice(None), slice(0, 1)), 1), (seq_time_common, (slice(None), slice(None), slice(None), 0), 1), (seq_bad_common_axis, (slice(None), 0), None) ]) def test_slice_common_axis(test_sequence, test_item, expected_common_axis): sliced_sequence = test_sequence[test_item] assert sliced_sequence._common_axis == expected_common_axis @pytest.mark.parametrize("test_input,expected", [ (seq.index_as_cube[0:5].dimensions, (3 * u.pix, [2., 2., 1.] * u.pix, 3 * u.pix, 4 * u.pix)), (seq.index_as_cube[1:3].dimensions, (2 * u.pix, 1 * u.pix, 3 * u.pix, 4 * u.pix)), (seq.index_as_cube[0:6].dimensions, (3 * u.pix, 2 * u.pix, 3 * u.pix, 4 * u.pix)), (seq.index_as_cube[0::].dimensions, (4 * u.pix, 2 * u.pix, 3 * u.pix, 4 * u.pix)), (seq.index_as_cube[0:5, 0].dimensions, (3 * u.pix, [2., 2., 1.] 
* u.pix, 4 * u.pix)), (seq.index_as_cube[1:3, 0:2].dimensions, (2 * u.pix, 1 * u.pix, 2 * u.pix, 4 * u.pix)), (seq.index_as_cube[0:6, 0, 0:1].dimensions, (3 * u.pix, 2 * u.pix, 1 * u.pix)), (seq.index_as_cube[0::, 0, 0].dimensions, (4 * u.pix, 2 * u.pix)), ]) def test_index_as_cube(test_input, expected): for i in range(len(test_input)): try: assert test_input[i] == expected[i] except ValueError: assert (test_input[i].value == expected[i].value).all() assert test_input[i].unit == expected[i].unit @pytest.mark.parametrize("test_input,expected", [ (seq1.explode_along_axis(0), (8 * u.pix, 3 * u.pix, 4 * u.pix)), (seq1.explode_along_axis(1), (12 * u.pix, 2 * u.pix, 4 * u.pix)), (seq1.explode_along_axis(2), (16 * u.pix, 2 * u.pix, 3 * u.pix)), ]) def test_explode_along_axis(test_input, expected): assert test_input.dimensions == expected def test_explode_along_axis_error(): with pytest.raises(ValueError): seq.explode_along_axis(1) @pytest.mark.parametrize( "test_input,expected", [(seq, (4 * u.pix, 2. * u.pix, 3. * u.pix, 4. * u.pix))]) def test_dimensions(test_input, expected): unit_tester = unittest.TestCase() unit_tester.assertEqual(test_input.dimensions, expected) @pytest.mark.parametrize( "test_input,expected", [(seq, u.Quantity([8., 3., 4], unit=u.pix))]) def test_cube_like_dimensions(test_input, expected): assert (seq.cube_like_dimensions == expected).all() @pytest.mark.parametrize("test_input", [(seq_bad_common_axis)]) def test_cube_like_dimensions_error(test_input): with pytest.raises(TypeError): seq_bad_common_axis.cube_like_dimensions @pytest.mark.parametrize( "test_input,expected", [(seq, {'pix': u.Quantity([0., 1., 2., 3., 4., 5., 6., 7.], unit=u.pix)}), (seq_time_common, {'time': np.array([datetime.datetime(2000, 1, 1, 0, 0), datetime.datetime(2000, 1, 1, 0, 1), datetime.datetime(2000, 1, 1, 0, 2), datetime.datetime(2000, 1, 1, 0, 3), datetime.datetime(2000, 1, 1, 0, 4), datetime.datetime(2000, 1, 1, 0, 5)], dtype=object)})]) def test_common_axis_extra_coords(test_input, expected): output = test_input.common_axis_extra_coords assert output.keys() == expected.keys() for key in output.keys(): try: assert output[key] == expected[key] except ValueError: assert (output[key] == expected[key]).all() @pytest.mark.parametrize("test_input", [(seq_no_extra_coords)]) def test_no_common_axis_extra_coords(test_input): assert seq_no_extra_coords.sequence_axis_extra_coords is None @pytest.mark.parametrize( "test_input,expected", [(seq, {'distance': u.Quantity(range(4), unit=u.cm), 'time': np.array([datetime.datetime(2000, 1, 1) + datetime.timedelta(minutes=i) for i in range(len(seq.data))])}), (seq2, {'distance': nan_extra_coord, 'time': nan_time_extra_coord}), (seq3, {'distance': u.Quantity(range(4), unit=u.cm), 'time': np.array([datetime.datetime(2000, 1, 1) + datetime.timedelta(minutes=i) for i in range(len(seq.data))])})]) def test_sequence_axis_extra_coords(test_input, expected): output = test_input.sequence_axis_extra_coords assert output.keys() == expected.keys() for key in output.keys(): if isinstance(output[key], u.Quantity): assert output[key].unit == expected[key].unit np.testing.assert_array_almost_equal(output[key].value, expected[key].value) else: # For non-Quantities, must check element by element due to # possible mixture of NaN and non-number elements on # arrays, e.g. datetime does not work with # np.testing.assert_array_almost_equal(). for i, output_value in enumerate(output[key]): if isinstance(output_value, float): # Check if output is nan, expected is no and vice versa. 
                    if not isinstance(expected[key][i], float):
                        raise AssertionError(
                            "{} != {}".format(output_value, expected[key][i]))
                    elif np.logical_xor(np.isnan(output_value),
                                        np.isnan(expected[key][i])):
                        raise AssertionError(
                            "{0} != {1}".format(output_value, expected[key][i]))
                    # Else assert they are equal if they are both not NaN.
                    elif not np.isnan(output_value):
                        assert output_value == expected[key][i]
                # Else, if output is not a float, assert it equals expected.
                else:
                    assert output_value == expected[key][i]


@pytest.mark.parametrize("test_input", [(seq_no_extra_coords)])
def test_no_sequence_axis_extra_coords(test_input):
    assert seq_no_extra_coords.sequence_axis_extra_coords is None


@pytest.mark.parametrize("test_input", [(seq4)])
def test_sequence_axis_extra_coords_incompatible_unit_error(test_input):
    with pytest.raises(u.UnitConversionError):
        test_input.sequence_axis_extra_coords


# ndcube-1.4.2/ndcube/tests/test_plotting.py
import pytest
import datetime
import copy

import numpy as np
import astropy.units as u
from astropy.visualization.wcsaxes import WCSAxes
import matplotlib
import matplotlib.pyplot as plt
try:
    from sunpy.visualization.animator import LineAnimator
except ImportError:
    from sunpy.visualization.imageanimator import LineAnimator

from ndcube import NDCube
from ndcube.utils.wcs import WCS
from ndcube.mixins import plotting
from ndcube.visualization.animator import ImageAnimatorWCS

# sample data for tests
# TODO: use a fixture reading from a test file. file TBD.
ht = {'CTYPE3': 'HPLT-TAN', 'CUNIT3': 'deg', 'CDELT3': 0.5, 'CRPIX3': 0,
      'CRVAL3': 0, 'NAXIS3': 2,
      'CTYPE2': 'WAVE ', 'CUNIT2': 'Angstrom', 'CDELT2': 0.2, 'CRPIX2': 0,
      'CRVAL2': 0, 'NAXIS2': 3,
      'CTYPE1': 'TIME ', 'CUNIT1': 'min', 'CDELT1': 0.4, 'CRPIX1': 0,
      'CRVAL1': 0, 'NAXIS1': 4}
wt = WCS(header=ht, naxis=3)
hm = {'CTYPE1': 'WAVE ', 'CUNIT1': 'Angstrom', 'CDELT1': 0.2, 'CRPIX1': 0,
      'CRVAL1': 10, 'NAXIS1': 4,
      'CTYPE2': 'HPLT-TAN', 'CUNIT2': 'deg', 'CDELT2': 0.5, 'CRPIX2': 2,
      'CRVAL2': 0.5, 'NAXIS2': 3,
      'CTYPE3': 'HPLN-TAN', 'CUNIT3': 'deg', 'CDELT3': 0.4, 'CRPIX3': 2,
      'CRVAL3': 1, 'NAXIS3': 2}
wm = WCS(header=hm, naxis=3)
spatial = {'CTYPE1': 'HPLT-TAN', 'CUNIT1': 'deg', 'CDELT1': 0.5, 'CRPIX1': 2,
           'CRVAL1': 0.5, 'NAXIS1': 3,
           'CTYPE2': 'HPLN-TAN', 'CUNIT2': 'deg', 'CDELT2': 0.4, 'CRPIX2': 2,
           'CRVAL2': 1, 'NAXIS2': 2}
spatial = WCS(header=spatial, naxis=2)
data = np.array([[[1, 2, 3, 4], [2, 4, 5, 3], [0, -1, 2, 3]],
                 [[2, 4, 5, 1], [10, 5, 2, 2], [10, 3, 3, 0]]])
uncertainty = np.sqrt(data)
mask_cube = data < 0
cube = NDCube(
    data, wt, mask=mask_cube, uncertainty=uncertainty,
    missing_axes=[False, False, False, True],
    extra_coords=[('time', 0, u.Quantity(range(data.shape[0]), unit=u.s)),
                  ('hello', 1, u.Quantity(range(data.shape[1]), unit=u.W)),
                  ('bye', 2, u.Quantity(range(data.shape[2]), unit=u.m)),
                  ('another time', 2, np.array(
                      [datetime.datetime(2000, 1, 1) + datetime.timedelta(minutes=i)
                       for i in range(data.shape[2])])),
                  ('array coord', 2, np.arange(100, 100 + data.shape[2]))])
cube_spatial = NDCube(data[0], spatial)
cube_unit = NDCube(
    data, wt, mask=mask_cube, unit=u.J, uncertainty=uncertainty,
    missing_axes=[False, False, False, True],
    extra_coords=[('time', 0, u.Quantity(range(data.shape[0]), unit=u.s)),
                  ('hello', 1, u.Quantity(range(data.shape[1]), unit=u.W)),
                  ('bye', 2, u.Quantity(range(data.shape[2]), unit=u.m)),
                  ('another time',
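                  # 'another time' is a datetime-valued extra coordinate on
                  # array axis 2, alongside the Quantity-valued 'bye' on the
                  # same axis.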
2, np.array( [datetime.datetime(2000, 1, 1) + datetime.timedelta(minutes=i) for i in range(data.shape[2])])) ]) cube_no_uncertainty = NDCube( data, wt, mask=mask_cube, missing_axes=[False, False, False, True], extra_coords=[('time', 0, u.Quantity(range(data.shape[0]), unit=u.s)), ('hello', 1, u.Quantity(range(data.shape[1]), unit=u.W)), ('bye', 2, u.Quantity(range(data.shape[2]), unit=u.m)), ('another time', 2, np.array( [datetime.datetime(2000, 1, 1) + datetime.timedelta(minutes=i) for i in range(data.shape[2])])) ]) cube_unit_no_uncertainty = NDCube( data, wt, mask=mask_cube, unit=u.J, missing_axes=[False, False, False, True], extra_coords=[('time', 0, u.Quantity(range(data.shape[0]), unit=u.s)), ('hello', 1, u.Quantity(range(data.shape[1]), unit=u.W)), ('bye', 2, u.Quantity(range(data.shape[2]), unit=u.m)), ('another time', 2, np.array( [datetime.datetime(2000, 1, 1) + datetime.timedelta(minutes=i) for i in range(data.shape[2])])) ]) cubem = NDCube( data, wm, mask=mask_cube, uncertainty=uncertainty, extra_coords=[('time', 0, u.Quantity(range(data.shape[0]), unit=u.s)), ('hello', 1, u.Quantity(range(data.shape[1]), unit=u.W)), ('bye', 2, u.Quantity(range(data.shape[2]), unit=u.m)), ('another time', 2, np.array( [datetime.datetime(2000, 1, 1) + datetime.timedelta(minutes=i) for i in range(data.shape[2])])) ]) # Derive expected data values cube_data = np.ma.masked_array(cube.data, cube.mask) # Derive expected axis_ranges. # Let False stand for ranges generated by SunPy classes and so not tested. cube_none_axis_ranges_axis2 = [False, False, np.array([0.4, 0.8, 1.2, 1.6])] cube_none_axis_ranges_axis2_s = copy.deepcopy(cube_none_axis_ranges_axis2) cube_none_axis_ranges_axis2_s[2] = cube_none_axis_ranges_axis2[2] * 60. cube_none_axis_ranges_axis2_bye = copy.deepcopy(cube_none_axis_ranges_axis2) cube_none_axis_ranges_axis2_bye[2] = cube.extra_coords["bye"]["value"].value cube_none_axis_ranges_axis2_array = copy.deepcopy(cube_none_axis_ranges_axis2) cube_none_axis_ranges_axis2_array[2] = np.arange(10, 10 + cube.data.shape[-1]) @pytest.mark.parametrize("test_input, test_kwargs, expected_values", [ (cube[0, 0], {}, (np.ma.masked_array([0.4, 0.8, 1.2, 1.6], cube[0, 0].mask), np.ma.masked_array(cube[0, 0].data, cube[0, 0].mask), "time [min]", "Data [None]", (0.4, 1.6), (1, 4))), (cube_unit[0, 0], {"axes_coordinates": "bye", "axes_units": "km", "data_unit": u.erg}, (np.ma.masked_array(cube_unit[0, 0].extra_coords["bye"]["value"].to(u.km).value, cube_unit[0, 0].mask), np.ma.masked_array(u.Quantity(cube_unit[0, 0].data, unit=cube_unit[0, 0].unit).to(u.erg).value, cube_unit[0, 0].mask), "bye [km]", "Data [erg]", (0, 0.003), (10000000, 40000000))), (cube_unit[0, 0], {"axes_coordinates": np.arange(10, 10 + cube_unit[0, 0].data.shape[0])}, (np.ma.masked_array(np.arange(10, 10 + cube_unit[0, 0].data.shape[0]), cube_unit[0, 0].mask), np.ma.masked_array(cube_unit[0, 0].data, cube_unit[0, 0].mask), " [None]", "Data [J]", (10, 10 + cube_unit[0, 0].data.shape[0] - 1), (1, 4))), (cube_no_uncertainty[0, 0], {}, (np.ma.masked_array([0.4, 0.8, 1.2, 1.6], cube_no_uncertainty[0, 0].mask), np.ma.masked_array(cube_no_uncertainty[0, 0].data, cube_no_uncertainty[0, 0].mask), "time [min]", "Data [None]", (0.4, 1.6), (1, 4))), (cube_unit_no_uncertainty[0, 0], {}, (np.ma.masked_array([0.4, 0.8, 1.2, 1.6], cube_unit_no_uncertainty[0, 0].mask), np.ma.masked_array(cube_no_uncertainty[0, 0].data, cube_unit_no_uncertainty[0, 0].mask), "time [min]", "Data [J]", (0.4, 1.6), (1, 4))), (cube_unit_no_uncertainty[0, 0], {"data_unit": 
u.erg}, (np.ma.masked_array([0.4, 0.8, 1.2, 1.6], cube_unit_no_uncertainty[0, 0].mask), np.ma.masked_array(u.Quantity(cube_unit[0, 0].data, unit=cube_unit[0, 0].unit).to(u.erg).value, cube_unit[0, 0].mask), "time [min]", "Data [erg]", (0.4, 1.6), (10000000, 40000000))) ]) def test_cube_plot_1D(test_input, test_kwargs, expected_values): # Unpack expected properties. expected_xdata, expected_ydata, expected_xlabel, expected_ylabel, \ expected_xlim, expected_ylim = expected_values # Run plot method. output = test_input.plot(**test_kwargs) # Check plot properties are correct. # Type assert isinstance(output, matplotlib.axes.Axes) # Check x axis data output_xdata = (output.axes.lines[0].get_xdata()) assert np.allclose(output_xdata.data, expected_xdata.data) if isinstance(output_xdata.mask, np.ndarray): np.testing.assert_array_equal(output_xdata.mask, expected_xdata.mask) else: assert output_xdata.mask == expected_xdata.mask # Check y axis data output_ydata = (output.axes.lines[0].get_ydata()) assert np.allclose(output_ydata.data, expected_ydata.data) if isinstance(output_ydata.mask, np.ndarray): np.testing.assert_array_equal(output_ydata.mask, expected_ydata.mask) else: assert output_ydata.mask == expected_ydata.mask # Check axis labels assert output.axes.get_xlabel() == expected_xlabel assert output.axes.get_ylabel() == expected_ylabel # Check axis limits output_xlim = output.axes.get_xlim() assert output_xlim[0] <= expected_xlim[0] assert output_xlim[1] >= expected_xlim[1] output_ylim = output.axes.get_ylim() assert output_ylim[0] <= expected_ylim[0] assert output_ylim[1] >= expected_ylim[1] @pytest.mark.parametrize("test_input, test_kwargs, expected_error", [ (cube[0, 0], {"axes_coordinates": np.arange(10, 10 + cube_unit[0, 0].data.shape[0]), "axes_units": u.C}, TypeError), (cube[0, 0], {"data_unit": u.C}, TypeError) ]) def test_cube_plot_1D_errors(test_input, test_kwargs, expected_error): with pytest.raises(expected_error): output = test_input.plot(**test_kwargs) @pytest.mark.parametrize("test_input, test_kwargs, expected_values", [ (cube[0], {}, (np.ma.masked_array(cube[0].data, cube[0].mask), "time [min]", "em.wl [m]", (-0.5, 3.5, -0.5, 2.5))), (cube_spatial, {'axes': WCSAxes(plt.figure(), (0, 0, 1, 1), wcs=cube_spatial.wcs)}, (cube_spatial.data, "custom:pos.helioprojective.lat [deg]", "custom:pos.helioprojective.lon [deg]", (-0.5, 3.5, -0.5, 2.5))), (cube[0], {"axes_coordinates": ["bye", None], "axes_units": [None, u.cm]}, (np.ma.masked_array(cube[0].data, cube[0].mask), "bye [m]", "em.wl [cm]", (0.0, 3.0, 2e-9, 6e-9))), (cube[0], {"axes_coordinates": [np.arange(10, 10 + cube[0].data.shape[1]), u.Quantity(np.arange(10, 10 + cube[0].data.shape[0]), unit=u.m)], "axes_units": [None, u.cm]}, (np.ma.masked_array(cube[0].data, cube[0].mask), " [None]", " [cm]", (10, 13, 1000, 1200))), (cube[0], {"axes_coordinates": [np.arange(10, 10 + cube[0].data.shape[1]), u.Quantity(np.arange(10, 10 + cube[0].data.shape[0]), unit=u.m)]}, (np.ma.masked_array(cube[0].data, cube[0].mask), " [None]", " [m]", (10, 13, 10, 12))), (cube_unit[0], {"plot_axis_indices": [0, 1], "axes_coordinates": [None, "bye"], "data_unit": u.erg}, (np.ma.masked_array((cube_unit[0].data * cube_unit[0].unit).to(u.erg).value, cube_unit[0].mask).transpose(), "em.wl [m]", "bye [m]", (2e-11, 6e-11, 0.0, 3.0))) ]) def test_cube_plot_2D(test_input, test_kwargs, expected_values): fig = plt.figure() # Unpack expected properties. expected_data, expected_xlabel, expected_ylabel, expected_extent = \ expected_values # Run plot method. 
output = test_input.plot(**test_kwargs) # Check plot properties are correct. assert isinstance(output, matplotlib.axes.Axes) np.testing.assert_array_equal(output.images[0].get_array(), expected_data) if isinstance(output, WCSAxes): assert output.coords[0].get_axislabel() == expected_xlabel assert output.coords[1].get_axislabel() == expected_ylabel else: assert output.axes.yaxis.get_label_text() == expected_ylabel assert output.axes.xaxis.get_label_text() == expected_xlabel assert np.allclose(output.images[0].get_extent(), expected_extent) @pytest.mark.parametrize("test_input, test_kwargs, expected_error", [ (cube[0], {"axes_coordinates": ["array coord", None], "axes_units": [u.cm, None]}, TypeError), (cube[0], {"axes_coordinates": [np.arange(10, 10 + cube[0].data.shape[1]), None], "axes_units": [u.cm, None]}, TypeError), (cube[0], {"data_unit": u.cm}, TypeError) ]) def test_cube_plot_2D_errors(test_input, test_kwargs, expected_error): with pytest.raises(expected_error): output = test_input.plot(**test_kwargs) @pytest.mark.parametrize("test_input, test_kwargs, expected_values", [ (cubem, {}, (cubem.data, [np.array([0., 2.]), [0, 3], [0, 4]], "", "")) ]) def test_cube_plot_ND_as_2DAnimation(test_input, test_kwargs, expected_values): # Unpack expected properties. expected_data, expected_axis_ranges, expected_xlabel, expected_ylabel = expected_values # Run plot method. output = test_input.plot(**test_kwargs) # Check plot properties are correct. assert isinstance(output, ImageAnimatorWCS) np.testing.assert_array_equal(output.data, expected_data) assert output.axes.xaxis.get_label_text() == expected_xlabel assert output.axes.yaxis.get_label_text() == expected_ylabel @pytest.mark.parametrize("input_values, expected_values", [ ((None, None, None, None, {"image_axes": [-1, -2], "axis_ranges": [np.arange(3), np.arange(3)], "unit_x_axis": "km", "unit_y_axis": u.s, "unit": u.W}), ([-1, -2], [np.arange(3), np.arange(3)], ["km", u.s], u.W, {})), (([-1, -2], [np.arange(3), np.arange(3)], ["km", u.s], u.W, {}), ([-1, -2], [np.arange(3), np.arange(3)], ["km", u.s], u.W, {})), (([-1], None, None, None, {"unit_x_axis": "km"}), ([-1], None, "km", None, {})), (([-1, -2], None, None, None, {"unit_x_axis": "km"}), (([-1, -2], None, ["km", None], None, {}))), (([-1, -2], None, None, None, {"unit_y_axis": "km"}), (([-1, -2], None, [None, "km"], None, {}))) ]) def test_support_101_plot_API(input_values, expected_values): # Define expected values. 
    expected_plot_axis_indices, expected_axes_coordinates, expected_axes_units, \
        expected_data_unit, expected_kwargs = expected_values
    # Run function
    output_plot_axis_indices, output_axes_coordinates, output_axes_units, \
        output_data_unit, output_kwargs = plotting._support_101_plot_API(*input_values)
    # Check values are correct
    assert output_plot_axis_indices == expected_plot_axis_indices
    if expected_axes_coordinates is None:
        assert output_axes_coordinates == expected_axes_coordinates
    elif isinstance(expected_axes_coordinates, list):
        for i, ac in enumerate(output_axes_coordinates):
            np.testing.assert_array_equal(ac, expected_axes_coordinates[i])
    assert output_axes_units == expected_axes_units
    assert output_data_unit == expected_data_unit
    assert output_kwargs == expected_kwargs


@pytest.mark.parametrize("input_values", [
    ([0, 1], None, None, None, {"image_axes": [-1, -2]}),
    (None, [np.arange(1, 4), np.arange(1, 4)], None, None,
     {"axis_ranges": [np.arange(3), np.arange(3)]}),
    (None, None, [u.s, "km"], None, {"unit_x_axis": u.W}),
    (None, None, [u.s, "km"], None, {"unit_y_axis": u.W}),
    (None, None, None, u.s, {"unit": u.W}),
    ([0, 1, 2], None, None, None, {"unit_x_axis": [u.s, u.km, u.W]}),
])
def test_support_101_plot_API_errors(input_values):
    with pytest.raises(ValueError):
        output = plotting._support_101_plot_API(*input_values)


@pytest.mark.parametrize("test_input, test_kwargs, expected_values", [
    (cube, {"plot_axis_indices": -1},
     (cube_data, cube_none_axis_ranges_axis2, "time [min]", "Data [None]")),
    (cube_unit, {"plot_axis_indices": -1, "axes_units": u.s, "data_unit": u.erg},
     (cube_data * 1e7, cube_none_axis_ranges_axis2_s, "time [s]", "Data [erg]")),
    (cube_unit, {"plot_axis_indices": -1, "axes_coordinates": "bye"},
     (cube_data, cube_none_axis_ranges_axis2_bye, "bye [m]", "Data [J]")),
    (cube, {"plot_axis_indices": -1,
            "axes_coordinates": np.arange(10 - 0.5, 0.5 + 10 + cube.data.shape[-1])},
     (cube_data, cube_none_axis_ranges_axis2_array, " [None]", "Data [None]"))
])
def test_cube_plot_ND_as_1DAnimation(test_input, test_kwargs, expected_values):
    # Unpack expected properties.
    expected_data, expected_axis_ranges, expected_xlabel, expected_ylabel = expected_values
    # Run plot method.
    output = test_input.plot(**test_kwargs)
    # Check plot properties are correct.
    assert isinstance(output, LineAnimator)
    np.testing.assert_array_equal(output.data, expected_data)
    for i, output_axis_range in enumerate(output.axis_ranges):
        if expected_axis_ranges[i] is not False:
            assert np.allclose(output_axis_range, expected_axis_ranges[i])
    assert output.axes.xaxis.get_label_text() == expected_xlabel
    assert output.axes.yaxis.get_label_text() == expected_ylabel


# ndcube-1.4.2/ndcube/tests/test_sequence_plotting.py
import pytest
import datetime
import copy

import numpy as np
import astropy.units as u
import matplotlib
from sunpy.visualization.animator.base import edges_to_centers_nd

from ndcube import NDCube, NDCubeSequence
from ndcube.utils.cube import _get_extra_coord_edges
from ndcube.utils.wcs import WCS
import ndcube.mixins.sequence_plotting

# sample data for tests
# TODO: use a fixture reading from a test file. file TBD.
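# The plotting expectations below lean on conversions between bin edges and
# bin centers (see _get_extra_coord_edges and edges_to_centers_nd imported
# above). A minimal numpy-only sketch of the 1-D edges -> centers relation;
# sunpy's edges_to_centers_nd generalizes this to N-D arrays:
_edges = np.linspace(0., 4., 5)                # 5 edges bound 4 bins
_centers = 0.5 * (_edges[:-1] + _edges[1:])    # midpoint of each adjacent pair
assert np.allclose(_centers, [0.5, 1.5, 2.5, 3.5])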
data = np.array([[[1, 2, 3, 4], [2, 4, 5, 3], [0, -1, 2, 3]], [[2, 4, 5, 1], [10, 5, 2, 2], [10, 3, 3, 0]]]) data2 = np.array([[[11, 22, 33, 44], [22, 44, 55, 33], [0, -1, 22, 33]], [[22, 44, 55, 11], [10, 55, 22, 22], [10, 33, 33, 0]]]) ht = {'CTYPE3': 'HPLT-TAN', 'CUNIT3': 'deg', 'CDELT3': 0.5, 'CRPIX3': 0, 'CRVAL3': 0, 'NAXIS3': 2, 'CTYPE2': 'WAVE ', 'CUNIT2': 'Angstrom', 'CDELT2': 0.2, 'CRPIX2': 0, 'CRVAL2': 0, 'NAXIS2': 3, 'CTYPE1': 'TIME ', 'CUNIT1': 'min', 'CDELT1': 0.4, 'CRPIX1': 0, 'CRVAL1': 0, 'NAXIS1': 4} wt = WCS(header=ht, naxis=3) hm = { 'CTYPE1': 'WAVE ', 'CUNIT1': 'Angstrom', 'CDELT1': 0.2, 'CRPIX1': 0, 'CRVAL1': 10, 'NAXIS1': 4, 'CTYPE2': 'HPLT-TAN', 'CUNIT2': 'deg', 'CDELT2': 0.5, 'CRPIX2': 2, 'CRVAL2': 0.5, 'NAXIS2': 3, 'CTYPE3': 'HPLN-TAN', 'CUNIT3': 'deg', 'CDELT3': 0.4, 'CRPIX3': 2, 'CRVAL3': 1, 'NAXIS3': 2} wm = WCS(header=hm, naxis=3) cube1 = NDCube( data, wt, missing_axes=[False, False, False, True], extra_coords=[ ('pix', 0, u.Quantity(range(data.shape[0]), unit=u.pix)), ('hi', 1, u.Quantity(range(data.shape[1]), unit=u.s)), ('distance', None, u.Quantity(0, unit=u.cm)), ('time', None, datetime.datetime(2000, 1, 1, 0, 0))]) cube1_with_unit = NDCube( data, wt, missing_axes=[False, False, False, True], unit=u.km, extra_coords=[ ('pix', 0, u.Quantity(range(data.shape[0]), unit=u.pix)), ('hi', 1, u.Quantity(range(data.shape[1]), unit=u.s)), ('distance', None, u.Quantity(0, unit=u.cm)), ('time', None, datetime.datetime(2000, 1, 1, 0, 0))]) cube1_with_mask = NDCube( data, wt, missing_axes=[False, False, False, True], mask=np.zeros_like(data, dtype=bool), extra_coords=[ ('pix', 0, u.Quantity(range(data.shape[0]), unit=u.pix)), ('hi', 1, u.Quantity(range(data.shape[1]), unit=u.s)), ('distance', None, u.Quantity(0, unit=u.cm)), ('time', None, datetime.datetime(2000, 1, 1, 0, 0))]) cube1_with_uncertainty = NDCube( data, wt, missing_axes=[False, False, False, True], uncertainty=np.sqrt(data), extra_coords=[ ('pix', 0, u.Quantity(range(data.shape[0]), unit=u.pix)), ('hi', 1, u.Quantity(range(data.shape[1]), unit=u.s)), ('distance', None, u.Quantity(0, unit=u.cm)), ('time', None, datetime.datetime(2000, 1, 1, 0, 0))]) cube1_with_unit_and_uncertainty = NDCube( data, wt, missing_axes=[False, False, False, True], unit=u.km, uncertainty=np.sqrt(data), extra_coords=[ ('pix', 0, u.Quantity(range(data.shape[0]), unit=u.pix)), ('hi', 1, u.Quantity(range(data.shape[1]), unit=u.s)), ('distance', None, u.Quantity(0, unit=u.cm)), ('time', None, datetime.datetime(2000, 1, 1, 0, 0))]) cube3 = NDCube( data2, wt, missing_axes=[False, False, False, True], extra_coords=[ ('pix', 0, u.Quantity(np.arange(1, data2.shape[0] + 1), unit=u.pix) + cube1.extra_coords['pix']['value'][-1]), ('hi', 1, u.Quantity(range(data2.shape[1]), unit=u.s)), ('distance', None, u.Quantity(2, unit=u.cm)), ('time', None, datetime.datetime(2000, 1, 1, 0, 2))]) cube3_with_unit = NDCube( data2, wt, missing_axes=[False, False, False, True], unit=u.m, extra_coords=[ ('pix', 0, u.Quantity(np.arange(1, data2.shape[0] + 1), unit=u.pix) + cube1.extra_coords['pix']['value'][-1]), ('hi', 1, u.Quantity(range(data2.shape[1]), unit=u.s)), ('distance', None, u.Quantity(2, unit=u.cm)), ('time', None, datetime.datetime(2000, 1, 1, 0, 2))]) cube3_with_mask = NDCube( data2, wt, missing_axes=[False, False, False, True], mask=np.zeros_like(data2, dtype=bool), extra_coords=[ ('pix', 0, u.Quantity(np.arange(1, data2.shape[0] + 1), unit=u.pix) + cube1.extra_coords['pix']['value'][-1]), ('hi', 1, u.Quantity(range(data2.shape[1]), unit=u.s)), 
('distance', None, u.Quantity(2, unit=u.cm)), ('time', None, datetime.datetime(2000, 1, 1, 0, 2))]) cube3_with_uncertainty = NDCube( data2, wt, missing_axes=[False, False, False, True], uncertainty=np.sqrt(data2), extra_coords=[ ('pix', 0, u.Quantity(np.arange(1, data2.shape[0] + 1), unit=u.pix) + cube1.extra_coords['pix']['value'][-1]), ('hi', 1, u.Quantity(range(data2.shape[1]), unit=u.s)), ('distance', None, u.Quantity(2, unit=u.cm)), ('time', None, datetime.datetime(2000, 1, 1, 0, 2))]) cube3_with_unit_and_uncertainty = NDCube( data2, wt, missing_axes=[False, False, False, True], unit=u.m, uncertainty=np.sqrt(data2), extra_coords=[ ('pix', 0, u.Quantity(np.arange(1, data2.shape[0] + 1), unit=u.pix) + cube1.extra_coords['pix']['value'][-1]), ('hi', 1, u.Quantity(range(data2.shape[1]), unit=u.s)), ('distance', None, u.Quantity(2, unit=u.cm)), ('time', None, datetime.datetime(2000, 1, 1, 0, 2))]) cubem1 = NDCube( data, wm, extra_coords=[ ('pix', 0, u.Quantity(range(data.shape[0]), unit=u.pix)), ('hi', 1, u.Quantity(range(data.shape[1]), unit=u.s)), ('distance', None, u.Quantity(0, unit=u.cm)), ('time', None, datetime.datetime(2000, 1, 1, 0, 0))]) cubem3 = NDCube( data2, wm, extra_coords=[ ('pix', 0, u.Quantity(range(data.shape[0]), unit=u.pix)), ('hi', 1, u.Quantity(range(data.shape[1]), unit=u.s)), ('distance', None, u.Quantity(0, unit=u.cm)), ('time', None, datetime.datetime(2000, 1, 1, 0, 0))]) # Define some test NDCubeSequences. common_axis = 0 seq = NDCubeSequence(data_list=[cube1, cube3, cube1, cube3], common_axis=common_axis) seq_no_common_axis = NDCubeSequence(data_list=[cube1, cube3, cube1, cube3]) seq_with_units = NDCubeSequence( data_list=[cube1_with_unit, cube3_with_unit, cube1_with_unit, cube3_with_unit], common_axis=common_axis) seq_with_masks = NDCubeSequence( data_list=[cube1_with_mask, cube3_with_mask, cube1_with_mask, cube3_with_mask], common_axis=common_axis) seq_with_unit0 = NDCubeSequence(data_list=[cube1_with_unit, cube3, cube1_with_unit, cube3], common_axis=common_axis) seq_with_mask0 = NDCubeSequence(data_list=[cube1_with_mask, cube3, cube1_with_mask, cube3], common_axis=common_axis) seq_with_uncertainty = NDCubeSequence(data_list=[cube1_with_uncertainty, cube3_with_uncertainty, cube1_with_uncertainty, cube3_with_uncertainty], common_axis=common_axis) seq_with_some_uncertainty = NDCubeSequence( data_list=[cube1_with_uncertainty, cube3, cube1, cube3_with_uncertainty], common_axis=common_axis) seq_with_units_and_uncertainty = NDCubeSequence( data_list=[cube1_with_unit_and_uncertainty, cube3_with_unit_and_uncertainty, cube1_with_unit_and_uncertainty, cube3_with_unit_and_uncertainty], common_axis=common_axis) seq_with_units_and_some_uncertainty = NDCubeSequence( data_list=[cube1_with_unit_and_uncertainty, cube3_with_unit, cube1_with_unit, cube3_with_unit_and_uncertainty], common_axis=common_axis) seq_with_some_masks = NDCubeSequence(data_list=[cube1_with_mask, cube3, cube1, cube3_with_mask], common_axis=common_axis) seqm = NDCubeSequence(data_list=[cubem1, cubem3, cubem1, cubem3], common_axis=common_axis) # Derive some expected data arrays in plot objects. 
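# The stacked arrays below back the NDCubeSequence.plot tests (the sequence
# axis becomes a new leading dimension), while the concatenated arrays back
# the plot_as_cube tests (cubes are joined along the common axis). A minimal
# sketch of the difference, using two hypothetical (3, 4) arrays rather than
# the cubes above:
#
#     >>> a, b = np.zeros((3, 4)), np.ones((3, 4))
#     >>> np.stack([a, b]).shape
#     (2, 3, 4)
#     >>> np.concatenate([a, b], axis=0).shape
#     (6, 4)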
seq_data_stack = np.stack([cube.data for cube in seq_with_masks.data]) seq_mask_stack = np.stack([cube.mask for cube in seq_with_masks.data]) seq_stack = np.ma.masked_array(seq_data_stack, seq_mask_stack) seq_stack_km = np.ma.masked_array( np.stack([(cube.data * cube.unit).to(u.km).value for cube in seq_with_units.data]), seq_mask_stack) seq_data_concat = np.concatenate([cube.data for cube in seq_with_masks.data], axis=common_axis) seq_mask_concat = np.concatenate([cube.mask for cube in seq_with_masks.data], axis=common_axis) seq_concat = np.ma.masked_array(seq_data_concat, seq_mask_concat) seq_concat_km = np.ma.masked_array( np.concatenate([(cube.data * cube.unit).to(u.km).value for cube in seq_with_units.data], axis=common_axis), seq_mask_concat) # Derive expected axis_ranges for non-cube-like cases. x_axis_coords3 = np.array([0.4, 0.8, 1.2, 1.6]).reshape((1, 1, 4)) new_x_axis_coords3_shape = u.Quantity(seq.dimensions, unit=u.pix).value.astype(int) new_x_axis_coords3_shape[-1] = 1 none_axis_ranges_axis3 = [np.arange(0, len(seq.data)), np.array([0., 1.]), np.arange(0, 3), np.tile(np.array(x_axis_coords3), new_x_axis_coords3_shape)] none_axis_ranges_axis0 = [np.arange(len(seq.data)), np.array([0., 1.]), np.arange(0, 3), np.arange(0, int(seq.dimensions[-1].value))] distance0_none_axis_ranges_axis0 = \ [edges_to_centers_nd(_get_extra_coord_edges(seq.sequence_axis_extra_coords["distance"].value), 0), np.array([0., 1.]), np.arange(0, 3), np.arange(0, int(seq.dimensions[-1].value))] distance0_none_axis_ranges_axis0_mm = \ [edges_to_centers_nd(_get_extra_coord_edges(seq.sequence_axis_extra_coords["distance"].to( "mm").value), 0), np.array([0., 1.]), np.arange(0, 3), np.arange(0, int(seq.dimensions[-1].value))] userrangequantity_none_axis_ranges_axis0 = [ np.arange(int(seq.dimensions[0].value)), np.array([0., 1.]), np.arange(0, 3), np.arange(0, int(seq.dimensions[-1].value))] userrangequantity_none_axis_ranges_axis0_1e7 = [ (np.arange(int(seq.dimensions[0].value)) * u.J).to(u.erg).value, np.array([0., 1.]), np.arange(0, 3), np.arange(0, int(seq.dimensions[-1].value))] hi2_none_axis_ranges_axis2 = [ np.arange(0, len(seq.data)), np.array([0., 1.]), np.array([0, 1, 2]), np.arange(0, int(seq.dimensions[-1].value))] x_axis_coords1 = np.zeros(tuple([int(s.value) for s in seq.dimensions])) x_axis_coords1[0, 1] = 1. x_axis_coords1[1, 0] = 2. x_axis_coords1[1, 1] = 3. x_axis_coords1[2, 1] = 1. x_axis_coords1[3, 0] = 2. x_axis_coords1[3, 1] = 3. pix1_none_axis_ranges_axis1 = [ np.arange(0, len(seq.data)), x_axis_coords1, np.arange(0, 3), np.arange(0, int(seq.dimensions[-1].value))] # Derive expected extents seq_axis1_lim_deg = [0.49998731, 0.99989848] seq_axis1_lim_arcsec = [(axis1_xlim * u.deg).to(u.arcsec).value for axis1_xlim in seq_axis1_lim_deg] seq_axis2_lim_m = [seq[:, :, :, 0].data[0].axis_world_coords()[-1][0].value, seq[:, :, :, 0].data[0].axis_world_coords()[-1][-1].value] # Derive expected axis_ranges for cube-like cases. cube_like_new_x_axis_coords2_shape = u.Quantity( seq.cube_like_dimensions, unit=u.pix).value.astype(int) cube_like_new_x_axis_coords2_shape[-1] = 1 cubelike_none_axis_ranges_axis2 = [ np.arange(0, int(seq.cube_like_dimensions[0].value)), np.arange(0, 3), np.tile(x_axis_coords3, cube_like_new_x_axis_coords2_shape)] cubelike_none_axis_ranges_axis2_s = copy.deepcopy(cubelike_none_axis_ranges_axis2) cubelike_none_axis_ranges_axis2_s[2] = cubelike_none_axis_ranges_axis2_s[2] * 60. 
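# The factor of 60 above converts the time-axis coordinates from the WCS unit
# of minutes (CUNIT1 = 'min' in "ht") to seconds, matching the axes_units=u.s
# case exercised further down. The same conversion with astropy units, shown
# for illustration only:
#
#     >>> (np.array([0.4, 0.8, 1.2, 1.6]) * u.min).to_value(u.s)
#     array([24., 48., 72., 96.])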
cubelike_none_axis_ranges_axis0 = [[-0.5, 7.5], np.arange(0, 3), np.arange(0, int(seq.cube_like_dimensions[-1].value))] @pytest.mark.parametrize("test_input, test_kwargs, expected_values", [ (seq[:, 0, 0, 0], {}, (np.arange(len(seq.data)), np.array([1, 11, 1, 11]), "meta.obs.sequence [None]", "Data [None]", (0, len(seq[:, 0, 0, 0].data) - 1), (min([cube.data.min() for cube in seq[:, 0, 0, 0].data]), max([cube.data.max() for cube in seq[:, 0, 0, 0].data])))), (seq_with_units[:, 0, 0, 0], {}, (np.arange(len(seq_with_units.data)), np.array([1, 0.011, 1, 0.011]), "meta.obs.sequence [None]", "Data [km]", (0, len(seq_with_units[:, 0, 0, 0].data) - 1), (min([(cube.data * cube.unit).to(seq_with_units[:, 0, 0, 0].data[0].unit).value for cube in seq_with_units[:, 0, 0, 0].data]), max([(cube.data * cube.unit).to(seq_with_units[:, 0, 0, 0].data[0].unit).value for cube in seq_with_units[:, 0, 0, 0].data])))), (seq_with_uncertainty[:, 0, 0, 0], {}, (np.arange(len(seq_with_uncertainty.data)), np.array([1, 11, 1, 11]), "meta.obs.sequence [None]", "Data [None]", (0, len(seq_with_uncertainty[:, 0, 0, 0].data) - 1), (min([cube.data for cube in seq_with_uncertainty[:, 0, 0, 0].data]), max([cube.data for cube in seq_with_uncertainty[:, 0, 0, 0].data])))), (seq_with_units_and_uncertainty[:, 0, 0, 0], {}, (np.arange(len(seq_with_units_and_uncertainty.data)), np.array([1, 0.011, 1, 0.011]), "meta.obs.sequence [None]", "Data [km]", (0, len(seq_with_units_and_uncertainty[:, 0, 0, 0].data) - 1), (min([(cube.data * cube.unit).to(seq_with_units_and_uncertainty[:, 0, 0, 0].data[0].unit).value for cube in seq_with_units_and_uncertainty[:, 0, 0, 0].data]), max([(cube.data * cube.unit).to(seq_with_units_and_uncertainty[:, 0, 0, 0].data[0].unit).value for cube in seq_with_units_and_uncertainty[:, 0, 0, 0].data])))), (seq_with_units_and_some_uncertainty[:, 0, 0, 0], {}, (np.arange(len(seq_with_units_and_some_uncertainty.data)), np.array([1, 0.011, 1, 0.011]), "meta.obs.sequence [None]", "Data [km]", (0, len(seq_with_units_and_some_uncertainty[:, 0, 0, 0].data) - 1), (min([(cube.data * cube.unit).to( seq_with_units_and_some_uncertainty[:, 0, 0, 0].data[0].unit).value for cube in seq_with_units_and_some_uncertainty[:, 0, 0, 0].data]), max([(cube.data * cube.unit).to( seq_with_units_and_some_uncertainty[:, 0, 0, 0].data[0].unit).value for cube in seq_with_units_and_some_uncertainty[:, 0, 0, 0].data])))), (seq[:, 0, 0, 0], {"axes_coordinates": "distance"}, ((seq.sequence_axis_extra_coords["distance"]), np.array([1, 11, 1, 11]), "distance [{}]".format(seq.sequence_axis_extra_coords["distance"].unit), "Data [None]", (min(seq.sequence_axis_extra_coords["distance"].value), max(seq.sequence_axis_extra_coords["distance"].value)), (min([cube.data.min() for cube in seq[:, 0, 0, 0].data]), max([cube.data.max() for cube in seq[:, 0, 0, 0].data])))), (seq[:, 0, 0, 0], {"axes_coordinates": u.Quantity(np.arange(len(seq.data)), unit=u.cm), "axes_units": u.km}, (u.Quantity(np.arange(len(seq.data)), unit=u.cm).to(u.km), np.array([1, 11, 1, 11]), "meta.obs.sequence [km]", "Data [None]", (min(u.Quantity(np.arange(len(seq.data)), unit=u.cm).to(u.km).value), max(u.Quantity(np.arange(len(seq.data)), unit=u.cm).to(u.km).value)), (min([cube.data.min() for cube in seq[:, 0, 0, 0].data]), max([cube.data.max() for cube in seq[:, 0, 0, 0].data])))) ]) def test_sequence_plot_1D_plot(test_input, test_kwargs, expected_values): # Unpack expected values expected_x_data, expected_y_data, expected_xlabel, expected_ylabel, \ expected_xlim, expected_ylim = 
expected_values # Run plot method output = test_input.plot(**test_kwargs) # Check values are correct assert isinstance(output, matplotlib.axes.Axes) np.testing.assert_array_equal(output.lines[0].get_xdata(), expected_x_data) np.testing.assert_array_equal(output.lines[0].get_ydata(), expected_y_data) assert output.axes.get_xlabel() == expected_xlabel assert output.axes.get_ylabel() == expected_ylabel output_xlim = output.axes.get_xlim() assert output_xlim[0] <= expected_xlim[0] assert output_xlim[1] >= expected_xlim[1] output_ylim = output.axes.get_ylim() assert output_ylim[0] <= expected_ylim[0] assert output_ylim[1] >= expected_ylim[1] @pytest.mark.parametrize("test_input, test_kwargs, expected_values", [ (seq[:, :, 0, 0], {}, (np.array([0.49998731, 0.99989848, 0.49998731, 0.99989848, 0.49998731, 0.99989848, 0.49998731, 0.99989848]), np.array([1, 2, 11, 22, 1, 2, 11, 22]), "{} [{}]".format(seq[:, :, 0, 0].cube_like_world_axis_physical_types[common_axis], "deg"), "Data [None]", tuple(seq_axis1_lim_deg), (min([cube.data.min() for cube in seq[:, :, 0, 0].data]), max([cube.data.max() for cube in seq[:, :, 0, 0].data])))), (seq_with_units[:, :, 0, 0], {}, (np.array([0.49998731, 0.99989848, 0.49998731, 0.99989848, 0.49998731, 0.99989848, 0.49998731, 0.99989848]), np.array([1, 2, 0.011, 0.022, 1, 2, 0.011, 0.022]), "{} [{}]".format(seq[:, :, 0, 0].cube_like_world_axis_physical_types[common_axis], "deg"), "Data [km]", tuple(seq_axis1_lim_deg), (min([min((cube.data * cube.unit).to(u.km).value) for cube in seq_with_units[:, :, 0, 0].data]), max([max((cube.data * cube.unit).to(u.km).value) for cube in seq_with_units[:, :, 0, 0].data])))), (seq_with_uncertainty[:, :, 0, 0], {}, (np.array([0.49998731, 0.99989848, 0.49998731, 0.99989848, 0.49998731, 0.99989848, 0.49998731, 0.99989848]), np.array([1, 2, 11, 22, 1, 2, 11, 22]), "{} [{}]".format( seq_with_uncertainty[:, :, 0, 0].cube_like_world_axis_physical_types[ common_axis], "deg"), "Data [None]", tuple(seq_axis1_lim_deg), (min([cube.data.min() for cube in seq_with_uncertainty[:, :, 0, 0].data]), max([cube.data.max() for cube in seq_with_uncertainty[:, :, 0, 0].data])))), (seq_with_some_uncertainty[:, :, 0, 0], {}, (np.array([0.49998731, 0.99989848, 0.49998731, 0.99989848, 0.49998731, 0.99989848, 0.49998731, 0.99989848]), np.array([1, 2, 11, 22, 1, 2, 11, 22]), "{} [{}]".format( seq_with_some_uncertainty[:, :, 0, 0].cube_like_world_axis_physical_types[ common_axis], "deg"), "Data [None]", tuple(seq_axis1_lim_deg), (min([cube.data.min() for cube in seq_with_some_uncertainty[:, :, 0, 0].data]), max([cube.data.max() for cube in seq_with_some_uncertainty[:, :, 0, 0].data])))), (seq_with_units_and_uncertainty[:, :, 0, 0], {}, (np.array([0.49998731, 0.99989848, 0.49998731, 0.99989848, 0.49998731, 0.99989848, 0.49998731, 0.99989848]), np.array([1, 2, 0.011, 0.022, 1, 2, 0.011, 0.022]), "{} [{}]".format( seq_with_units_and_uncertainty[:, :, 0, 0].cube_like_world_axis_physical_types[ common_axis], "deg"), "Data [km]", tuple(seq_axis1_lim_deg), (min([min((cube.data * cube.unit).to(u.km).value) for cube in seq_with_units[:, :, 0, 0].data]), max([max((cube.data * cube.unit).to(u.km).value) for cube in seq_with_units[:, :, 0, 0].data])))), (seq_with_units_and_some_uncertainty[:, :, 0, 0], {}, (np.array([0.49998731, 0.99989848, 0.49998731, 0.99989848, 0.49998731, 0.99989848, 0.49998731, 0.99989848]), np.array([1, 2, 0.011, 0.022, 1, 2, 0.011, 0.022]), "{} [{}]".format( seq_with_units_and_some_uncertainty[:, :, 0, 0].cube_like_world_axis_physical_types[ 
common_axis], "deg"), "Data [km]", tuple(seq_axis1_lim_deg), (min([min((cube.data * cube.unit).to(u.km).value) for cube in seq_with_units[:, :, 0, 0].data]), max([max((cube.data * cube.unit).to(u.km).value) for cube in seq_with_units[:, :, 0, 0].data])))), (seq[:, :, 0, 0], {"axes_coordinates": "pix"}, (seq[:, :, 0, 0].common_axis_extra_coords["pix"].value, np.array([1, 2, 11, 22, 1, 2, 11, 22]), "pix [pix]", "Data [None]", (min(seq[:, :, 0, 0].common_axis_extra_coords["pix"].value), max(seq[:, :, 0, 0].common_axis_extra_coords["pix"].value)), (min([cube.data.min() for cube in seq[:, :, 0, 0].data]), max([cube.data.max() for cube in seq[:, :, 0, 0].data])))), (seq[:, :, 0, 0], {"axes_coordinates": np.arange(10, 10 + seq[:, :, 0, 0].cube_like_dimensions[0].value)}, (np.arange(10, 10 + seq[:, :, 0, 0].cube_like_dimensions[0].value), np.array([1, 2, 11, 22, 1, 2, 11, 22]), "{} [{}]".format("", None), "Data [None]", (10, 10 + seq[:, :, 0, 0].cube_like_dimensions[0].value - 1), (min([cube.data.min() for cube in seq[:, :, 0, 0].data]), max([cube.data.max() for cube in seq[:, :, 0, 0].data])))) ]) def test_sequence_plot_as_cube_1D_plot(test_input, test_kwargs, expected_values): # Unpack expected values expected_x_data, expected_y_data, expected_xlabel, expected_ylabel, \ expected_xlim, expected_ylim = expected_values # Run plot method output = test_input.plot_as_cube(**test_kwargs) # Check values are correct # Check type of ouput plot object assert isinstance(output, matplotlib.axes.Axes) # Check x and y data are correct. assert np.allclose(output.lines[0].get_xdata(), expected_x_data) assert np.allclose(output.lines[0].get_ydata(), expected_y_data) # Check x and y axis labels are correct. assert output.axes.get_xlabel() == expected_xlabel assert output.axes.get_ylabel() == expected_ylabel # Check all data is contained within x and y axes limits. output_xlim = output.axes.get_xlim() assert output_xlim[0] <= expected_xlim[0] assert output_xlim[1] >= expected_xlim[1] output_ylim = output.axes.get_ylim() assert output_ylim[0] <= expected_ylim[0] assert output_ylim[1] >= expected_ylim[1] def test_sequence_plot_as_cube_error(): with pytest.raises(TypeError): seq_no_common_axis.plot_as_cube() @pytest.mark.parametrize("test_input, test_kwargs, expected_values", [ (seq[:, :, 0, 0], {}, (seq_stack[:, :, 0, 0], "custom:pos.helioprojective.lat [deg]", "meta.obs.sequence [None]", tuple(seq_axis1_lim_deg + [0, len(seq.data) - 1]))), (seq_with_units[:, :, 0, 0], {}, (seq_stack_km[:, :, 0, 0], "custom:pos.helioprojective.lat [deg]", "meta.obs.sequence [None]", tuple(seq_axis1_lim_deg + [0, len(seq.data) - 1]))), (seq[:, :, 0, 0], {"plot_axis_indices": [0, 1]}, (seq_stack[:, :, 0, 0].transpose(), "meta.obs.sequence [None]", "custom:pos.helioprojective.lat [deg]", tuple([0, len(seq.data) - 1] + seq_axis1_lim_deg))), (seq[:, :, 0, 0], {"axes_coordinates": ["pix", "distance"]}, (seq_stack[:, :, 0, 0], "pix [pix]", "distance [cm]", (min(seq[0, :, 0, 0].extra_coords["pix"]["value"].value), max(seq[0, :, 0, 0].extra_coords["pix"]["value"].value), min(seq[:, :, 0, 0].sequence_axis_extra_coords["distance"].value), max(seq[:, :, 0, 0].sequence_axis_extra_coords["distance"].value)))), # This example shows weakness of current extra coord axis values on 2D plotting! # Only the coordinates from the first cube are shown. 
(seq[:, :, 0, 0], {"axes_coordinates": [np.arange( 10, 10 + seq[:, :, 0, 0].dimensions[-1].value), "distance"], "axes_units": [None, u.m]}, (seq_stack[:, :, 0, 0], " [None]", "distance [m]", (10, 10 + seq[:, :, 0, 0].dimensions[-1].value - 1, min(seq[:, :, 0, 0].sequence_axis_extra_coords["distance"].to(u.m).value), max(seq[:, :, 0, 0].sequence_axis_extra_coords["distance"].to(u.m).value)))), (seq[:, :, 0, 0], {"axes_coordinates": [np.arange( 10, 10 + seq[:, :, 0, 0].dimensions[-1].value) * u.deg, None], "axes_units": [u.arcsec, None]}, (seq_stack[:, :, 0, 0], " [arcsec]", "meta.obs.sequence [None]", tuple(list( (np.arange(10, 10 + seq[:, :, 0, 0].dimensions[-1].value) * u.deg).to(u.arcsec).value) \ + [0, len(seq.data) - 1]))) ]) def test_sequence_plot_2D_image(test_input, test_kwargs, expected_values): # Unpack expected values expected_data, expected_xlabel, expected_ylabel, expected_extent = expected_values # Run plot method output = test_input.plot(**test_kwargs) # Check values are correct assert isinstance(output, matplotlib.axes.Axes) np.testing.assert_array_equal(output.images[0].get_array(), expected_data) assert output.xaxis.get_label_text() == expected_xlabel assert output.yaxis.get_label_text() == expected_ylabel assert np.allclose(output.images[0].get_extent(), expected_extent, rtol=1e-3) # Also check x and y values????? @pytest.mark.parametrize("test_input, test_kwargs, expected_error", [ (seq[:, :, 0, 0], {"axes_coordinates": [ np.arange(10, 10 + seq[:, :, 0, 0].dimensions[-1].value), None], "axes_units": [u.m, None]}, ValueError), (seq[:, :, 0, 0], {"axes_coordinates": [ None, np.arange(10, 10 + seq[:, :, 0, 0].dimensions[0].value)], "axes_units": [None, u.m]}, ValueError) ]) def test_sequence_plot_2D_image_errors(test_input, test_kwargs, expected_error): with pytest.raises(expected_error): output = test_input.plot(**test_kwargs) @pytest.mark.parametrize("test_input, test_kwargs, expected_values", [ (seq[:, :, :, 0], {}, (seq_concat[:, :, 0], "em.wl [m]", "custom:pos.helioprojective.lat [deg]", tuple(seq_axis2_lim_m + seq_axis1_lim_deg))), (seq_with_units[:, :, :, 0], {}, (seq_concat_km[:, :, 0], "em.wl [m]", "custom:pos.helioprojective.lat [deg]", tuple(seq_axis2_lim_m + seq_axis1_lim_deg))), (seq[:, :, :, 0], {"plot_axis_indices": [0, 1], "axes_coordinates": ["pix", "hi"]}, (seq_concat[:, :, 0].transpose(), "pix [pix]", "hi [s]", ((seq[:, :, :, 0].common_axis_extra_coords["pix"][0].value, seq[:, :, :, 0].common_axis_extra_coords["pix"][-1].value, seq[:, :, :, 0].data[0].extra_coords["hi"]["value"][0].value, seq[:, :, :, 0].data[0].extra_coords["hi"]["value"][-1].value)))), (seq[:, :, :, 0], {"axes_coordinates": [ np.arange(10, 10 + seq[:, :, :, 0].cube_like_dimensions[-1].value) * u.m, np.arange(10, 10 + seq[:, :, :, 0].cube_like_dimensions[0].value) * u.m]}, (seq_concat[:, :, 0], " [m]", " [m]", (10, 10 + seq[:, :, :, 0].cube_like_dimensions[-1].value - 1, 10, 10 + seq[:, :, :, 0].cube_like_dimensions[0].value - 1))), (seq[:, :, :, 0], {"axes_coordinates": [ np.arange(10, 10 + seq[:, :, :, 0].cube_like_dimensions[-1].value) * u.m, np.arange(10, 10 + seq[:, :, :, 0].cube_like_dimensions[0].value) * u.m], "axes_units": ["cm", u.cm]}, (seq_concat[:, :, 0], " [cm]", " [cm]", (10 * 100, (10 + seq[:, :, :, 0].cube_like_dimensions[-1].value - 1) * 100, 10 * 100, (10 + seq[:, :, :, 0].cube_like_dimensions[0].value - 1) * 100))) ]) def test_sequence_plot_as_cube_2D_image(test_input, test_kwargs, expected_values): # Unpack expected values expected_data, expected_xlabel, 
expected_ylabel, expected_extent = expected_values # Run plot method output = test_input.plot_as_cube(**test_kwargs) # Check values are correct assert isinstance(output, matplotlib.axes.Axes) np.testing.assert_array_equal(output.images[0].get_array(), expected_data) assert output.xaxis.get_label_text() == expected_xlabel assert output.yaxis.get_label_text() == expected_ylabel assert np.allclose(output.images[0].get_extent(), expected_extent, rtol=1e-3) # Also check x and y values????? @pytest.mark.parametrize("test_input, test_kwargs, expected_error", [ (seq[:, :, :, 0], {"axes_coordinates": [ np.arange(10, 10 + seq[:, :, :, 0].cube_like_dimensions[-1].value), None], "axes_units": [u.m, None]}, ValueError), (seq[:, :, :, 0], {"axes_coordinates": [ None, np.arange(10, 10 + seq[:, :, :, 0].cube_like_dimensions[0].value)], "axes_units": [None, u.m]}, ValueError) ]) def test_sequence_plot_as_cube_2D_image_errors(test_input, test_kwargs, expected_error): with pytest.raises(expected_error): output = test_input.plot_as_cube(**test_kwargs) @pytest.mark.parametrize("test_input, test_kwargs, expected_data", [ (seq, {}, seq_stack.reshape(4, 1, 2, 3, 4)), (seq_with_units, {}, seq_stack_km.reshape(4, 1, 2, 3, 4)) ]) def test_sequence_plot_ImageAnimator(test_input, test_kwargs, expected_data): # Run plot method output = test_input.plot(**test_kwargs) # Check plot object properties are correct. assert isinstance(output, ndcube.mixins.sequence_plotting.ImageAnimatorNDCubeSequence) np.testing.assert_array_equal(output.data, expected_data) @pytest.mark.parametrize("test_input, test_kwargs, expected_data", [ (seq, {}, seq_concat.reshape(1, 8, 3, 4)), (seq_with_units, {}, seq_concat_km.reshape(1, 8, 3, 4)) ]) def test_sequence_plot_as_cube_ImageAnimator(test_input, test_kwargs, expected_data): # Run plot method output = test_input.plot_as_cube(**test_kwargs) # Check plot object properties are correct. 
    assert isinstance(output,
                      ndcube.mixins.sequence_plotting.ImageAnimatorCubeLikeNDCubeSequence)
    np.testing.assert_array_equal(output.data, expected_data)


@pytest.mark.parametrize("test_input, expected", [
    ((seq_with_unit0.data, None), (None, None)),
    ((seq_with_unit0.data, u.km), (None, None)),
    ((seq_with_units.data, None), ([u.km, u.m, u.km, u.m], u.km)),
    ((seq_with_units.data, u.cm), ([u.km, u.m, u.km, u.m], u.cm))])
def test_determine_sequence_units(test_input, expected):
    output_seq_unit, output_unit = ndcube.mixins.sequence_plotting._determine_sequence_units(
        test_input[0], unit=test_input[1])
    assert output_seq_unit == expected[0]
    assert output_unit == expected[1]


def test_determine_sequence_units_error():
    with pytest.raises(ValueError):
        output_seq_unit, output_unit = ndcube.mixins.sequence_plotting._determine_sequence_units(
            seq.data, u.m)


@pytest.mark.parametrize("test_input, expected", [
    ((3, 1, "time", u.s), ([1], [None, 'time', None], [None, u.s, None])),
    ((3, None, None, None), ([-1, -2], None, None))])
def test_prep_axes_kwargs(test_input, expected):
    output = ndcube.mixins.sequence_plotting._prep_axes_kwargs(*test_input)
    for i in range(3):
        assert output[i] == expected[i]


@pytest.mark.parametrize("test_input, expected_error", [
    ((3, [0, 1, 2], ["time", "pix"], u.s), ValueError),
    ((3, 0, ["time", "pix"], u.s), ValueError),
    ((3, 0, "time", [u.s, u.pix]), ValueError),
    ((3, 0, 0, u.s), TypeError),
    ((3, 0, "time", 0), TypeError)])
def test_prep_axes_kwargs_errors(test_input, expected_error):
    with pytest.raises(expected_error):
        output = ndcube.mixins.sequence_plotting._prep_axes_kwargs(*test_input)


@pytest.mark.parametrize("test_input, test_kwargs, expected_values", [
    (seq, {"plot_axis_indices": 3},
     (seq_stack.data, none_axis_ranges_axis3, "time [min]", "Data [None]",
      (none_axis_ranges_axis3[-1].min(), none_axis_ranges_axis3[-1].max()),
      (seq_stack.data.min(), seq_stack.data.max()))),
    (seq_with_units, {"plot_axis_indices": -1, "data_unit": u.km},
     (seq_stack_km.data, none_axis_ranges_axis3, "time [min]", "Data [km]",
      (none_axis_ranges_axis3[-1].min(), none_axis_ranges_axis3[-1].max()),
      (seq_stack_km.data.min(), seq_stack_km.data.max()))),
    (seq_with_masks, {"plot_axis_indices": 0},
     (seq_stack, none_axis_ranges_axis0, "meta.obs.sequence [None]", "Data [None]",
      (none_axis_ranges_axis0[0].min(), none_axis_ranges_axis0[0].max()),
      (seq_stack.data.min(), seq_stack.data.max()))),
    (seq_with_some_masks, {"plot_axis_indices": 0},
     (seq_stack, none_axis_ranges_axis0, "meta.obs.sequence [None]", "Data [None]",
      (none_axis_ranges_axis0[0].min(), none_axis_ranges_axis0[0].max()),
      (seq_stack.data.min(), seq_stack.data.max()))),
    (seq, {"plot_axis_indices": 0, "axes_coordinates": "distance"},
     (seq_stack.data, distance0_none_axis_ranges_axis0, "distance [cm]", "Data [None]",
      (seq.sequence_axis_extra_coords["distance"].value.min(),
       seq.sequence_axis_extra_coords["distance"].value.max()),
      (seq_stack.data.min(), seq_stack.data.max()))),
    (seq, {"plot_axis_indices": 0, "axes_coordinates": "distance", "axes_units": "mm"},
     (seq_stack.data, distance0_none_axis_ranges_axis0_mm, "distance [mm]", "Data [None]",
      (seq.sequence_axis_extra_coords["distance"].to("mm").value.min(),
       seq.sequence_axis_extra_coords["distance"].to("mm").value.max()),
      (seq_stack.data.min(), seq_stack.data.max()))),
    (seq, {"plot_axis_indices": 0,
           "axes_coordinates": _get_extra_coord_edges(
               userrangequantity_none_axis_ranges_axis0[0]) * u.J},
     (seq_stack.data, userrangequantity_none_axis_ranges_axis0, " [J]", "Data [None]",
(userrangequantity_none_axis_ranges_axis0[0].min(), userrangequantity_none_axis_ranges_axis0[0].max()), (seq_stack.data.min(), seq_stack.data.max()))), (seq, {"plot_axis_indices": 0, "axes_units": u.erg, "axes_coordinates": _get_extra_coord_edges(userrangequantity_none_axis_ranges_axis0[0]) * u.J}, (seq_stack.data, userrangequantity_none_axis_ranges_axis0_1e7, " [erg]", "Data [None]", (userrangequantity_none_axis_ranges_axis0_1e7[0].min(), userrangequantity_none_axis_ranges_axis0_1e7[0].max()), (seq_stack.data.min(), seq_stack.data.max()))), (seq, {"plot_axis_indices": 2, "axes_coordinates": "hi"}, (seq_stack.data, hi2_none_axis_ranges_axis2, "hi [s]", "Data [None]", (hi2_none_axis_ranges_axis2[2].min(), hi2_none_axis_ranges_axis2[2].max()), (seq_stack.data.min(), seq_stack.data.max()))), (seq, {"plot_axis_indices": 1, "axes_coordinates": "pix"}, (seq_stack.data, pix1_none_axis_ranges_axis1, "pix [pix]", "Data [None]", (pix1_none_axis_ranges_axis1[1].min(), pix1_none_axis_ranges_axis1[1].max()), (seq_stack.data.min(), seq_stack.data.max()))) ]) def test_sequence_plot_LineAnimator(test_input, test_kwargs, expected_values): # Unpack expected values expected_data, expected_axis_ranges, expected_xlabel, \ expected_ylabel, expected_xlim, expected_ylim = expected_values # Run plot method. output = test_input.plot(**test_kwargs) # Check right type of plot object is produced. assert isinstance(output, ndcube.mixins.sequence_plotting.LineAnimatorNDCubeSequence) # Check data being plotted is correct np.testing.assert_array_equal(output.data, expected_data) if isinstance(expected_data, np.ma.core.MaskedArray): np.testing.assert_array_equal(output.data.mask, expected_data.mask) # Check values of axes and sliders is correct. for i in range(len(output.axis_ranges)): _test_axis_ranges(output.axis_ranges[i], expected_axis_ranges[i]) # Check plot axis labels and limits are correct assert output.xlabel == expected_xlabel assert output.ylabel == expected_ylabel assert output.xlim == expected_xlim assert output.ylim == expected_ylim @pytest.mark.parametrize("test_input, test_kwargs, expected_values", [ (seq, {"plot_axis_indices": 2, "axes_units": u.s}, (seq_concat.data, cubelike_none_axis_ranges_axis2_s, "time [s]", "Data [None]", (cubelike_none_axis_ranges_axis2_s[2].min(), cubelike_none_axis_ranges_axis2_s[2].max()), (seq_concat.data.min(), seq_concat.data.max()))), (seq, {"plot_axis_indices": 0}, (seq_concat.data, cubelike_none_axis_ranges_axis0, "custom:pos.helioprojective.lat [deg]", "Data [None]", (0, 7), (seq_concat.data.min(), seq_concat.data.max()))), (seq_with_masks, {"plot_axis_indices": 0}, (seq_concat.data, cubelike_none_axis_ranges_axis0, "custom:pos.helioprojective.lat [deg]", "Data [None]", (0, 7), (seq_concat.data.min(), seq_concat.data.max()))), (seq_with_some_masks, {"plot_axis_indices": -3}, (seq_concat.data, cubelike_none_axis_ranges_axis0, "custom:pos.helioprojective.lat [deg]", "Data [None]", (0, 7), (seq_concat.data.min(), seq_concat.data.max()))), (seqm, {"plot_axis_indices": 0}, (seq_concat.data, cubelike_none_axis_ranges_axis0, "custom:pos.helioprojective.lon [deg]", "Data [None]", (0, 7), (seq_concat.data.min(), seq_concat.data.max()))) ]) def test_sequence_plot_as_cube_LineAnimator(test_input, test_kwargs, expected_values): # Unpack expected values expected_data, expected_axis_ranges, expected_xlabel, \ expected_ylabel, expected_xlim, expected_ylim = expected_values # Run plot method. output = test_input.plot_as_cube(**test_kwargs) # Check right type of plot object is produced. 
    assert isinstance(output,
                      ndcube.mixins.sequence_plotting.LineAnimatorCubeLikeNDCubeSequence)
    # Check data being plotted is correct
    np.testing.assert_array_equal(output.data, expected_data)
    if isinstance(expected_data, np.ma.core.MaskedArray):
        np.testing.assert_array_equal(output.data.mask, expected_data.mask)
    # Check values of axes and sliders are correct.
    for i in range(len(output.axis_ranges)):
        _test_axis_ranges(output.axis_ranges[i], expected_axis_ranges[i])
    # Check plot axis labels and limits are correct
    assert output.xlabel == expected_xlabel
    assert output.ylabel == expected_ylabel
    assert output.xlim == expected_xlim
    assert output.ylim == expected_ylim


def _test_axis_ranges(axis_ranges, expected_ranges):
    if callable(axis_ranges):
        assert np.allclose(axis_ranges(np.arange(len(expected_ranges))), expected_ranges)
    else:
        assert np.allclose(axis_ranges, expected_ranges)


# === ndcube-1.4.2/ndcube/tests/test_utils_collection.py ===

import pytest

import astropy.units as u

from ndcube.utils import collection as collection_utils


@pytest.mark.parametrize("data_dimensions1,data_dimensions2,data_axes1,data_axes2", [
    ([3., 4., 5.] * u.pix, [3., 5., 15.] * u.pix, (0, 2), (0, 1))])
def test_assert_aligned_axes_compatible(data_dimensions1, data_dimensions2,
                                        data_axes1, data_axes2):
    collection_utils.assert_aligned_axes_compatible(data_dimensions1, data_dimensions2,
                                                    data_axes1, data_axes2)


@pytest.mark.parametrize("data_dimensions1,data_dimensions2,data_axes1,data_axes2", [
    ([3., 4., 5.] * u.pix, [3., 5., 15.] * u.pix, (0, 1), (0, 1)),
    ([3., 4., 5.] * u.pix, [3., 5., 15.] * u.pix, (0, 1), (0, 1, 2))])
def test_assert_aligned_axes_compatible_error(data_dimensions1, data_dimensions2,
                                              data_axes1, data_axes2):
    with pytest.raises(ValueError):
        collection_utils.assert_aligned_axes_compatible(data_dimensions1, data_dimensions2,
                                                        data_axes1, data_axes2)


# === ndcube-1.4.2/ndcube/tests/test_utils_cube.py ===

import pytest
import unittest

import numpy as np
import astropy.units as u

from ndcube import utils

missing_axes_none = [False] * 3
missing_axes_0_2 = [True, False, True]
missing_axes_1 = [False, True, False]
axes_length = 3
extra_coords_dict = {
    "time": {"axis": 0, "value": u.Quantity(range(axes_length), unit=u.pix)},
    "hello": {"axis": 1, "value": u.Quantity(range(axes_length), unit=u.pix)}}
extra_coords_input = [('time', 0, u.Quantity(range(axes_length), unit=u.pix)),
                      ('hello', 1, u.Quantity(range(axes_length), unit=u.pix))]
extra_coords_dict_wcs = {
    "time": {"wcs axis": 0, "value": u.Quantity(range(axes_length), unit=u.pix)},
    "hello": {"wcs axis": 1, "value": u.Quantity(range(axes_length), unit=u.pix)}}


@pytest.mark.parametrize(
    "test_input,expected",
    [((None, missing_axes_none), None), ((0, missing_axes_none), 2),
     ((1, missing_axes_none), 1), ((0, missing_axes_0_2), 1), ((1, missing_axes_1), 0),
     ((-1, missing_axes_0_2), 1), ((-2, missing_axes_1), 2), ((-1, missing_axes_none), 0)])
def test_data_axis_to_wcs_axis(test_input, expected):
    assert utils.cube.data_axis_to_wcs_axis(*test_input) == expected


@pytest.mark.parametrize("test_input", [(-2, missing_axes_0_2), (1, missing_axes_0_2)])
def test_data_axis_to_wcs_axis_error(test_input):
    with pytest.raises(IndexError):
        utils.cube.data_axis_to_wcs_axis(*test_input)


@pytest.mark.parametrize(
    "test_input,expected",
    [((None, missing_axes_none), None), ((0, missing_axes_none), 2),
     ((1, missing_axes_none), 1), ((1, missing_axes_0_2), 0), ((0, missing_axes_1), 1),
     ((-1, missing_axes_0_2), None), ((-2, missing_axes_0_2), 0),
     ((-2, missing_axes_1), None), ((-3, missing_axes_1), 1), ((-1, missing_axes_none), 0)])
def test_wcs_axis_to_data_axis(test_input, expected):
    assert utils.cube.wcs_axis_to_data_axis(*test_input) == expected


@pytest.mark.parametrize("test_input", [(-10, missing_axes_0_2), (10, missing_axes_0_2)])
def test_wcs_axis_to_data_axis_error(test_input):
    with pytest.raises(IndexError):
        utils.cube.wcs_axis_to_data_axis(*test_input)


def test_select_order():
    lists = [['TIME', 'WAVE', 'HPLT-TAN', 'HPLN-TAN'],
             ['WAVE', 'HPLT-TAN', 'UTC', 'HPLN-TAN'],
             ['HPLT-TAN', 'TIME', 'HPLN-TAN'],
             ['HPLT-TAN', 'DEC--TAN', 'WAVE'],
             [],
             ['UTC', 'TIME', 'WAVE', 'HPLT-TAN']]
    results = [
        [0, 1, 2, 3],
        [2, 0, 1, 3],
        [1, 0, 2],  # Second order is initial order
        [2, 0, 1],
        [],
        [1, 0, 2, 3]
    ]
    for (l, r) in zip(lists, results):
        assert utils.cube.select_order(l) == r


@pytest.mark.parametrize("test_input", [
    ([('name', 0)], [False, False], (1, 2)),
    ([(0, 0, 0)], [False, False], (1, 2)),
    ([('name', '0', 0)], [False, False], (1, 2)),
    ([('name', 0, [0, 1])], [False, False], (1, 2))
])
def test_format_input_extra_coords_to_extra_coords_wcs_axis_value(test_input):
    with pytest.raises(ValueError):
        utils.cube._format_input_extra_coords_to_extra_coords_wcs_axis(*test_input)


@pytest.mark.parametrize("test_input,expected", [
    ((extra_coords_dict, missing_axes_none), extra_coords_input),
    ((extra_coords_dict_wcs, missing_axes_none),
     [('time', 2, u.Quantity(range(axes_length), unit=u.pix)),
      ('hello', 1, u.Quantity(range(axes_length), unit=u.pix))]),
    ((extra_coords_dict_wcs, missing_axes_1),
     [('time', 1, u.Quantity(range(axes_length), unit=u.pix)),
      ('hello', None, u.Quantity(range(axes_length), unit=u.pix))])
])
def test_convert_extra_coords_dict_to_input_format(test_input, expected):
    output = utils.cube.convert_extra_coords_dict_to_input_format(*test_input)
    if len(output) != len(expected):
        raise AssertionError(f"{output} != {expected}")
    for output_tuple in output:
        j = 0
        while j < len(expected):
            if output_tuple[0] == expected[j][0]:
                assert len(output_tuple) == len(expected[j])
                for k, el in enumerate(output_tuple):
                    try:
                        assert el == expected[j][k]
                    except ValueError as err:
                        if err.args[0] == ("The truth value of an array with more than"
                                           " one element is ambiguous."
                                           " Use a.any() or a.all()"):
                            assert (el == expected[j][k]).all()
                        else:
                            raise err
                # Name matched: set j beyond len(expected) as a sentinel of success.
                j = len(expected) + 1
            else:
                j += 1
        if j == len(expected):
            raise AssertionError(f"{output} != {expected}")


def test_convert_extra_coords_dict_to_input_format_error():
    with pytest.raises(KeyError):
        utils.cube.convert_extra_coords_dict_to_input_format(
            {"time": {"not axis": 0, "value": []}}, missing_axes_none)


@pytest.mark.parametrize("test_input, expected", [
    ((5, False), np.asarray([0, 1, 2, 3, 4])),
    ((6, True), np.asarray([-0.5, 0.5, 1.5, 2.5, 3.5, 4.5, 5.5]))
])
def test_pixel_centers_or_edges(test_input, expected):
    output = utils.cube._pixel_centers_or_edges(*test_input)
    assert isinstance(output, np.ndarray)
    np.testing.assert_allclose(output, expected)


@pytest.mark.parametrize("test_input, expected", [
    ((5, False), 5),
    ((6, True), 7)
])
def test_get_dimension_for_pixel(test_input, expected):
    output = utils.cube._get_dimension_for_pixel(*test_input)
    assert output == expected


# === ndcube-1.4.2/ndcube/tests/test_utils_sequence.py ===

import pytest
import unittest

import numpy as np

from ndcube import utils

# sample data for tests
tuple_item0 = (0, slice(0, 3))
tuple_item1 = (slice(0, 2), slice(0, 3), slice(None))
tuple_item2 = (slice(3, 1, -1), slice(0, 3), slice(None))
tuple_item3 = (slice(4, None, -2), slice(0, 3), slice(None))
n_cubes = 4


@pytest.mark.parametrize("test_input,expected", [
    ((1, n_cubes), [utils.sequence.SequenceItem(sequence_index=1, cube_item=slice(None))]),
    ((slice(None), 2),
     [utils.sequence.SequenceItem(sequence_index=0, cube_item=slice(None)),
      utils.sequence.SequenceItem(sequence_index=1, cube_item=slice(None))]),
    ((slice(0, 2), 3),
     [utils.sequence.SequenceItem(sequence_index=0, cube_item=slice(None)),
      utils.sequence.SequenceItem(sequence_index=1, cube_item=slice(None))]),
    ((slice(1, 4, 2), 5),
     [utils.sequence.SequenceItem(sequence_index=1, cube_item=slice(None)),
      utils.sequence.SequenceItem(sequence_index=3, cube_item=slice(None))]),
    ((slice(3, 1, -1), 5),
     [utils.sequence.SequenceItem(sequence_index=3, cube_item=slice(None)),
      utils.sequence.SequenceItem(sequence_index=2, cube_item=slice(None))]),
    ((tuple_item0, n_cubes),
     [utils.sequence.SequenceItem(sequence_index=0, cube_item=tuple_item0[1])]),
    ((tuple_item1, n_cubes),
     [utils.sequence.SequenceItem(sequence_index=0, cube_item=tuple_item1[1:]),
      utils.sequence.SequenceItem(sequence_index=1, cube_item=tuple_item1[1:])]),
    ((tuple_item2, n_cubes),
     [utils.sequence.SequenceItem(sequence_index=3, cube_item=tuple_item1[1:]),
      utils.sequence.SequenceItem(sequence_index=2, cube_item=tuple_item1[1:])]),
    ((tuple_item3, n_cubes),
     [utils.sequence.SequenceItem(sequence_index=4, cube_item=tuple_item1[1:]),
      utils.sequence.SequenceItem(sequence_index=2, cube_item=tuple_item1[1:]),
      utils.sequence.SequenceItem(sequence_index=0, cube_item=tuple_item1[1:])])
])
def test_convert_item_to_sequence_items(test_input, expected):
    unit_tester = unittest.TestCase()
    unit_tester.assertEqual(
        utils.sequence.convert_item_to_sequence_items(*test_input), expected)


def test_convert_item_to_sequence_items_error():
    with pytest.raises(TypeError):
        utils.sequence.convert_item_to_sequence_items('item')


@pytest.mark.parametrize("test_input,expected", [
    # Test int cube_like_items.
((0, 0, np.array([3])), [utils.sequence.SequenceItem(sequence_index=0, cube_item=0)]), ((5, 0, np.array([3, 3])), [utils.sequence.SequenceItem(sequence_index=1, cube_item=2)]), # Below test reveals function doesn't work with negative int indexing. # ((-1, 0, np.array([3, 3])), [utils.sequence.SequenceItem(sequence_index=1, cube_item=2)]), # Test slice cube_like_items. ((slice(0, 2), 0, np.array([3])), [utils.sequence.SequenceItem(sequence_index=0, cube_item=slice(0, 2, 1))]), ((slice(1, 4), 0, np.array([3, 3])), [utils.sequence.SequenceItem(sequence_index=0, cube_item=slice(1, 3, 1)), utils.sequence.SequenceItem(sequence_index=1, cube_item=slice(0, 1, 1))]), ((slice(1, 7, 2), 0, np.array([3, 5])), [utils.sequence.SequenceItem(sequence_index=0, cube_item=slice(1, 3, 2)), utils.sequence.SequenceItem(sequence_index=1, cube_item=slice(1, 4, 2))]), ((slice(1, 7, 3), 0, np.array([3, 5])), [utils.sequence.SequenceItem(sequence_index=0, cube_item=slice(1, 2, 3)), utils.sequence.SequenceItem(sequence_index=1, cube_item=slice(1, 4, 3))]), # Below test reveals function doesn't work with negative stepping. # ((slice(6, 1, -1), 0, np.array([3, 5])), # [utils.sequence.SequenceItem(sequence_index=1, cube_item=slice(3, 0, -1)), # utils.sequence.SequenceItem(sequence_index=0, cube_item=slice(2, 1, -1))]), # Test tuple cube_like_items (((0, 0, slice(1, 10)), 0, np.array([3, 5])), [utils.sequence.SequenceItem(sequence_index=0, cube_item=(0, 0, slice(1, 10, None)))]), (((0, 0, slice(1, 10)), 1, np.array([3, 5])), [utils.sequence.SequenceItem(sequence_index=0, cube_item=(0, 0, slice(1, 10, None)))]), (((slice(2, 10), 0, slice(1, 10)), 0, np.array([3, 5, 5])), [utils.sequence.SequenceItem(sequence_index=0, cube_item=(slice(2, 3, 1), 0, slice(1, 10, None))), utils.sequence.SequenceItem(sequence_index=1, cube_item=(slice(0, 5, 1), 0, slice(1, 10, None))), utils.sequence.SequenceItem(sequence_index=2, cube_item=(slice(0, 2, 1), 0, slice(1, 10, None)))]), (((0, slice(2, 10), slice(1, 10)), 1, np.array([3, 5, 5])), [utils.sequence.SequenceItem(sequence_index=0, cube_item=(0, slice(2, 3, 1), slice(1, 10, None))), utils.sequence.SequenceItem(sequence_index=1, cube_item=(0, slice(0, 5, 1), slice(1, 10, None))), utils.sequence.SequenceItem(sequence_index=2, cube_item=(0, slice(0, 2, 1), slice(1, 10, None)))]), ]) def test_convert_cube_like_item_to_sequence_items(test_input, expected): unit_tester = unittest.TestCase() unit_tester.assertEqual( utils.sequence.convert_cube_like_item_to_sequence_items(*test_input), expected) @pytest.mark.parametrize("test_input", [ (0, 1, np.array(3)), (slice(None), 1, np.array(3)), ((0, 1), 2, np.array([3, 3, 3])), (('item', 2), 0, np.array([3, 3, 3])) ]) def test_convert_cube_like_item_to_sequence_items_value_error(test_input): with pytest.raises(ValueError): utils.sequence.convert_cube_like_item_to_sequence_items(*test_input) def test_convert_cube_like_item_to_sequence_items_type_error(): with pytest.raises(TypeError): utils.sequence.convert_cube_like_item_to_sequence_items('item', 1, np.array(3)) @pytest.mark.parametrize("test_input,expected", [ ((5, np.array([8] * 4)), utils.sequence.SequenceSlice(0, 5)), ((8, np.array([8] * 4)), utils.sequence.SequenceSlice(1, 0)), ((20, np.array([8] * 4)), utils.sequence.SequenceSlice(2, 4)), ((50, np.array([8] * 4)), utils.sequence.SequenceSlice(3, 8)), ]) def test_convert_cube_like_index_to_sequence_slice(test_input, expected): assert utils.sequence._convert_cube_like_index_to_sequence_slice( *test_input) == expected 
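# For equal-length cubes, the index mapping checked above reduces to divmod:
# a cube-like index n with cube length L maps to sequence index n // L and
# common-axis index n % L. An illustrative sketch of the expected behaviour
# for in-range indices only (not the implementation, which must also handle
# unequal cube lengths):
#
#     >>> divmod(8, 8)    # cube-like index 8 -> 2nd cube, position 0
#     (1, 0)
#     >>> divmod(20, 8)   # cube-like index 20 -> 3rd cube, position 4
#     (2, 4)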
@pytest.mark.parametrize("test_input,expected", [((slice(2, 5), np.array([8] * 4)), [utils.sequence.SequenceSlice(0, slice(2, 5, 1))]), ((slice(5, 15), np.array([8] * 4)), [ utils.sequence.SequenceSlice(0, slice(5, 8, 1)), utils.sequence.SequenceSlice(1, slice(0, 7, 1)) ]), ((slice(5, 16), np.array([8] * 4)), [ utils.sequence.SequenceSlice(0, slice(5, 8, 1)), utils.sequence.SequenceSlice(1, slice(0, 8, 1)) ]), ((slice(5, 23), np.array([8] * 4)), [ utils.sequence.SequenceSlice(0, slice(5, 8, 1)), utils.sequence.SequenceSlice(1, slice(0, 8, 1)), utils.sequence.SequenceSlice(2, slice(0, 7, 1)) ]), ((slice(5, 100), np.array([8] * 4)), [ utils.sequence.SequenceSlice(0, slice(5, 8, 1)), utils.sequence.SequenceSlice(1, slice(0, 8, 1)), utils.sequence.SequenceSlice(2, slice(0, 8, 1)), utils.sequence.SequenceSlice(3, slice(0, 8, 1)) ])]) def test_convert_cube_like_slice_to_sequence_slices(test_input, expected): assert utils.sequence._convert_cube_like_slice_to_sequence_slices(*test_input) == expected @pytest.mark.parametrize( "test_input,expected", [((slice(0, 10), 20), slice(0, 10, 1)), ((slice(0, 10, 2), 20), slice(0, 10, 2)), ((slice(None, 0, -1), 20), slice(20, 0, -1))]) def test_convert_slice_nones_to_ints(test_input, expected): assert utils.sequence.convert_slice_nones_to_ints(*test_input) == expected @pytest.mark.parametrize("test_input,expected", [ ((utils.sequence.SequenceSlice(0, 0), 1), utils.sequence.SequenceItem(0, (slice(None), 0))) ]) def test_convert_sequence_slice_to_sequence_item(test_input, expected): unit_tester = unittest.TestCase() unit_tester.assertEqual( utils.sequence._convert_sequence_slice_to_sequence_item(*test_input), expected) ././@PaxHeader0000000000000000000000000000003400000000000011452 xustar000000000000000028 mtime=1605787057.1544762 ndcube-1.4.2/ndcube/tests/test_utils_wcs.py0000644000175100001640000001352200000000000021300 0ustar00vstsdocker00000000000000import pytest import unittest import numpy as np import astropy.wcs from ndcube import utils from ndcube.tests import helpers ht = {'CTYPE3': 'HPLT-TAN', 'CUNIT3': 'deg', 'CDELT3': 0.5, 'CRPIX3': 0, 'CRVAL3': 0, 'NAXIS3': 2, 'CTYPE2': 'WAVE ', 'CUNIT2': 'Angstrom', 'CDELT2': 0.2, 'CRPIX2': 0, 'CRVAL2': 0, 'NAXIS2': 3, 'CTYPE1': 'TIME ', 'CUNIT1': 'min', 'CDELT1': 0.4, 'CRPIX1': 0, 'CRVAL1': 0, 'NAXIS1': 4} wt = utils.wcs.WCS(header=ht, naxis=3) ht_with_celestial = { 'CTYPE4': 'HPLN-TAN', 'CUNIT4': 'deg', 'CDELT4': 1, 'CRPIX4': 0, 'CRVAL4': 0, 'NAXIS4': 1, 'CNAME4': 'redundant axis', 'CROTA4': 0, 'CTYPE3': 'HPLT-TAN', 'CUNIT3': 'deg', 'CDELT3': 0.5, 'CRPIX3': 0, 'CRVAL3': 0, 'NAXIS3': 2, 'CTYPE2': 'WAVE ', 'CUNIT2': 'Angstrom', 'CDELT2': 0.2, 'CRPIX2': 0, 'CRVAL2': 0, 'NAXIS2': 3, 'CTYPE1': 'TIME ', 'CUNIT1': 'min', 'CDELT1': 0.4, 'CRPIX1': 0, 'CRVAL1': 0, 'NAXIS1': 4} hm = {'CTYPE1': 'WAVE ', 'CUNIT1': 'Angstrom', 'CDELT1': 0.2, 'CRPIX1': 0, 'CRVAL1': 10, 'NAXIS1': 4, 'CTYPE2': 'HPLT-TAN', 'CUNIT2': 'deg', 'CDELT2': 0.5, 'CRPIX2': 2, 'CRVAL2': 0.5, 'NAXIS2': 3, 'CTYPE3': 'HPLN-TAN', 'CUNIT3': 'deg', 'CDELT3': 0.4, 'CRPIX3': 2, 'CRVAL3': 1, 'NAXIS3': 2} wm = utils.wcs.WCS(header=hm, naxis=3) hm_reindexed_102 = { 'CTYPE2': 'WAVE ', 'CUNIT2': 'Angstrom', 'CDELT2': 0.2, 'CRPIX2': 0, 'CRVAL2': 10, 'NAXIS2': 4, 'CTYPE1': 'HPLT-TAN', 'CUNIT1': 'deg', 'CDELT1': 0.5, 'CRPIX1': 2, 'CRVAL1': 0.5, 'NAXIS1': 3, 'CTYPE3': 'HPLN-TAN', 'CUNIT3': 'deg', 'CDELT3': 0.4, 'CRPIX3': 2, 'CRVAL3': 1, 'NAXIS3': 2} wm_reindexed_102 = utils.wcs.WCS(header=hm_reindexed_102, naxis=3) @pytest.fixture def axis_correlation_matrix(): return 
_axis_correlation_matrix() def _axis_correlation_matrix(): shape = (4, 4) acm = np.zeros(shape, dtype=bool) for i in range(min(shape)): acm[i, i] = True acm[0, 1] = True acm[1, 0] = True acm[-1, 0] = True return acm @pytest.fixture def test_wcs(): return TestWCS() class TestWCS(): def __init__(self): self.world_axis_physical_types = [ 'custom:pos.helioprojective.lon', 'custom:pos.helioprojective.lat', 'em.wl', 'time'] self.axis_correlation_matrix = _axis_correlation_matrix() @pytest.mark.parametrize("test_input,expected", [(ht, True), (hm, False)]) def test_wcs_needs_augmenting(test_input, expected): assert utils.wcs.WCS._needs_augmenting(test_input) is expected @pytest.mark.parametrize("test_input,expected", [((ht, 3), ht_with_celestial)]) def test_wcs_augment(test_input, expected): unit_tester = unittest.TestCase() unit_tester.assertEqual(utils.wcs.WCS._augment(*test_input), expected) @pytest.mark.parametrize( "test_input,expected", [({}, False), ([slice(1, 5), slice(-1, -5, -2)], True)]) def test_all_slice(test_input, expected): assert utils.wcs._all_slice(test_input) == expected @pytest.mark.parametrize( "test_input,expected", [({}, []), ((slice(1, 2), slice(1, 3), 2, slice(2, 4), 8), [slice(1, 2, None), slice(1, 3, None), slice(2, 3, None), slice(2, 4, None), slice(8, 9, None)])]) def test_slice_list(test_input, expected): assert utils.wcs._slice_list(test_input) == expected @pytest.mark.parametrize("test_input,expected", [ ((wm, np.array([1, 0, 2])), wm_reindexed_102), ((wm, np.array([1, 0, -1])), wm_reindexed_102) ]) def test_reindex_wcs(test_input, expected): print(utils.wcs.reindex_wcs(*test_input)) print(expected) helpers.assert_wcs_are_equal(utils.wcs.reindex_wcs(*test_input), expected) @pytest.mark.parametrize("test_input", [ (TypeError, wm, 0), (TypeError, wm, np.array(['spam', 'eggs', 'ham'])), ]) def test_reindex_wcs_errors(test_input): with pytest.raises(test_input[0]): utils.wcs.reindex_wcs(*test_input[1:]) @pytest.mark.parametrize("test_input,expected", [ ((wm, 0, [False, False, False]), (0, 1)), ((wm, 1, [False, False, False]), (0, 1)), ((wm, 2, [False, False, False]), (2,)), ((wm, 1, [False, False, True]), (1,)) ]) def test_get_dependent_data_axes(test_input, expected): output = utils.wcs.get_dependent_data_axes(*test_input) assert output == expected @pytest.mark.parametrize("test_input,expected", [ ((wm, 0), (0,)), ((wm, 1), (1, 2)), ((wm, 2), (1, 2)), ]) def test_get_dependent_wcs_axes(test_input, expected): output = utils.wcs.get_dependent_wcs_axes(*test_input) assert output == expected @pytest.mark.parametrize("test_input,expected", [ (wm, np.array([[True, False, False], [False, True, True], [False, True, True]])), (wt, np.array([[True, False, False, False], [False, True, False, False], [False, False, True, True], [False, False, True, True]])), (wm_reindexed_102, np.array([[True, False, True], [False, True, False], [True, False, True]])) ]) def test_axis_correlation_matrix(test_input, expected): assert (utils.wcs.axis_correlation_matrix(test_input) == expected).all() def test_convert_between_array_and_pixel_axes(): test_input = np.array([1, 4, -2]) naxes = 5 expected = np.array([3, 0, 1]) output = utils.wcs.convert_between_array_and_pixel_axes(test_input, naxes) assert all(output == expected) def test_pixel_axis_to_world_axes(axis_correlation_matrix): output = utils.wcs.pixel_axis_to_world_axes(0, axis_correlation_matrix) expected = np.array([0, 1, 3]) assert all(output == expected) @pytest.mark.parametrize("test_input,expected", [('wl', 2), ('em.wl', 2)]) def 
test_physical_type_to_world_axis(test_input, expected): world_axis_physical_types = ['custom:pos.helioprojective.lon', 'custom:pos.helioprojective.lat', 'em.wl', 'time'] output = utils.wcs.physical_type_to_world_axis(test_input, world_axis_physical_types) assert output == expected ././@PaxHeader0000000000000000000000000000003400000000000011452 xustar000000000000000028 mtime=1605787217.3871849 ndcube-1.4.2/ndcube/utils/0000755000175100001640000000000000000000000015646 5ustar00vstsdocker00000000000000././@PaxHeader0000000000000000000000000000003400000000000011452 xustar000000000000000028 mtime=1605787057.1544762 ndcube-1.4.2/ndcube/utils/__init__.py0000644000175100001640000000017500000000000017762 0ustar00vstsdocker00000000000000# Licensed under a 3-clause BSD style license - see LICENSE.rst from . import cube from . import sequence from . import wcs ././@PaxHeader0000000000000000000000000000003400000000000011452 xustar000000000000000028 mtime=1605787057.1544762 ndcube-1.4.2/ndcube/utils/collection.py0000644000175100001640000001657400000000000020370 0ustar00vstsdocker00000000000000import numbers import numpy as np import astropy.units as u def _sanitize_aligned_axes(keys, data, aligned_axes): if aligned_axes is None: return None # If aligned_axes set to "all", assume all axes are aligned in order. elif isinstance(aligned_axes, str) and aligned_axes.lower() == "all": # Check all cubes are of same shape cube0_dims = data[0].dimensions cubes_same_shape = all([all([d.dimensions[i] == dim for i, dim in enumerate(cube0_dims)]) for d in data]) if cubes_same_shape is not True: raise ValueError( "All cubes in data not of same shape. Please set aligned_axes kwarg.") sanitized_axes = tuple([tuple(range(len(cube0_dims)))] * len(data)) else: # Else, sanitize user-supplied aligned axes. sanitized_axes = _sanitize_user_aligned_axes(data, aligned_axes) return dict(zip(keys, sanitized_axes)) def _sanitize_user_aligned_axes(data, aligned_axes): """ Converts input aligned_axes to standard format. aligned_axes can be supplied by the user in a few ways: *. A tuple of tuples of ints, where each tuple corresponds to a cube in the collection, and each int designates the an aligned axis in numpy order. In this case, the axis represented by the 0th int in the 0th tuple is aligned with the 0th int in the 1st tuple and so on. *. A single tuple of ints if all aligned axes are in the same order. *. A single int if only one axis is aligned and if the aligned axis in each cube is in the same order. """ aligned_axes_error_message = ("aligned_axes must contain ints or " "a tuple of ints for each element in data.") if isinstance(data[0].dimensions, tuple): cube0_dims = np.array(data[0].dimensions, dtype=object)[np.array(aligned_axes[0])] else: cube0_dims = data[0].dimensions[np.array(aligned_axes[0])] # If user entered a single int or string, convert to length 1 tuple of int. if isinstance(aligned_axes, int): aligned_axes = (aligned_axes,) if not isinstance(aligned_axes, tuple): raise ValueError(aligned_axes_error_message) # Check type of each element. axes_all_ints = all([isinstance(axis, int) for axis in aligned_axes]) axes_all_tuples = all([isinstance(axis, tuple) for axis in aligned_axes]) # If all elements are int, duplicate tuple so there is one for each cube. n_cubes = len(data) if axes_all_ints: n_aligned_axes = len(aligned_axes) aligned_axes = tuple([aligned_axes for i in range(n_cubes)]) # If all elements are tuple, ensure there is a tuple for each cube and # all elements of each sub-tuple are ints. 
elif axes_all_tuples: if len(aligned_axes) != n_cubes: raise ValueError("aligned_axes must have a tuple for each element in data.") n_aligned_axes = len(aligned_axes[0]) # Ensure all elements of sub-tuples are ints, # each tuple has the same number of aligned axes, # number of aligned axes are <= number of cube dimensions, # and the dimensions of the aligned axes in each cube are the same. subtuples_are_ints = [False] * n_cubes aligned_axes_same_lengths = [False] * n_cubes if not all([len(axes) == n_aligned_axes for axes in aligned_axes]): raise ValueError("Each element in aligned_axes must have same length.") for i in range(n_cubes): # Check each cube has at least as many dimensions as there are aligned axes # and that all cubes have enough dimensions to accommodate aligned axes. n_cube_dims = len(data[i].dimensions) max_aligned_axis = max(aligned_axes[i]) if n_cube_dims < max([max_aligned_axis, n_aligned_axes]): raise ValueError( "Each cube in data must have at least as many axes as aligned axes " "and aligned axis indices must be less than number of cube axes.\n" f"Cube number: {i};\n" f"Number of cube dimensions: {n_cube_dims};\n" f"No. aligned axes: {n_aligned_axes};\n" f"Highest aligned axis: {max_aligned_axis}") subtuple_types = [False] * n_aligned_axes cube_lengths_equal = [False] * n_aligned_axes for j, axis in enumerate(aligned_axes[i]): subtuple_types[j] = isinstance(axis, numbers.Integral) cube_lengths_equal[j] = data[i].dimensions[axis] == cube0_dims[j] subtuples_are_ints[i] = all(subtuple_types) aligned_axes_same_lengths[i] = all(cube_lengths_equal) if not all(subtuples_are_ints): raise ValueError(aligned_axes_error_message) if not all(aligned_axes_same_lengths): raise ValueError("Aligned cube/sequence axes must be of same length.") else: raise ValueError(aligned_axes_error_message) # Ensure all aligned axes are of same length. check_dimensions = set([len(set([cube.dimensions[cube_aligned_axes[j]] for cube, cube_aligned_axes in zip(data, aligned_axes)])) for j in range(n_aligned_axes)]) if check_dimensions != {1}: raise ValueError("Aligned axes are not all of same length.") return aligned_axes def _update_aligned_axes(drop_aligned_axes_indices, aligned_axes, first_key): # Remove dropped axes from aligned_axes. MUST BE A BETTER WAY TO DO THIS. if len(drop_aligned_axes_indices) <= 0: new_aligned_axes = tuple(aligned_axes.values()) elif len(drop_aligned_axes_indices) == len(aligned_axes[first_key]): new_aligned_axes = None else: new_aligned_axes = [] for key in aligned_axes.keys(): cube_aligned_axes = np.array(aligned_axes[key]) for drop_axis_index in drop_aligned_axes_indices: drop_axis = cube_aligned_axes[drop_axis_index] cube_aligned_axes = np.delete(cube_aligned_axes, drop_axis_index) w = np.where(cube_aligned_axes > drop_axis) cube_aligned_axes[w] -= 1 w = np.where(drop_aligned_axes_indices > drop_axis_index) drop_aligned_axes_indices[w] -= 1 new_aligned_axes.append(tuple(cube_aligned_axes)) new_aligned_axes = tuple(new_aligned_axes) return new_aligned_axes def assert_aligned_axes_compatible(data_dimensions1, data_dimensions2, data_axes1, data_axes2): """ Checks whether two sets of aligned axes are compatible. Parameters ---------- data_dimensions1: sequence of ints The dimension lengths of data cube 1. data_dimensions2: sequence of ints The dimension lengths of data cube 2. data_axes1: `tuple` of `int` The aligned axes of data cube 1. data_axes2: `tuple` of `int` The aligned axes of data cube 2. """ # Confirm same number of aligned axes. 
if len(data_axes1) != len(data_axes2): raise ValueError("Number of aligned axes must be equal: " f"{len(data_axes1)} != {len(data_axes2)}") # Confirm dimension lengths of each aligned axis is the same. if not all(data_dimensions1[np.array(data_axes1)] == data_dimensions2[np.array(data_axes2)]): raise ValueError("All corresponding aligned axes between cubes must be of same length.") ././@PaxHeader0000000000000000000000000000003400000000000011452 xustar000000000000000028 mtime=1605787057.1544762 ndcube-1.4.2/ndcube/utils/cube.py0000644000175100001640000002226700000000000017147 0ustar00vstsdocker00000000000000 """ Utilities for ndcube. """ import numbers import numpy as np from astropy.units import Quantity __all__ = ['wcs_axis_to_data_axis', 'data_axis_to_wcs_axis', 'select_order', 'convert_extra_coords_dict_to_input_format', 'get_axis_number_from_axis_name'] def data_axis_to_wcs_axis(data_axis, missing_axes): """ Converts a data axis number to the corresponding wcs axis number. """ if data_axis is None: result = None else: if data_axis < 0: data_axis = np.invert(missing_axes).sum() + data_axis if data_axis > np.invert(missing_axes).sum() - 1 or data_axis < 0: raise IndexError("Data axis out of range. Number data axes = {}".format( np.invert(missing_axes).sum())) result = len(missing_axes) - np.where(np.cumsum( [b is False for b in missing_axes][::-1]) == data_axis + 1)[0][0] - 1 return result def wcs_axis_to_data_axis(wcs_axis, missing_axes): """ Converts a wcs axis number to the corresponding data axis number. """ if wcs_axis is None: result = None else: if wcs_axis < 0: wcs_axis = len(missing_axes) + wcs_axis if wcs_axis > len(missing_axes) - 1 or wcs_axis < 0: raise IndexError("WCS axis out of range. Number WCS axes = {}".format( len(missing_axes))) if missing_axes[wcs_axis]: result = None else: data_ordered_wcs_axis = len(missing_axes) - wcs_axis - 1 result = data_ordered_wcs_axis - sum(missing_axes[::-1][:data_ordered_wcs_axis]) return result def select_order(axtypes): """ Returns indices of the correct data order axis priority given a list of WCS CTYPEs. For example, given ['HPLN-TAN', 'TIME', 'WAVE'] it will return [1, 2, 0] because index 1 (time) has the lowest priority, followed by wavelength and finally solar-x. Parameters ---------- axtypes: str list The list of CTYPEs to be modified. """ order = sorted([(0, t) if t in ['TIME', 'UTC'] else (1, t) if t == 'WAVE' else (2, t) if t == 'HPLT-TAN' else (axtypes.index(t) + 3, t) for t in axtypes]) result = [axtypes.index(s) for (_, s) in order] return result def _format_input_extra_coords_to_extra_coords_wcs_axis(extra_coords, missing_axes, data_shape): extra_coords_wcs_axis = {} coord_format_error = ("Coord must have three properties supplied, " "name (str), axis (int), values (Quantity or array-like)." " Input coord: {0}") coord_0_format_error = ("1st element of extra coordinate tuple must be a " "string giving the coordinate's name.") coord_1_format_error = ("2nd element of extra coordinate tuple must be None " "or an int giving the data axis " "to which the coordinate corresponds.") coord_len_error = ("extra coord ({0}) must have same length as data axis " "to which it is assigned: coord length, {1} != data axis length, {2}") for coord in extra_coords: # Check extra coord has the right number and types of info. 
        if len(coord) != 3:
            raise ValueError(coord_format_error.format(coord))
        if not isinstance(coord[0], str):
            raise ValueError(coord_0_format_error)
        if coord[1] is not None and not isinstance(coord[1], numbers.Integral):
            raise ValueError(coord_1_format_error)
        # Unless the extra coord corresponds to a missing axis, check that the
        # length of the coord is the same as the data axis to which it corresponds.
        if coord[1] is not None:
            if not missing_axes[::-1][coord[1]]:
                if len(coord[2]) != data_shape[coord[1]]:
                    raise ValueError(coord_len_error.format(coord[0], len(coord[2]),
                                                            data_shape[coord[1]]))
        # Determine wcs axis corresponding to data axis of coord.
        extra_coords_wcs_axis[coord[0]] = {
            "wcs axis": data_axis_to_wcs_axis(coord[1], missing_axes),
            "value": coord[2]}
    return extra_coords_wcs_axis


def convert_extra_coords_dict_to_input_format(extra_coords, missing_axes):
    """
    Converts NDCube.extra_coords attribute to format required as input for new NDCube.

    Parameters
    ----------
    extra_coords: `dict`
        An NDCube.extra_coords instance.

    missing_axes: `list` of `bool`
        Indicates which axes of the WCS are "missing", i.e. do not correspond
        to a data axis.

    Returns
    -------
    input_format: `list`
        Information on extra coords in format required by `NDCube.__init__`.
    """
    coord_names = list(extra_coords.keys())
    result = []
    for name in coord_names:
        coord_keys = list(extra_coords[name].keys())
        if "wcs axis" in coord_keys and "axis" not in coord_keys:
            axis = wcs_axis_to_data_axis(extra_coords[name]["wcs axis"], missing_axes)
        elif "axis" in coord_keys and "wcs axis" not in coord_keys:
            axis = extra_coords[name]["axis"]
        else:
            raise KeyError("extra coords dict can have keys 'wcs axis' or 'axis'. Not both.")
        result.append((name, axis, extra_coords[name]["value"]))
    return result


def get_axis_number_from_axis_name(axis_name, world_axis_physical_types):
    """
    Returns axis number (numpy ordering) given a substring unique to a world axis type string.

    Parameters
    ----------
    axis_name: `str`
        Name or substring of name of axis as defined by
        NDCube.world_axis_physical_types.

    world_axis_physical_types: iterable of `str`
        Output from NDCube.world_axis_physical_types for relevant cube,
        i.e. iterable of string axis names.

    Returns
    -------
    axis_index[0]: `int`
        Axis number (numpy ordering) corresponding to axis name.
    """
    axis_index = [axis_name in world_axis_type
                  for world_axis_type in world_axis_physical_types]
    axis_index = np.arange(len(world_axis_physical_types))[axis_index]
    if len(axis_index) != 1:
        raise ValueError("User defined axis with a string that is not unique to "
                         "a physical axis type. {} not in any of {}".format(
                             axis_name, world_axis_physical_types))
    return axis_index[0]


def _pixel_centers_or_edges(axis_length, edges):
    """
    Returns a range of pixel_values or pixel_edges.

    Parameters
    ----------
    axis_length: `int`
        The length of the axis.

    edges: `bool`
        Signifies whether pixel_edges or pixel_values are requested.
        False stands for pixel_values, while True stands for pixel_edges.

    Returns
    -------
    `np.ndarray`
        The axis values for the given input.
    """
    if edges is False:
        axis_values = np.arange(axis_length)
    else:
        axis_values = np.arange(-0.5, axis_length + 0.5)
    return axis_values


def _get_dimension_for_pixel(axis_length, edges):
    """
    Returns the dimension length for pixel edges or pixel values.
Parameters ---------- axis_length : `int` The length of the axis edges : `bool` Boolean to signify whether pixel_edge or pixel_value requested False stands for pixel_value, while True stands for pixel_edge """ return axis_length + 1 if edges else axis_length def _get_extra_coord_edges(value, axis=-1): """Gets the pixel_edges from the pixel_values Parameters ---------- value : `astropy.units.Quantity` or array-like The Quantity object containing the values for a given `extra_coords` axis : `int` The axis about which pixel_edges needs to be calculated Default value is -1, which is the last axis for a ndarray """ # Checks for corner cases if not isinstance(value, np.ndarray): value = np.array(value) # Get the shape of the Quantity object shape = value.shape if len(shape) == 1: shape = len(value) if isinstance(value, Quantity): edges = np.zeros(shape + 1) * value.unit else: edges = np.zeros(shape + 1) # Calculate the pixel_edges from the given pixel_values edges[1:-1] = value[:-1] + (value[1:] - value[:-1]) / 2 edges[0] = value[0] - (value[1] - value[0]) / 2 edges[-1] = value[-1] + (value[-1] - value[-2]) / 2 else: # Edit the shape of the new ndarray to increase the length # by one for a given axis shape = list(shape) shape[axis] += 1 shape = tuple(shape) if isinstance(value, Quantity): edges = np.zeros(shape) * value.unit else: edges = np.zeros(shape) # Shift the axis which is point of interest to last axis value = np.moveaxis(value, axis, -1) edges = np.moveaxis(edges, axis, -1) # Calculate the pixel_edges from the given pixel_values edges[..., 1:-1] = value[..., :-1] + (value[..., 1:] - value[..., :-1]) / 2 edges[..., 0] = value[..., 0] - (value[..., 1] - value[..., 0]) / 2 edges[..., -1] = value[..., -1] + (value[..., -1] - value[..., -2]) / 2 # Revert the shape of the edges array edges = np.moveaxis(edges, -1, axis) return edges ././@PaxHeader0000000000000000000000000000003400000000000011452 xustar000000000000000028 mtime=1605787057.1544762 ndcube-1.4.2/ndcube/utils/sequence.py0000644000175100001640000006764300000000000020050 0ustar00vstsdocker00000000000000 """ Utilities for ndcube sequence. """ from copy import deepcopy from collections import namedtuple from functools import singledispatch import numpy as np import astropy.units as u __all__ = ['SequenceSlice', 'SequenceItem', 'slice_sequence', 'convert_item_to_sequence_items', 'convert_cube_like_item_to_sequence_items', 'convert_slice_nones_to_ints'] SequenceSlice = namedtuple("SequenceSlice", "sequence_index common_axis_item") """ Define SequenceSlice named tuple of length 2. Its attributes are: sequence_index: an int giving the index of a cube within an NDCubeSequence. common_axis_item: slice of int index of to be to be applied to the common axis of the cube. """ SequenceItem = namedtuple("SequenceItem", "sequence_index cube_item") """ Define SequenceItem named tuple of length 2. Its attributes are: sequence_index: an int giving the index of a cube within an NDCubeSequence. cube_item: item (int, slice, tuple) to be applied to cube identified by sequence_index attribute. """ def slice_sequence(cubesequence, item): """ Slice an NDCubeSequence given a slicing/index item. Parameters ---------- cubesequence: `ndcube.NDCubeSequence` The cubesequence to slice. item: `int`, `slice`, or `tuple` of `int` and/or `slice`. An slice/index item compatible with input to NDCubeSequence.__getitem__. Returns ------- result: `NDCubeSequence` or `NDCube` The sliced cube sequence. 
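
    Examples
    --------
    >>> # Hypothetical sequence of NDCubes, following the +SKIP doctest
    >>> # convention used elsewhere in this module.
    >>> sliced = slice_sequence(cs, (slice(0, 2), 0))  # doctest: +SKIP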
""" if item is None or (isinstance(item, tuple) and None in item): raise IndexError("None indices not supported") # Convert item to list of SequenceSlices sequence_items = convert_item_to_sequence_items(item, len(cubesequence.data)) return slice_sequence_by_sequence_items(cubesequence, sequence_items) @singledispatch def convert_item_to_sequence_items(item, n_cubes=None, cube_item=None): """ Converts NDCubeSequence __getitem__ item to list of SequenceSlice objects. Parameters ---------- item: `int`, `slice`, or `tuple` of `int` and/or `slice`. An slice/index item compatible with input to NDCubeSequence.__getitem__. n_cubes: `int` Number of cubes in NDCubeSequence being sliced. Must be supplied, but not used if item type is `int` or `slice`. Returns ------- result: `list` of SequenceItem `namedtuple`. The slice/index items for each relevant NDCube within the NDCubeSequence which together represent the original input slice/index item. """ # If type if the first input of this function does not match the # type of first input of one of the below registered functions, # raise an error. Otherwise one of the below registered functions # is executed. raise TypeError("Unrecognized slice type: {0}", item) @convert_item_to_sequence_items.register(int) def _get_sequence_items_from_int_item(int_item, n_cubes=None, cube_item=slice(None)): """ Converts int index of an NDCubeSequence to list of SequenceSlices. Parameters ---------- int_item: `int` index of NDCube within NDCubeSequence to be slices out. n_cubes: `None` Not used. Exists in API to be consistent with API of convert_item_to_sequence_items() to which it this function is registered under single dispatch. cube_item: `int`, `slice`, or `tuple` Item to be applied to selected NDCube. Returns ------- result: `list` of SequenceItem `namedtuple` The slice/index items for each relevant NDCube within the NDCubeSequence which together represent the original input slice/index item. """ return [SequenceItem(int_item, cube_item)] @convert_item_to_sequence_items.register(slice) def _get_sequence_items_from_slice_item(slice_item, n_cubes, cube_item=slice(None)): """ Converts slice item of an NDCubeSequence to list of SequenceSlices. Parameters ---------- slice_item: `slice` Indicates which NDCubes within NDCubeSequence are to be slices out. n_cubes: `int` Number of cubes in NDCubeSequence being sliced. cube_item: `int`, `slice`, or `tuple` Item to be applied to each selected NDCube. Returns ------- sequence_items: `list` of SequenceItem `namedtuple`. The slice/index items for each relevant NDCube within the NDCubeSequence which together represent the original input slice/index item. """ # If there are None types in slice, replace with correct entries based on sign of step. no_none_slice = convert_slice_nones_to_ints(slice_item, n_cubes) # Derive SequenceItems for each cube. Recall that # once convert_slice_nones_to_ints() has been applied, a None will # only be present to signify the beginning of the array when the # step is negative. Therefore, if the stop parmeter of the above # slice object is None, set the stop condition of the below for # loop to -1. if no_none_slice.stop is None: stop = -1 else: stop = no_none_slice.stop # If slice has interval length 1, make sequence index length 1 slice to # ensure dimension is not dropped in accordance with slicing convention. 
    if abs(stop - no_none_slice.start) == 1:
        sequence_items = [SequenceItem(slice_item, cube_item)]
    else:
        sequence_items = [SequenceItem(i, cube_item)
                          for i in range(no_none_slice.start, stop, no_none_slice.step)]
    return sequence_items


@convert_item_to_sequence_items.register(tuple)
def _get_sequence_items_from_tuple_item(tuple_item, n_cubes, cube_item=None):
    """
    Converts NDCubeSequence slice item tuple to list of SequenceSlice objects.

    Parameters
    ----------
    tuple_item: `tuple` of `int` and/or `slice`.
        Index/slice for different dimensions of NDCubeSequence.
        The first entry applies to the sequence axis while subsequent entries
        make up the slicing item to be applied to the NDCubes.

    n_cubes: `int`
        Number of cubes in NDCubeSequence being sliced.

    cube_item: `None`
        Not used. Exists in API to be consistent with API of
        convert_item_to_sequence_items() to which this function is
        registered under single dispatch.

    Returns
    -------
    sequence_items: `list` of SequenceItem `namedtuple`.
        The slice/index items for each relevant NDCube within the NDCubeSequence
        which together represent the original input slice/index item.
    """
    # Define slice to be applied to cubes.
    if len(tuple_item[1:]) == 1:
        cube_item = tuple_item[1]
    else:
        cube_item = tuple_item[1:]
    # Based on type of sequence index, define sequence slices.
    sequence_items = convert_item_to_sequence_items(tuple_item[0], n_cubes=n_cubes,
                                                    cube_item=cube_item)
    return sequence_items


def slice_sequence_by_sequence_items(cubesequence, sequence_items):
    """
    Slices an NDCubeSequence given a list of SequenceSlice objects.

    Parameters
    ----------
    cubesequence: `ndcube.NDCubeSequence`
        The cubesequence to slice.

    sequence_items: `list` of `SequenceItem`
        Slices to be applied to each relevant NDCube in the sequence.

    Returns
    -------
    result: `NDCubeSequence` or `NDCube`
        The sliced cube sequence.
    """
    result = deepcopy(cubesequence)
    if len(sequence_items) == 1:
        # If sequence item is an interval length 1 slice, ensure an NDCubeSequence
        # is returned in accordance with slicing convention.
        # Due to code up to this point, if sequence item is a slice, it can only
        # be an interval length 1 slice.
        if isinstance(sequence_items[0].sequence_index, slice):
            result.data = [result.data[sequence_items[0].sequence_index.start][
                sequence_items[0].cube_item]]
        else:
            result = result.data[sequence_items[0].sequence_index][
                sequence_items[0].cube_item]
    else:
        data = [result.data[sequence_item.sequence_index][sequence_item.cube_item]
                for sequence_item in sequence_items]
        result.data = data
    return result


def _index_sequence_as_cube(cubesequence, item):
    """
    Enables NDCubeSequence to be indexed as if it were a single NDCube.

    This is only possible if cubesequence._common_axis is set, i.e. if the
    cubes are sequenced in order along one of the cube axes. For example, if
    cubesequence._common_axis is 1 and that axis is time, and the cubes are
    sequenced chronologically such that the last time slice of one cube is
    directly followed in time by the first time slice of the next cube, then
    this function allows the NDCubeSequence to be indexed as though all cubes
    were combined into one ordered along the time axis.

    Parameters
    ----------
    cubesequence: `ndcube.NDCubeSequence`
        The cubesequence to get the item from.

    item: `int`, `slice` or `tuple` of `int` and/or `slice`.
        The item to get from the cube. If a tuple, its length must be <= the
        number of dimensions in a single cube.
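    Returns
    -------
    result: `NDCubeSequence` or `NDCube`
        The sliced sequence, indexed as though all cubes were concatenated
        along the common axis.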
Example ------- >>> # Say we have three Cubes each cube has common_axis=1 is time and shape=(3,3,3) >>> data_list = [cubeA, cubeB, cubeC] # doctest: +SKIP >>> cs = NDCubeSequence(data_list, meta=None, common_axis=1) # doctest: +SKIP >>> # return zeroth time slice of cubeB in via normal CubeSequence indexing. >>> cs[1,:,0,:] # doctest: +SKIP >>> # Return same slice using this function >>> _index_sequence_as_cube(cs, (slice(0, cubeB.shape[0]), 0, ... (slice(0, cubeB.shape[2])) # doctest: +SKIP """ # Convert index_as_cube item to a list of regular NDCubeSequence # items of each relevant cube. common_axis_cube_lengths = np.array([c.data.shape[cubesequence._common_axis] for c in cubesequence.data]) sequence_items = convert_cube_like_item_to_sequence_items(item, cubesequence._common_axis, common_axis_cube_lengths) # Use sequence items to slice NDCubeSequence. return slice_sequence_by_sequence_items(cubesequence, sequence_items) def convert_cube_like_item_to_sequence_items(cube_like_item, common_axis, common_axis_cube_lengths): """ Converts an input item to NDCubeSequence.index_as_cube to a list od SequenceSlice objects. Parameters ---------- cube_like_item: `int`, `slice`, of `tuple` of `int and/or `slice`. Item compatible with input to NDCubeSequence.index_as_cube. common_axis: `int` Data axis of NDCubes common to NDCubeSequence common_axis_cube_lengths: `np.array` Length of each cube in sequence along the common axis. Returns ------- sequence_items: `list` of SequenceItem `namedtuple`. The slice/index items for each relevant NDCube within the NDCubeSequence which together represent the original input slice/index item. """ invalid_item_error_message = "Invalid index/slice input." # Case 1: Item is int and common axis is 0. if isinstance(cube_like_item, int): if common_axis != 0: raise ValueError("Input can only be indexed with an int if " "CubeSequence's common axis is 0. common " "axis = {}".format(common_axis)) else: # Derive list of SequenceSlice objects that describes the # cube_like_item in regular slicing notation. sequence_slices = [_convert_cube_like_index_to_sequence_slice( cube_like_item, common_axis_cube_lengths)] all_axes_item = None # Case 2: Item is slice and common axis is 0. elif isinstance(cube_like_item, slice): if common_axis != 0: raise ValueError("Input can only be sliced with a single slice if " "CubeSequence's common axis is 0. common " "axis = {}".format(common_axis)) else: # Derive list of SequenceSlice objects that describes the # cube_like_item in regular slicing notation. # First ensure None types within slice are replaced with appropriate ints. sequence_slices = _convert_cube_like_slice_to_sequence_slices( cube_like_item, common_axis_cube_lengths) all_axes_item = None # Case 3: Item is tuple. elif isinstance(cube_like_item, tuple): # Check item is long enough to include common axis. if len(cube_like_item) < common_axis + 1: raise ValueError("Input item not long enough to include common axis." "Must have length > {}".format(common_axis)) # Based on type of slice/index in the common axis position of # the cube_like_item, derive list of SequenceSlice objects that # describes the cube_like_item in regular slicing notation. 
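        # For example (hypothetical): with common_axis=1 and
        # cube_like_item=(0, slice(2, 5)), the slice(2, 5) entry is the one
        # translated into per-cube common axis slices below.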
if isinstance(cube_like_item[common_axis], int): sequence_slices = [_convert_cube_like_index_to_sequence_slice( cube_like_item[common_axis], common_axis_cube_lengths)] elif isinstance(cube_like_item[common_axis], slice): sequence_slices = _convert_cube_like_slice_to_sequence_slices( cube_like_item[common_axis], common_axis_cube_lengths) else: raise ValueError(invalid_item_error_message) all_axes_item = cube_like_item else: raise TypeError("Unrecognized item type.") # Convert the sequence slices, that only describe the slicing along # the sequence axis and common axis to sequence items which # additionally describe how the non-common cube axes should be sliced. sequence_items = [_convert_sequence_slice_to_sequence_item( sequence_slice, common_axis, cube_like_item=all_axes_item) for sequence_slice in sequence_slices] return sequence_items def _convert_cube_like_index_to_sequence_slice(cube_like_index, common_axis_cube_lengths): """ Converts a cube-like index of an NDCubeSequence to indices along the sequence and common axes. Parameters ---------- cube_like_index: `int` Cube-like index of NDCubeSequence common_axis_cube_lengths: iterable of `int` Length of each cube along common axis. Returns ------- sequence_slice: SequenceSlice `namedtuple`. First element gives index of cube along sequence axis. Second element each index along common axis of relevant cube. """ # Derive cumulative lengths of cubes along common axis. cumul_common_axis_cube_lengths = np.cumsum(common_axis_cube_lengths) # If cube_like_index is within 0th cube in sequence, it is # simple to determine the sequence and common axis indices. try: index_in_0th_cube = cube_like_index < cumul_common_axis_cube_lengths[0] except TypeError as err: none_not_int_error_messages = [ "'>' not supported between instances of 'int' and 'NoneType'", "unorderable types: int() > NoneType()"] if err.args[0] in none_not_int_error_messages: index_in_0th_cube = True else: raise err if index_in_0th_cube: sequence_index = 0 common_axis_index = cube_like_index # Else use more in-depth method. else: # Determine the index of the relevant cube within the sequence # from the cumulative common axis cube lengths. sequence_index = np.where(cumul_common_axis_cube_lengths <= cube_like_index)[0][-1] if cube_like_index > cumul_common_axis_cube_lengths[-1] - 1: # If the cube is out of range then return the last common axis index. common_axis_index = common_axis_cube_lengths[-1] else: # Else use simple equation to derive the relevant common axis index. common_axis_index = cube_like_index - cumul_common_axis_cube_lengths[sequence_index] # sequence_index should be plus one as the sequence_index earlier is # previous index if it is not already the last cube index. if sequence_index < cumul_common_axis_cube_lengths.size - 1: sequence_index += 1 # Return sequence and cube indices. Ensure they are int, rather # than np.int64 to avoid confusion in checking type elsewhere. if common_axis_index is not None: common_axis_index = int(common_axis_index) return SequenceSlice(int(sequence_index), common_axis_index) def _convert_cube_like_slice_to_sequence_slices(cube_like_slice, common_axis_cube_lengths): """ Converts common axis slice input to NDCubeSequence.index_as_cube to a list of sequence indices. Parameters ---------- cube_like_slice: `slice` Slice along common axis in NDCubeSequence.index_as_cube item. common_axis_cube_lengths: iterable of `int` Length of each cube along common axis. Returns ------- sequence_slices: `list` of SequenceSlice `namedtuple`. 
List sequence slices (sequence axis, common axis) for each element along common axis represented by input cube_like_slice. """ # Ensure any None attributes in input slice are filled with appropriate ints. cumul_common_axis_cube_lengths = np.cumsum(common_axis_cube_lengths) # Determine sequence indices of cubes included in cube-like slice. cube_like_indices = np.arange(cumul_common_axis_cube_lengths[-1])[cube_like_slice] n_cube_like_indices = len(cube_like_indices) one_step_sequence_slices = np.empty(n_cube_like_indices, dtype=object) # Define array of ints for all indices along common axis. # This is restricted to range of interest below. sequence_int_indices = np.zeros(n_cube_like_indices, dtype=int) for i in range(n_cube_like_indices): one_step_sequence_slices[i] = _convert_cube_like_index_to_sequence_slice( cube_like_indices[i], common_axis_cube_lengths) sequence_int_indices[i] = one_step_sequence_slices[i].sequence_index unique_index = np.sort(np.unique(sequence_int_indices, return_index=True)[1]) unique_sequence_indices = sequence_int_indices[unique_index] # Convert start and stop cube-like indices to sequence indices. first_sequence_index = _convert_cube_like_index_to_sequence_slice(cube_like_slice.start, common_axis_cube_lengths) last_sequence_index = _convert_cube_like_index_to_sequence_slice(cube_like_slice.stop, common_axis_cube_lengths) # Since the last index of any slice represents # 'up to but not including this element', if the last sequence index # is the first element of a new cube, elements from the last cube # will not appear in the sliced sequence. Therefore for ease of # slicing, we can redefine the final sequence index as the penultimate # cube and its common axis index as beyond the range of the # penultimate cube's length along the common axis. if (last_sequence_index.sequence_index > first_sequence_index.sequence_index and last_sequence_index.common_axis_item == 0): last_sequence_index = SequenceSlice( last_sequence_index.sequence_index - 1, common_axis_cube_lengths[last_sequence_index.sequence_index - 1]) # Iterate through relevant cubes and determine slices for each. # Do last cube outside loop as its end index may not correspond to # the end of the cube's common axis. if cube_like_slice.step is None: step = 1 else: step = cube_like_slice.step sequence_slices = [] common_axis_start_index = first_sequence_index.common_axis_item j = 0 while j < len(unique_sequence_indices) - 1: # Let i be the index along the sequence axis of the next relevant cube. i = unique_sequence_indices[j] # Determine last common axis index for this cube. common_axis_last_index = common_axis_cube_lengths[i] - ( (common_axis_cube_lengths[i] - common_axis_start_index) % step) # Generate SequenceSlice for this cube and append to list. sequence_slices.append(SequenceSlice( i, slice(common_axis_start_index, min(common_axis_last_index + 1, common_axis_cube_lengths[i]), step))) # Determine first common axis index for next cube. if common_axis_cube_lengths[i] == common_axis_last_index: common_axis_start_index = step - 1 else: common_axis_start_index = \ step - (((common_axis_cube_lengths[i] - common_axis_last_index) % step) + cumul_common_axis_cube_lengths[unique_sequence_indices[j + 1] - 1] - cumul_common_axis_cube_lengths[i]) # Iterate counter. j += 1 # Create slice for last cube manually. 
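    # The last cube is handled outside the above loop because its stop index
    # comes from the original cube-like slice and so may fall before the end
    # of that cube's common axis.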
sequence_slices.append(SequenceSlice( unique_sequence_indices[j], slice(common_axis_start_index, last_sequence_index.common_axis_item, step))) return sequence_slices def _convert_sequence_slice_to_sequence_item(sequence_slice, common_axis, cube_like_item=None): """ Converts sequence/cube index to a SequenceSlice object. Parameters ---------- sequence_slice: SequenceSlice `namedtuple` 0th element gives index of cube along sequence axis. 1st element each index along common axis of relevant cube. Must be same format as output from _convert_cube_like_index_to_sequence_slice. common_axis: `int` Common axis as defined in NDCubeSequence. cube_like_item: `None` or `tuple` of `slice` and/or `int` objects (Optional) The original item input to `NDCubeSequence.index_as_cube` including the slices/indices of non-common axes of cubes within sequence. If None, a tuple of slice(None) objects is generated long enough so that the last element in the tuple corresponds to the common axis and is set to the 1st (0-based counting) the sequence_index input, above. This tuple is then set to the cube_item attribute of the output `SequenceSlice` object. Returns ------- sequence_item: SequenceItem `namedtuple`. Describes sequence index of an NDCube within an NDCubeSequence and the slice/index item to be applied to the whole NDCube. """ if cube_like_item is None and common_axis == 0: sequence_item = SequenceItem(sequence_slice.sequence_index, sequence_slice.common_axis_item) else: # Create mutable version of cube_like_item. try: cube_item_list = list(cube_like_item) except TypeError as err: if err.args[0] == "'NoneType' object is not iterable": cube_item_list = [] else: raise err # Make sure cube_like_item is long enough to include common axis while len(cube_item_list) <= common_axis: cube_item_list.append(slice(None)) # Create new sequence slice cube_item_list[common_axis] = sequence_slice.common_axis_item sequence_item = SequenceItem(sequence_slice.sequence_index, tuple(cube_item_list)) return sequence_item def convert_slice_nones_to_ints(slice_item, target_length): """ Converts None types within a slice to the appropriate ints based on object to be sliced. The one case where a None is left in the slice object is when the step is negative and the stop parameter is None, since this scenario cannot be represented with an int stop parameter. Parameters ---------- slice_item: `slice` Slice for which Nones should be converted. target_length: `int` Length of object to which slice will be applied. Returns ------- new_slice: `slice` Slice with Nones replaced with ints. """ if slice_item.step is None: step = 1 else: step = slice_item.step start = slice_item.start stop = slice_item.stop if step < 0: if slice_item.start is None: start = int(target_length) stop = slice_item.stop else: if not slice_item.start: start = 0 if not slice_item.stop: stop = int(target_length) return slice(start, stop, step) def _get_axis_extra_coord_names_and_units(cube_list, axis): """ Retrieve all extra coord names and units assigned to a data axis along a sequence of cubes. Parameters ---------- cube_list: `list` of `ndcube.NDCube` The sequence of cubes from which to extract the extra coords. axis: `int` Number of axis (in data/numpy ordering convention). Returns ------- axis_coord_names: `ndarray` of `str` Names of all extra coordinates in sequence. axis_coord_units: `ndarray` of `astropy.unit.unit` Units of extra coordinates. """ # Define empty lists to hold results. 
axis_coord_names = [] axis_coord_units = [] # Extract extra coordinate names and units (if extra coord is a # quantity) from each cube. for cube in cube_list: all_extra_coords = cube.extra_coords if all_extra_coords is not None: all_extra_coords_keys = list(all_extra_coords.keys()) for coord_key in all_extra_coords_keys: if all_extra_coords[coord_key]["axis"] == axis: axis_coord_names.append(coord_key) if isinstance(all_extra_coords[coord_key]["value"], u.Quantity): axis_coord_units.append(all_extra_coords[coord_key]["value"].unit) else: axis_coord_units.append(None) # Extra coords common between cubes will be repeated. Get rid of # duplicate names and then only keep the units corresponding to # the first occurence of that name. if len(axis_coord_names) > 0: axis_coord_names, ind = np.unique(np.asarray(axis_coord_names), return_index=True) axis_coord_units = np.asarray(axis_coord_units)[ind] else: axis_coord_names = None axis_coord_units = None return axis_coord_names, axis_coord_units def _get_int_axis_extra_coords(cube_list, axis_coord_names, axis_coord_units, axis): """ Retrieve all extra coord names and units assigned to a data axis along a sequence of cubes. Parameters ---------- cube_list: `list` of `ndcube.NDCube` The sequence of cubes from which to extract the extra coords. axis_coord_names: `ndarray` of `str` Names of all extra coordinates in sequence. axis_coord_units: `ndarray` of `astropy.unit.unit` Units of extra coordinates. axis: `int` Number of axis (in data/numpy ordering convention). Returns ------- axis_extra_coords: `dict` Extra coords along given axis. """ # Define empty dictionary which will hold the extra coord # values not assigned a cube data axis. axis_extra_coords = {} # Iterate through cubes and populate values of each extra coord # not assigned a cube data axis. cube_extra_coords = [cube.extra_coords for cube in cube_list] for i, coord_key in enumerate(axis_coord_names): coord_values = [] for j, cube in enumerate(cube_list): # Construct list of coord values from each cube for given extra coord. try: if isinstance(cube_extra_coords[j][coord_key]["value"], u.Quantity): cube_coord_values = \ cube_extra_coords[j][coord_key]["value"].to(axis_coord_units[i]).value else: cube_coord_values = np.asarray(cube_extra_coords[j][coord_key]["value"]) coord_values = coord_values + list(cube_coord_values) except KeyError: # If coordinate not in cube, set coordinate values to NaN. coord_values = coord_values + [np.nan] * cube.dimensions[axis].value # Enter sequence extra coord into dictionary if axis_coord_units[i]: axis_extra_coords[coord_key] = coord_values * axis_coord_units[i] else: axis_extra_coords[coord_key] = np.asarray(coord_values) return axis_extra_coords ././@PaxHeader0000000000000000000000000000003400000000000011452 xustar000000000000000028 mtime=1605787217.3911848 ndcube-1.4.2/ndcube/utils/tests/0000755000175100001640000000000000000000000017010 5ustar00vstsdocker00000000000000././@PaxHeader0000000000000000000000000000003400000000000011452 xustar000000000000000028 mtime=1605787057.1544762 ndcube-1.4.2/ndcube/utils/tests/__init__.py0000644000175100001640000000000000000000000021107 0ustar00vstsdocker00000000000000././@PaxHeader0000000000000000000000000000003400000000000011452 xustar000000000000000028 mtime=1605787057.1544762 ndcube-1.4.2/ndcube/utils/wcs.py0000644000175100001640000005444600000000000017031 0ustar00vstsdocker00000000000000# Author: Ankit Baruah and Daniel Ryan """ Miscellaneous WCS utilities. 
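This module includes a `WCS` subclass that can pad a header missing a
dependent (e.g. celestial) axis with a dummy axis, helpers for slicing WCS
objects, and a two-way mapping between WCS CTYPEs and IVOA physical types.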
""" import re from copy import deepcopy from collections import UserDict import numbers import numpy as np from astropy import wcs from astropy.wcs._wcs import InconsistentAxisTypesError from ndcube.utils import cube as utils_cube __all__ = ['WCS', 'reindex_wcs', 'wcs_ivoa_mapping', 'get_dependent_data_axes', 'get_dependent_data_axes', 'axis_correlation_matrix', 'append_sequence_axis_to_wcs'] class TwoWayDict(UserDict): @property def inv(self): """ The inverse dictionary. """ return {v: k for k, v in self.items()} # Define a two way dictionary to hold translations between WCS axis # types and International Virtual Observatory Alliance vocabulary. # See http://www.ivoa.net/documents/REC/UCD/UCDlist-20070402.html wcs_to_ivoa = { "HPLT": "custom:pos.helioprojective.lat", "HPLN": "custom:pos.helioprojective.lon", "TIME": "time", "WAVE": "em.wl", "RA--": "pos.eq.ra", "DEC-": "pos.eq.dec", "FREQ": "em.freq", "STOKES": "phys.polarization.stokes", "PIXEL": "instr.pixel", "XPIXEL": "custom:instr.pixel.x", "YPIXEL": "custom:instr.pixel.y", "ZPIXEL": "custom:instr.pixel.z" } wcs_ivoa_mapping = TwoWayDict() for key in wcs_to_ivoa.keys(): wcs_ivoa_mapping[key] = wcs_to_ivoa[key] class WCS(wcs.WCS): def __init__(self, header=None, naxis=None, **kwargs): """ Initiates a WCS object with additional functionality to add dummy axes. Not all WCS axes are independent. Some, e.g. latitude and longitude, are dependent and one cannot be used without the other. Therefore this WCS class has the ability to determine whether a dependent axis is missing and can augment the WCS axes with a dummy axis to enable the translations to work. Parameters ---------- header: FITS header or `dict` with appropriate FITS keywords. naxis: `int` Number of axis described by the header. """ self.oriented = False self.was_augmented = WCS._needs_augmenting(header) if self.was_augmented: header = WCS._augment(header, naxis) if naxis is not None: naxis = naxis + 1 super().__init__(header=header, naxis=naxis, **kwargs) @classmethod def _needs_augmenting(cls, header): """ Determines whether a missing dependent axis is missing from the WCS object. WCS cannot be created with only one spacial dimension. If WCS detects that returns that it needs to be augmented. Parameters ---------- header: FITS header or `dict` with appropriate FITS keywords. """ try: wcs.WCS(header=header) except InconsistentAxisTypesError as err: if re.search(r'Unmatched celestial axes', str(err)): return True return False @classmethod def _augment(cls, header, naxis): """ Augments WCS with a dummy axis to take the place of a missing dependent axis. """ newheader = deepcopy(header) new_wcs_axes_params = {'CRPIX': 0, 'CDELT': 1, 'CRVAL': 0, 'CNAME': 'redundant axis', 'CTYPE': 'HPLN-TAN', 'CROTA': 0, 'CUNIT': 'deg', 'NAXIS': 1} axis = str(max(newheader.get('NAXIS', 0), naxis) + 1) for param in new_wcs_axes_params: attr = new_wcs_axes_params[param] newheader[param + axis] = attr try: print(wcs.WCS(header=newheader).get_axis_types()) except InconsistentAxisTypesError as err: projection = re.findall(r'expected [^,]+', str(err))[0][9:] newheader['CTYPE' + axis] = projection return newheader def _wcs_slicer(wcs, missing_axes, item): """ Returns the new sliced wcs and changed missing axis. Paramters --------- wcs: `astropy.wcs.WCS` or `ndcube.utils.wcs.WCS` WCS object to be sliced. missing_axes: `list` of `bool` Indicates which axes of the WCS are "missing", i.e. do not correspond to a data axis. item: `int`, `slice` or `tuple` of `int` and/or `slice`. Slicing item. 
        Note that unlike in other places in this package, the item has the
        same axis ordering as the WCS object, i.e. the reverse of the data
        order.

    Returns
    -------
    new_wcs: `astropy.wcs.WCS` or `ndcube.utils.wcs.WCS`
        Sliced WCS object.

    missing_axes: `list` of `bool`
        Altered missing axes list. Note the ordering has been reversed to
        reflect the data (numpy) axis ordering convention.
    """
    item_checked = []
    # Item is a single slice.
    if isinstance(item, slice):
        index = 0
        # Build a new tuple of slice items: a missing ("dead") axis gets
        # slice(0, 1); the first non-missing axis gets the input slice; any
        # remaining non-missing axes get slice(None, None, None).
        for _bool in missing_axes:
            if not _bool:
                if index != 1:
                    item_checked.append(item)
                    index += 1
                else:
                    item_checked.append(slice(None, None, None))
            else:
                item_checked.append(slice(0, 1))
        new_wcs = wcs.slice(item_checked)
    # Item is an int, so an axis is sliced out.
    elif isinstance(item, numbers.Integral):
        # Use index to track whether the int (converted to
        # slice(int_value, int_value + 1)) has been applied yet. Missing
        # ("dead") axes get slice(0, 1); once the int has been applied,
        # remaining non-missing axes get slice(None, None, None).
        index = 0
        for i, _bool in enumerate(missing_axes):
            if not _bool:
                if index != 1:
                    item_checked.append(slice(item, item + 1))
                    missing_axes[i] = True
                    index += 1
                else:
                    item_checked.append(slice(None, None, None))
            else:
                item_checked.append(slice(0, 1))
        new_wcs = wcs.slice(item_checked)
    # Item is a tuple, e.g. [0:2, 0:3, 2] or [0:2, 1:3].
    elif isinstance(item, tuple):
        # Ellipsis slicing is currently not supported.
        # Raise an error if user tries to slice by ellipsis.
        if Ellipsis in item:
            raise NotImplementedError("Slicing FITS-WCS by ellipsis not supported.")
        # Walk through the axes without exceeding the range of the item
        # tuple: each non-missing axis consumes the next element of the
        # tuple; once the tuple is exhausted, slice(None, None, None) is
        # appended; missing axes get slice(0, 1).
        index = 0
        for _bool in missing_axes:
            if not _bool:
                if index != len(item):
                    item_checked.append(item[index])
                    index += 1
                else:
                    item_checked.append(slice(None, None, None))
            else:
                item_checked.append(slice(0, 1))
        # If all entries in the item tuple are slices...
        if _all_slice(item_checked):
            new_wcs = wcs.slice(item_checked)
        # ...otherwise some of them are ints.
        else:
            # Convert every entry in item_checked to a slice.
            item_ = _slice_list(item_checked)
            new_wcs = wcs.slice(item_)
            for i, it in enumerate(item_checked):
                # If an axis is sliced out, i.e. its item is an int,
                # set missing axis to True.
                # numbers.Integral captures all int types, int, np.int64, etc.
                if isinstance(it, numbers.Integral):
                    missing_axes[i] = True
    else:
        raise NotImplementedError("Slicing FITS-WCS by {} not supported.".format(type(item)))
    # Return missing_axes reversed: the item was given in WCS order, so the
    # missing_axes list built here is the reverse of the data (numpy) order.
    return new_wcs, missing_axes[::-1]


def _all_slice(obj):
    """
    Return True if all the elements in the object are slices, else False.
    """
    result = False
    if not isinstance(obj, (tuple, list)):
        return result
    result |= all(isinstance(o, slice) for o in obj)
    return result


def _slice_list(obj):
    """
    Return list of all the slices.
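    Integers in the input are converted to equivalent length-1 slices;
    existing slices are passed through unchanged.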
Example ------- >>> _slice_list((slice(1,2), slice(1,3), 2, slice(2,4), 8)) [slice(1, 2, None), slice(1, 3, None), slice(2, 3, None), slice(2, 4, None), slice(8, 9, None)] """ result = [] if not isinstance(obj, (tuple, list)): return result for i, o in enumerate(obj): if isinstance(o, int): result.append(slice(o, o + 1)) elif isinstance(o, slice): result.append(o) return result def reindex_wcs(wcs, inds): # From astropy.spectral_cube.wcs_utils """ Re-index a WCS given indices. The number of axes may be reduced. Parameters ---------- wcs: sunpy.wcs.wcs.WCS The WCS to be manipulated inds: np.array(dtype='int') The indices of the array to keep in the output. e.g. swapaxes: [0,2,1,3] dropaxes: [0,1,3] """ if not isinstance(inds, np.ndarray): raise TypeError("Indices must be an ndarray") if inds.dtype.kind != 'i': raise TypeError('Indices must be integers') outwcs = WCS(naxis=len(inds)) wcs_params_to_preserve = ['cel_offset', 'dateavg', 'dateobs', 'equinox', 'latpole', 'lonpole', 'mjdavg', 'mjdobs', 'name', 'obsgeo', 'phi0', 'radesys', 'restfrq', 'restwav', 'specsys', 'ssysobs', 'ssyssrc', 'theta0', 'velangl', 'velosys', 'zsource'] for par in wcs_params_to_preserve: setattr(outwcs.wcs, par, getattr(wcs.wcs, par)) cdelt = wcs.wcs.cdelt try: outwcs.wcs.pc = wcs.wcs.pc[inds[:, None], inds[None, :]] except AttributeError: outwcs.wcs.pc = np.eye(wcs.naxis) outwcs.wcs.crpix = wcs.wcs.crpix[inds] outwcs.wcs.cdelt = cdelt[inds] outwcs.wcs.crval = wcs.wcs.crval[inds] outwcs.wcs.cunit = [wcs.wcs.cunit[i] for i in inds] outwcs.wcs.ctype = [wcs.wcs.ctype[i] for i in inds] outwcs.wcs.cname = [wcs.wcs.cname[i] for i in inds] outwcs._naxis = [wcs._naxis[i] for i in inds] return outwcs def get_dependent_data_axes(wcs_object, data_axis, missing_axes): """ Given a data axis index, return indices of dependent data axes. Both input and output axis indices are in the numpy ordering convention (reverse of WCS ordering convention). The returned axis indices include the input axis. Returned axis indices do NOT include any WCS axes that do not have a corresponding data axis, i.e. "missing" axes. Parameters ---------- wcs_object: `astropy.wcs.WCS` or `ndcube.utils.wcs.WCS` The WCS object describing the axes. data_axis: `int` Index of axis (in numpy ordering convention) for which dependent axes are desired. missing_axes: iterable of `bool` Indicates which axes of the WCS are "missing", i.e. do not correspond to a data axis. Returns ------- dependent_data_axes: `tuple` of `int` Sorted indices of axes dependent on input data_axis in numpy ordering convention. """ # In order to correctly account for "missing" axes in this process, # we must determine what axes are dependent based on WCS axis indices. # Convert input data axis index to WCS axis index. wcs_axis = utils_cube.data_axis_to_wcs_axis(data_axis, missing_axes) # Determine dependent axes, including "missing" axes, using WCS ordering. wcs_dependent_axes = np.asarray(get_dependent_wcs_axes(wcs_object, wcs_axis)) # Remove "missing" axes from output. non_missing_wcs_dependent_axes = wcs_dependent_axes[ np.invert(missing_axes)[wcs_dependent_axes]] # Convert dependent axes back to numpy/data ordering. dependent_data_axes = tuple(np.sort([utils_cube.wcs_axis_to_data_axis(i, missing_axes) for i in non_missing_wcs_dependent_axes])) return dependent_data_axes def get_dependent_wcs_axes(wcs_object, wcs_axis): """ Given a WCS axis index, return indices of dependent WCS axes. 
Both input and output axis indices are in the WCS ordering convention (reverse of numpy ordering convention). The returned axis indices include the input axis. Returned axis indices DO include WCS axes that do not have a corresponding data axis, i.e. "missing" axes. Parameters ---------- wcs_object: `astropy.wcs.WCS` or `ndcube.utils.wcs.WCS` The WCS object describing the axes. wcs_axis: `int` Index of axis (in WCS ordering convention) for which dependent axes are desired. Returns ------- dependent_data_axes: `tuple` of `int` Sorted indices of axes dependent on input data_axis in WCS ordering convention. """ # Pre-compute dependent axes. The matrix returned by # axis_correlation_matrix is (n_world, n_pixel) but we want to know # which pixel coordinates are linked to which other pixel coordinates. # So to do this we take a column from the matrix and find if there are # any entries in common with all other columns in the matrix. matrix = axis_correlation_matrix(wcs_object) world_dep = matrix[:, wcs_axis:wcs_axis + 1] dependent_wcs_axes = tuple(np.sort(np.nonzero((world_dep & matrix).any(axis=0))[0])) return dependent_wcs_axes def axis_correlation_matrix(wcs_object): """ Return True/False matrix indicating which WCS axes are dependent on others. Parameters ---------- wcs_object: `astropy.wcs.WCS` or `ndcube.utils.wcs.WCS` The WCS object describing the axes. Returns ------- matrix: `numpy.ndarray` of `bool` Square True/False matrix indicating which axes are dependent. For example, whether WCS axis 0 is dependent on WCS axis 1 is given by matrix[0, 1]. """ n_world = len(wcs_object.wcs.ctype) n_pixel = wcs_object.naxis # If there are any distortions present, we assume that there may be # correlations between all axes. Maybe if some distortions only apply # to the image plane we can improve this for distortion_attribute in ('sip', 'det2im1', 'det2im2'): if getattr(wcs_object, distortion_attribute): return np.ones((n_world, n_pixel), dtype=bool) # Assuming linear world coordinates along each axis, the correlation # matrix would be given by whether or not the PC matrix is zero matrix = wcs_object.wcs.get_pc() != 0 # We now need to check specifically for celestial coordinates since # these can assume correlations because of spherical distortions. For # each celestial coordinate we copy over the pixel dependencies from # the other celestial coordinates. celestial = (wcs_object.wcs.axis_types // 1000) % 10 == 2 celestial_indices = np.nonzero(celestial)[0] for world1 in celestial_indices: for world2 in celestial_indices: if world1 != world2: matrix[world1] |= matrix[world2] matrix[world2] |= matrix[world1] return matrix def append_sequence_axis_to_wcs(wcs_object): """ Appends a 1-to-1 dummy axis to a WCS object. """ dummy_number = wcs_object.naxis + 1 wcs_header = wcs_object.to_header() wcs_header.append((f"CTYPE{dummy_number}", "ITER", "A unitless iteration-by-one axis.")) wcs_header.append((f"CRPIX{dummy_number}", 0., "Pixel coordinate of reference point")) wcs_header.append((f"CDELT{dummy_number}", 1., "Coordinate increment at reference point")) wcs_header.append((f"CRVAL{dummy_number}", 0., "Coordinate value at reference point")) wcs_header.append((f"CUNIT{dummy_number}", "pix", "Coordinate value at reference point")) wcs_header["WCSAXES"] = dummy_number return WCS(wcs_header) def convert_between_array_and_pixel_axes(axis, naxes): """Reflects axis index about center of number of axes. This is used to convert between array axes in numpy order and pixel axes in WCS order. Works in both directions. 
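    For example, with naxes=3, axes (0, 1, 2) map to (2, 1, 0), since
    reflected_axis = naxes - 1 - axis.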
    Parameters
    ----------
    axis: `numpy.ndarray` of `int`
        The axis number(s) before reflection.

    naxes: `int`
        The number of array axes.

    Returns
    -------
    reflected_axis: `numpy.ndarray` of `int`
        The axis number(s) after reflection.
    """
    # Check type of input.
    if not isinstance(axis, np.ndarray):
        raise TypeError(f"input must be of array type. Got type: {type(axis)}")
    if axis.dtype.char not in np.typecodes['AllInteger']:
        raise TypeError(f"input dtype must be of int type. Got dtype: {axis.dtype}")
    # Convert negative indices to positive equivalents.
    axis[axis < 0] += naxes
    if any(axis > naxes - 1):
        raise IndexError("Axis out of range. "
                         f"Number of axes = {naxes}; Axis numbers requested = {axis}")
    # Reflect axis about center of number of axes.
    reflected_axis = naxes - 1 - axis
    return reflected_axis


def pixel_axis_to_world_axes(pixel_axis, axis_correlation_matrix):
    """
    Retrieves the indices of the world axis physical types corresponding to a pixel axis.

    Parameters
    ----------
    pixel_axis: `int`
        The pixel axis index/indices for which the world axes are desired.

    axis_correlation_matrix: `numpy.ndarray` of `bool`
        2D boolean correlation matrix defining the dependence between the
        pixel and world axes. Format same as
        `astropy.wcs.BaseLowLevelWCS.axis_correlation_matrix`.

    Returns
    -------
    world_axes: `numpy.ndarray`
        The world axis indices corresponding to the pixel axis.
    """
    return np.arange(axis_correlation_matrix.shape[0])[axis_correlation_matrix[:, pixel_axis]]


def physical_type_to_world_axis(physical_type, world_axis_physical_types):
    """
    Returns world axis index of a physical type based on WCS world_axis_physical_types.

    Input can be a substring of a physical type, so long as it is unique.

    Parameters
    ----------
    physical_type: `str`
        The physical type or a substring unique to a physical type.

    world_axis_physical_types: sequence of `str`
        All available physical types. Ordering must be same as
        `astropy.wcs.BaseLowLevelWCS.world_axis_physical_types`.

    Returns
    -------
    world_axis: `numbers.Integral`
        The world axis index of the physical type.
    """
    # Find world axis index described by physical type.
    widx = np.where(np.asarray(world_axis_physical_types) == physical_type)[0]
    # If physical type does not correspond to entry in world_axis_physical_types,
    # check if it is a substring of any physical types.
    if len(widx) == 0:
        widx = [physical_type in world_axis_physical_type
                for world_axis_physical_type in world_axis_physical_types]
        widx = np.arange(len(world_axis_physical_types))[widx]
    if len(widx) != 1:
        raise ValueError(
            "Input does not uniquely correspond to a physical type."
            f" Expected unique substring of one of {world_axis_physical_types}."
            f" Got: {physical_type}")
    # Return the single matching world axis index.
    return widx[0]


def reduced_axis_correlation_matrix(axis_correlation_matrix, missing_axes,
                                    return_world_indices=False):
    """
    Return axis correlation matrix with missing axes removed.

    This is needed because ndcube 1.3.x does not use astropy.nddata to slice WCSs.
    Will be removed in ndcube 2.0.

    Parameters
    ----------
    axis_correlation_matrix: `numpy.ndarray` of `bool`
        2D boolean correlation matrix defining the dependence between the
        pixel and world axes. Format same as
        `astropy.wcs.BaseLowLevelWCS.axis_correlation_matrix`.

    missing_axes: `list` of `bool`
        Denotes axes in WCS for which the corresponding data axis is missing.

    return_world_indices: `bool`
        If True, a boolean mask selecting the world_axis_physical_types
        corresponding to the reduced matrix is also returned.
        Default=False

    Returns
    -------
    reduced_matrix: `numpy.ndarray` of `bool`
        axis_correlation_matrix with missing axes removed.

    world_axes: `list` of `bool`, optional
        If return_world_indices is True, a boolean mask denoting which world
        axes remain in the reduced matrix.
    """
    # Remove missing pixel axes.
    pixel_axes = np.invert(missing_axes)
    reduced_matrix = axis_correlation_matrix[:, pixel_axes]
    # Remove world axes which now no longer correspond to a pixel axis.
    world_axes = [reduced_matrix[i].any() for i in range(axis_correlation_matrix.shape[0])]
    reduced_matrix = reduced_matrix[world_axes]
    if return_world_indices:
        return reduced_matrix, world_axes
    else:
        return reduced_matrix


def reduced_correlation_matrix_and_world_physical_types(
        axis_correlation_matrix, world_axis_physical_types, missing_axes):
    """
    Return axis correlation matrix with missing axes removed.

    This is needed because ndcube 1.3.x does not use astropy.nddata to slice WCSs.
    Will be removed in ndcube 2.0.

    Parameters
    ----------
    axis_correlation_matrix: `numpy.ndarray` of `bool`
        2D boolean correlation matrix defining the dependence between the
        pixel and world axes. Format same as
        `astropy.wcs.BaseLowLevelWCS.axis_correlation_matrix`.

    world_axis_physical_types: iterable of `str`
        The physical types corresponding to the world axes of the axis
        correlation matrix.

    missing_axes: `list` of `bool`
        Denotes axes in WCS for which the corresponding data axis is missing.

    Returns
    -------
    reduced_matrix: `numpy.ndarray` of `bool`
        axis_correlation_matrix with missing axes removed.

    reduced_physical_types: `numpy.ndarray` of `str`
        The physical types corresponding to the reduced matrix.
    """
    reduced_matrix, world_axes = reduced_axis_correlation_matrix(
        axis_correlation_matrix, missing_axes, return_world_indices=True)
    reduced_physical_types = np.array(world_axis_physical_types)[world_axes]
    return reduced_matrix, reduced_physical_types

ndcube-1.4.2/ndcube/version.py

# This file is for compatibility with astropy_helpers
from pkg_resources import get_distribution, DistributionNotFound

try:
    version = get_distribution("ndcube").version
except DistributionNotFound:
    version = 'unknown.dev'

ndcube-1.4.2/ndcube/visualization/__init__.py

ndcube-1.4.2/ndcube/visualization/animator/__init__.py

from ndcube.visualization.animator.image import *

ndcube-1.4.2/ndcube/visualization/animator/image.py

from astropy.wcs.wcsapi import BaseLowLevelWCS
from sunpy.visualization.animator import
ImageAnimator __all__ = ["ImageAnimatorWCS"] class ImageAnimatorWCS(ImageAnimator): """ Animates N-dimensional data with an associated World Coordinate System. The following keyboard shortcuts are defined in the viewer: * 'left': previous step on active slider. * 'right': next step on active slider. * 'top': change the active slider up one. * 'bottom': change the active slider down one. * 'p': play/pause active slider. This viewer can have user defined buttons added by specifying the labels and functions called when those buttons are clicked as keyword arguments. Parameters ---------- data: `numpy.ndarray` The data to be visualized. wcs : `~astropy.wcs.wcsapi.BaseLowLevelWCS` The WCS object describing the physical coordinates of the data. image_axes: `list`, optional A list of the axes order that make up the image. unit_x_axis: `astropy.units.Unit`, optional The unit of X axis. unit_y_axis: `astropy.units.Unit`, optional The unit of Y axis. axis_ranges: `list`, optional Defaults to `None` and array indices will be used for all axes. The `list` should contain one element for each axis of the input data array. For the image axes a ``[min, max]`` pair should be specified which will be passed to `matplotlib.pyplot.imshow` as an extent. For the slider axes a ``[min, max]`` pair can be specified or an array the same length as the axis which will provide all values for that slider. Notes ----- Extra keywords are passed to `~sunpy.visualization.animator.ArrayAnimator`. """ def __init__(self, data, wcs, image_axes=[-1, -2], unit_x_axis=None, unit_y_axis=None, axis_ranges=None, **kwargs): if not isinstance(wcs, BaseLowLevelWCS): raise ValueError("A WCS object should be provided that implements the astropy WCS API.") if wcs.pixel_n_dim is not data.ndim: raise ValueError("Dimensionality of the data and WCS object do not match.") self.wcs = wcs list_slices_wcsaxes = [0 for i in range(self.wcs.pixel_n_dim)] list_slices_wcsaxes[image_axes[0]] = 'x' list_slices_wcsaxes[image_axes[1]] = 'y' self.slices_wcsaxes = list_slices_wcsaxes[::-1] self.unit_x_axis = unit_x_axis self.unit_y_axis = unit_y_axis # Using `super()` here causes an error with the @deprecated decorator. ImageAnimator.__init__(self, data, image_axes=image_axes, axis_ranges=axis_ranges, **kwargs) def _get_main_axes(self): axes = self.fig.add_axes([0.1, 0.1, 0.8, 0.8], projection=self.wcs, slices=self.slices_wcsaxes) self._set_unit_in_axis(axes) return axes def _set_unit_in_axis(self, axes): x_index = self.slices_wcsaxes.index("x") y_index = self.slices_wcsaxes.index("y") if self.unit_x_axis is not None: axes.coords[x_index].set_format_unit(self.unit_x_axis) axes.coords[x_index].set_ticks(exclude_overlapping=True) if self.unit_y_axis is not None: axes.coords[y_index].set_format_unit(self.unit_y_axis) axes.coords[y_index].set_ticks(exclude_overlapping=True) def plot_start_image(self, ax): """ Sets up a plot of initial image. """ imshow_args = {'interpolation': 'nearest', 'origin': 'lower', } imshow_args.update(self.imshow_kwargs) im = ax.imshow(self.data[self.frame_index], **imshow_args) if self.if_colorbar: self._add_colorbar(im) return im def update_plot(self, val, im, slider): """ Updates plot based on slider/array dimension being iterated. 
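        If the slider value changed, the WCSAxes projection is reset so the
        displayed coordinates track the new slice.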
""" ind = int(val) ax_ind = self.slider_axes[slider.slider_ind] self.frame_slice[ax_ind] = ind list_slices_wcsaxes = list(self.slices_wcsaxes) list_slices_wcsaxes[self.wcs.pixel_n_dim-ax_ind-1] = val self.slices_wcsaxes = list_slices_wcsaxes if val != slider.cval: self.axes.reset_wcs(wcs=self.wcs, slices=self.slices_wcsaxes) self._set_unit_in_axis(self.axes) im.set_array(self.data[self.frame_index]) slider.cval = val # Update slider label to reflect real world values in axis_ranges. super().update_plot(val, im, slider) ././@PaxHeader0000000000000000000000000000003400000000000011452 xustar000000000000000028 mtime=1605787057.1544762 ndcube-1.4.2/pyproject.toml0000644000175100001640000000474100000000000016170 0ustar00vstsdocker00000000000000[build-system] requires = ["setuptools", "setuptools_scm", "wheel", "numpy==1.14.5"] build-backend = 'setuptools.build_meta' [ tool.sunpy-bot ] # disable astropy checks changelog_check = false autoclose_stale_pull_request = false # SunPy Checks check_towncrier_changelog = true check_milestone = true post_pr_comment = true all_passed_message = """ Thanks for the pull request @{pr_handler.user}! Everything looks great! """ [ tool.sunpy-bot.towncrier_changelog ] verify_pr_number = true changelog_skip_label = "No Changelog Entry Needed" help_url = "https://github.com/sunpy/ndcube/blob/towncrier/changelog/README.rst" missing_file_message = """ * I didn't detect a changelog file in this pull request. Please add a changelog file to the `changelog/` directory following the instructions in the changelog [README](https://github.com/sunpy/ndcube/blob/master/changelog/README.rst). """ wrong_type_message = """ * The changelog file you added is not one of the allowed types. Please use one of the types described in the changelog [README](https://github.com/sunpy/ndcube/blob/master/changelog/README.rst) """ wrong_number_message = """ * The number in the changelog file you added does not match the number of this pull request. Please rename the file. """ [ tool.sunpy-bot.milestone_checker ] missing_message = """ * This pull request does not have a milestone assigned to it. Only maintainers can change this, so you don't need to worry about it. :smile: """ [tool.towncrier] package = "ndcube" filename = "CHANGELOG.rst" directory = "changelog/" issue_format = "`#{issue} `__" [[tool.towncrier.type]] directory = "breaking" name = "Backwards Incompatible Changes" showcontent = true [[tool.towncrier.type]] directory = "api" name = "API Changes" showcontent = true [[tool.towncrier.type]] directory = "removal" name = "Deprecations and Removals" showcontent = true [[tool.towncrier.type]] directory = "feature" name = "Features" showcontent = true [[tool.towncrier.type]] directory = "bugfix" name = "Bug Fixes" showcontent = true [[tool.towncrier.type]] directory = "doc" name = "Improved Documentation" showcontent = true [[tool.towncrier.type]] directory = "trivial" name = "Trivial/Internal Changes" showcontent = true ././@PaxHeader0000000000000000000000000000003400000000000011452 xustar000000000000000028 mtime=1605787057.1544762 ndcube-1.4.2/setup.cfg0000644000175100001640000000557600000000000015104 0ustar00vstsdocker00000000000000[metadata] name = ndcube author = The SunPy Community author_email = sunpy@googlegroups.com description = A package for multi-dimensional contiguious and non-contiguious coordinate aware arrays. 

ndcube-1.4.2/setup.cfg

[metadata]
name = ndcube
author = The SunPy Community
author_email = sunpy@googlegroups.com
description = A package for multi-dimensional contiguous and non-contiguous coordinate-aware arrays.
long_description = file: README.rst
license = BSD 2-Clause
url = http://docs.sunpy.org/projects/ndcube/
edit_on_github = True
github_project = sunpy/ndcube
python_requires = ">=3.6"

[options]
packages = find:
include_package_data = True
setup_requires =
    setuptools_scm
install_requires =
    sunpy>=1.0.3
    astropy>=3.1
    matplotlib>=2

[options.extras_require]
tests =
    pytest<5.1
    pytest-astropy
    pytest-cov
    pytest-mock
    tox
docs =
    sphinx>=1.7
    sphinx-astropy
    sunpy-sphinx-theme
    towncrier

[options.package_data]
ndcube.data = *

[build_docs]
source-dir = docs
build-dir = docs/_build
all_files = 1

[tool:pytest]
minversion = 3.1
testpaths = "ndcube" "docs"
norecursedirs = ".tox" "build" "docs[\/]_build" "docs[\/]generated" "*.egg-info" "astropy_helpers" "examples"
doctest_plus = enabled
doctest_optionflags = NORMALIZE_WHITESPACE FLOAT_CMP ELLIPSIS
addopts = --doctest-rst -p no:warnings
markers =
    online: marks this test function as needing online connectivity.
    figure: marks this test function as using hash-based Matplotlib figure verification. This mark is not meant to be directly applied, but is instead automatically applied when a test function uses the @sunpy.tests.helpers.figure_test decorator.
# Disable internet access for tests not marked remote_data
remote_data_strict = True

[ah_bootstrap]
auto_use = True

[flake8]
exclude = extern,sphinx,*parsetab.py,astropy_helpers,ah_bootstrap.py,conftest.py,docs/conf.py,setup.py
max-line-length = 110

[pycodestyle]
exclude = extern,sphinx,*parsetab.py,astropy_helpers,ah_bootstrap.py,conftest.py,docs/conf.py,setup.py
max_line_length = 110

[isort]
line_length = 110
not_skip = __init__.py
sections = FUTURE, STDLIB, THIRDPARTY, ASTROPY, FIRSTPARTY, LOCALFOLDER
default_section = THIRDPARTY
known_first_party = ndcube
known_astropy = astropy, asdf, sunpy
multi_line_output = 0
balanced_wrapping = True
include_trailing_comma = False
length_sort = False
length_sort_stdlib = True

[coverage:run]
omit =
    ndcube/conftest.py
    ndcube/cython_version*
    ndcube/*setup*
    ndcube/extern/*
    ndcube/*/tests/*
    ndcube/version*
    ndcube/__init__*
    ndcube/_sunpy_init*
    */ndcube/conftest.py
    */ndcube/cython_version*
    */ndcube/*setup*
    */ndcube/extern/*
    */ndcube/*/tests/*
    */ndcube/version*
    */ndcube/__init__*
    */ndcube/_sunpy_init*

[coverage:report]
exclude_lines =
    # Have to re-enable the standard pragma
    pragma: no cover
    # Don't complain about packages we have installed
    except ImportError
    # Don't complain if tests don't hit assertions
    raise AssertionError
    raise NotImplementedError
    # Don't complain about script hooks
    def main\(.*\):
    # Ignore branches that don't pertain to this version of Python
    pragma: py{ignore_python_version}
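
# --- Illustrative note; not part of the original file. ---
# The extras declared in [options.extras_require] above can be installed with
# pip once the package is published, e.g.:
#   pip install "ndcube[tests]"
#   pip install "ndcube[docs]"
# The 'dev' and 'all' combinations are generated programmatically in setup.py
# below.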

ndcube-1.4.2/setup.py

#!/usr/bin/env python
import os
import sys
from itertools import chain

from setuptools import setup
from setuptools.config import read_configuration

# Append cwd for pip 19
sys.path.append(os.path.abspath("."))

import ah_bootstrap  # noqa
from astropy_helpers.setup_helpers import register_commands, get_package_info  # noqa

################################################################################
# Override the default Astropy Test Command
################################################################################
cmdclass = register_commands()

try:
    from sunpy.tests.setup_command import SunPyTest
    # Overwrite the Astropy Testing framework
    cmdclass['test'] = type('SunPyTest', (SunPyTest,),
                            {'package_name': 'ndcube'})
except Exception:
    # Catch everything; if it doesn't work, we still want ndcube to install.
    pass

################################################################################
# Programmatically generate some extras combos.
################################################################################
extras = read_configuration("setup.cfg")['options']['extras_require']

# Dev is everything
extras['dev'] = list(chain(*extras.values()))

# All is everything but tests and docs
exclude_keys = ("tests", "docs", "dev")
ex_extras = dict(filter(lambda i: i[0] not in exclude_keys, extras.items()))
# Concatenate all the values together for 'all'
extras['all'] = list(chain.from_iterable(ex_extras.values()))

package_info = get_package_info()

setup(extras_require=extras,
      use_scm_version=True,
      cmdclass=cmdclass,
      **package_info)
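
# --- Illustrative sketch; not part of the original file. ---
# Demonstrates the extras-combination logic above on a toy extras dict.
from itertools import chain

extras = {"tests": ["pytest<5.1"], "docs": ["sphinx>=1.7"]}
extras["dev"] = list(chain(*extras.values()))  # ['pytest<5.1', 'sphinx>=1.7']
exclude_keys = ("tests", "docs", "dev")
ex_extras = dict(filter(lambda i: i[0] not in exclude_keys, extras.items()))
extras["all"] = list(chain.from_iterable(ex_extras.values()))
# 'all' is empty here because the toy dict defines no extras beyond tests and
# docs; in the real setup.cfg it would collect any other declared extras.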