pyvo - Package version comparison

Comparing version 1.5.2 to 1.5.3
.github/dependabot.yml
# To get started with Dependabot version updates, you'll need to specify which
# package ecosystems to update and where the package manifests are located.
# Please see the documentation for all configuration options:
# https://docs.github.com/github/administering-a-repository/configuration-options-for-dependency-updates
version: 2
updates:
  - package-ecosystem: "github-actions"
    directory: "/"
    schedule:
      interval: "monthly"
    groups:
      actions:
        patterns:
          - "*"
    labels:
      - "no-changelog-entry-needed"
      - "infrastructure"
.. _pyvo-samp:
******************
SAMP (`pyvo.samp`)
******************
SAMP, the Simple Application Messaging Protocol, lets you send and
receive tables, datasets, or points of interest between various clients
on a desktop.
While the main SAMP support still resides in astropy (it might be moved
over to pyvo in later versions), pyvo provides a few wrapper functions
that make common SAMP tasks simpler or more robust.
Most importantly, pyvo lets you manage the SAMP connection in a context
manager, which means you will not have to remember to close your
connections to avoid ugly artefacts in the SAMP hub.
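To illustrate why the context manager matters, here is a minimal sketch of the pattern using plain ``contextlib``; ``FakeConn`` and this ``connection`` function are illustrative stand-ins, not the actual ``pyvo.samp`` API:

```python
from contextlib import contextmanager


class FakeConn:
    """Hypothetical stand-in for a SAMP hub connection."""

    def __init__(self):
        self.closed = False

    def close(self):
        self.closed = True


@contextmanager
def connection(client_name=None):
    # Register with the hub on entry, unregister on exit -- even when
    # the body raises, so no dead client lingers in the hub.
    conn = FakeConn()
    try:
        yield conn
    finally:
        conn.close()
```

Even if sending fails half-way, the ``finally`` clause runs and the hub connection is closed.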
In addition, there are convenience functions for sending out data; in all
cases, you can pass a ``client_name`` to only send a message to the
specific client; a simple way to obtain client names is to inspect
TOPCAT's SAMP status if you use TOPCAT's built-in SAMP hub. If, on the
other hand, you pass ``None`` as ``client_name`` (or omit the
parameter), you will send a broadcast.
Sending tables has the additional difficulty over sending other datasets
that you will have to make the table data accessible to the receiving
client. The solution chosen by pyvo.samp at this time will only work if
the sending and receiving applications share a file system. This seems
a reasonable expectation and saves a bit of a potential security headache.
Using pyvo.samp, sending an astropy table ``t`` to TOPCAT would look
like this::
    import pyvo

    with pyvo.samp.connection(client_name="pyvo magic") as conn:
        pyvo.samp.send_table_to(
            conn,
            t,
            name="my-results",
            client_name="topcat")
Reference/API
=============
.. automodapi:: pyvo.samp
:no-inheritance-diagram:
"""
Miscellaneous utilities for writing tests.
"""
from astropy.io.votable import tree

from pyvo.dal import query as dalquery

try:
    TABLE_ELEMENT = tree.TableElement
except AttributeError:
    TABLE_ELEMENT = tree.Table


def create_votable(field_descs, records):
    """returns a VOTableFile with a single table containing records,
    described by field_descs.
    """
    votable = tree.VOTableFile()
    resource = tree.Resource(type="results")
    votable.resources.append(resource)
    table = TABLE_ELEMENT(votable)
    resource.tables.append(table)
    table.fields.extend(
        tree.Field(votable, **desc) for desc in field_descs)
    table.create_arrays(len(records))
    for index, rec in enumerate(records):
        table.array[index] = rec
    return votable


def create_dalresults(
        field_descs,
        records,
        *,
        resultsClass=dalquery.DALResults):
    """returns a DALResults instance for a query returning records
    described by field_descs.

    The arguments are as for create_votable.
    """
    return resultsClass(
        create_votable(field_descs, records),
        url="http://testing.pyvo/test-url")

@@ -17,5 +17,5 @@ name: Changelog check

- name: Check change log entry
uses: scientific-python/action-check-changelogfile@865ff8154dd94f008f08de6bb8d8c1f661113658
uses: scientific-python/action-check-changelogfile@1fc669db9618167166d5a16c10282044f51805c0 # 0.3
env:
CHANGELOG_FILENAME: CHANGES.rst
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}

@@ -24,5 +24,5 @@ # This test job is separated out into its own workflow to be able to trigger separately

steps:
- uses: actions/checkout@v4
- uses: actions/checkout@d632683dd7b4114ad314bca15554477dd762a938 # v4.2.0
- name: Set up Python 3.12
uses: actions/setup-python@v5
uses: actions/setup-python@f677139bbe7f9c59b41e40162b753c062f5d49a3 # v5.2.0
with:

@@ -36,5 +36,18 @@ python-version: "3.12"

- name: Upload coverage to codecov
uses: codecov/codecov-action@v3
uses: codecov/codecov-action@b9fd7d16f6d7d1b5d2bec1a2887e65ceed900238 # v4.6.0
with:
file: ./coverage.xml
verbose: true
  py313:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@d632683dd7b4114ad314bca15554477dd762a938 # v4.2.0
      - name: Set up Python 3.13
        uses: actions/setup-python@f677139bbe7f9c59b41e40162b753c062f5d49a3 # v5.2.0
        with:
          python-version: "3.13-dev"
      - name: Install tox
        run: python -m pip install --upgrade tox
      - name: Run tests against dev dependencies
        run: tox -e py313-test

@@ -42,7 +42,7 @@ # Developer version testing is in separate workflow

- name: Checkout code
uses: actions/checkout@v4
uses: actions/checkout@d632683dd7b4114ad314bca15554477dd762a938 # v4.2.0
with:
fetch-depth: 0
- name: Set up Python
uses: actions/setup-python@v5
uses: actions/setup-python@f677139bbe7f9c59b41e40162b753c062f5d49a3 # v5.2.0
with:

@@ -64,7 +64,7 @@ python-version: ${{ matrix.python-version }}

- name: Checkout code
uses: actions/checkout@v4
uses: actions/checkout@d632683dd7b4114ad314bca15554477dd762a938 # v4.2.0
with:
fetch-depth: 0
- name: Set up Python
uses: actions/setup-python@v5
uses: actions/setup-python@f677139bbe7f9c59b41e40162b753c062f5d49a3 # v5.2.0
with:

@@ -83,5 +83,5 @@ python-version: '3.10'

steps:
- uses: actions/checkout@v4
- uses: actions/checkout@d632683dd7b4114ad314bca15554477dd762a938 # v4.2.0
- name: Set up Python 3.8
uses: actions/setup-python@v5
uses: actions/setup-python@f677139bbe7f9c59b41e40162b753c062f5d49a3 # v5.2.0
with:

@@ -93,1 +93,17 @@ python-version: 3.8

run: tox -e codestyle
  linkcheck:
    if: github.event_name == 'schedule' && github.repository == 'astropy/pyvo'
    runs-on: ubuntu-latest
    strategy:
      fail-fast: false
    steps:
      - uses: actions/checkout@d632683dd7b4114ad314bca15554477dd762a938 # v4.2.0
      - name: Set up Python 3.10
        uses: actions/setup-python@f677139bbe7f9c59b41e40162b753c062f5d49a3 # v5.2.0
        with:
          python-version: '3.10'
      - name: Install tox
        run: python -m pip install --upgrade tox
      - name: Check docs links
        run: tox -e linkcheck

@@ -0,1 +1,42 @@

1.5.3 (2024-10-14)
==================
Bug Fixes
---------
- ``cachedataset()`` and friends again produce reasonable file extensions.
[#553]
- Path separators are no longer taken over from image titles to file
system paths. [#557]
- Added `'sia1'` as servicetype for registry searches. [#583]
- Added a ``session`` kwarg to allow passing a session along when turning
  an Interface into a service via ``Interface.to_service``. [#590]
- Include port number if it is present in endpoint access URL. [#582]
- Where datalink records are made from table rows, the table row is
now accessible as datalinks.original_row. [#559]
- Tables returned by RegistryResource.get_tables() now have a utype
attribute. [#576]
- Registry Spatial constraint now supports Astropy Quantities for the radius
argument. [#594]
- ``iter_metadata()`` no longer crashes on tables with a datalink RESOURCE
and without obscore attributes. [#599]
- Avoid assuming that ``'access_url'`` always exists. [#570]
Deprecations and Removals
-------------------------
- Removed usage of the astropy TestRunner and, with it, the unadvertised
  ``pyvo.test()`` functionality. [#606]
1.5.2 (2024-05-22)

@@ -2,0 +43,0 @@ ==================

@@ -8,2 +8,4 @@ # Licensed under a 3-clause BSD style license - see LICENSE.rst

import tempfile
import numpy as np
from astropy.utils import minversion

@@ -42,2 +44,7 @@ try:

# Keep this until we require numpy to be >=2.0
if minversion(np, "2.0.0.dev0+git20230726"):
    np.set_printoptions(legacy="1.25")
def pytest_configure(config):

@@ -44,0 +51,0 @@ """Configure Pytest with Astropy.

@@ -5,3 +5,3 @@ PyVO Contributors

PyVO is an open source project developed through GitHub at
https://github.com/pyvirtobs; community contributions are welcome.
https://github.com/astropy/pyvo; community contributions are welcome.

@@ -8,0 +8,0 @@ This project began in 2012 as a product of the US Virtual Astronomical

@@ -79,2 +79,4 @@ .. _pyvo-data-access:

.. doctest::
>>> from astropy.time import Time

@@ -175,2 +177,12 @@ >>> time = Time(('2015-01-01T00:00:00', '2018-01-01T00:00:00'),

While DALResultsTable has some convenience functions, it is often
convenient to directly obtain a proper astropy Table using the
``to_table`` method:
.. doctest-remote-data::
>>> result.to_table().columns[:3]
<TableColumns names=('source_id','ra','dec')>
To explore more query examples, you can try either the ``description``

@@ -182,3 +194,3 @@ attribute for some services. For other services like this one, try

>>> print(tap_service.examples[0]['QUERY'])
>>> print(tap_service.examples[1]['QUERY'])
SELECT TOP 50 l.id, l.pmra as lpmra, l.pmde as lpmde,

@@ -199,18 +211,12 @@ g.source_id, g.pmra as gpmra, g.pmdec as gpmde

Furthermore, one can find the names of the tables using:
TAPServices let you do extensive metadata inspection. For instance,
to see the tables available on the Simbad TAP service, say:
.. doctest-remote-data::
>>> print([tab_name for tab_name in tap_service.tables.keys()]) # doctest: +IGNORE_WARNINGS
['ivoa.obs_radio', 'ivoa.obscore', 'tap_schema.columns', 'tap_schema.tables',..., 'taptest.main', 'veronqsos.data', 'vlastripe82.stripe82']
>>> simbad = vo.dal.TAPService("http://simbad.cds.unistra.fr/simbad/sim-tap")
>>> print([tab_name for tab_name in simbad.tables.keys()]) # doctest: +IGNORE_WARNINGS
['TAP_SCHEMA.schemas', 'TAP_SCHEMA.tables', 'TAP_SCHEMA.columns', 'TAP_SCHEMA.keys', ... 'mesVelocities', 'mesXmm', 'otypedef', 'otypes', 'ref']
And also the names of the columns from a known table, for instance
the first three columns:
.. doctest-remote-data::
>>> result.table.columns[:3] # doctest: +IGNORE_WARNINGS
<TableColumns names=('source_id','ra','dec')>
If you know a TAP service's access URL, you can directly pass it to

@@ -300,3 +306,3 @@ :py:class:`~pyvo.dal.TAPService` to obtain a service object.

>>> job.phase # doctest: +IGNORE_OUTPUT
>>> job.phase # doctest: +IGNORE_OUTPUT
'EXECUTING'

@@ -355,3 +361,3 @@

>>> tap_results = tap_service.search("SELECT * FROM ivoa.obscore", maxrec=100000) # doctest: +SHOW_WARNINGS
>>> tap_results = tap_service.search("SELECT * FROM arihip.main", maxrec=5) # doctest: +SHOW_WARNINGS
DALOverflowWarning: Partial result set. Potential causes MAXREC, async storage space, etc.

@@ -531,4 +537,6 @@

>>> ssa_service = vo.dal.SSAService("https://irsa.ipac.caltech.edu/SSA")
>>> ssa_service = vo.dal.SSAService("http://archive.stsci.edu/ssap/search2.php?id=BEFS&")
>>> ssa_results = ssa_service.search(pos=pos, diameter=size)
>>> ssa_results[0].getdataurl()
'http://archive.stsci.edu/pub/vospectra/...'

@@ -628,3 +636,3 @@ SSA queries can be further constrained by the ``band`` and ``time`` parameters.

>>> print(job.phase)
>>> print(job.phase) # doctest: +IGNORE_OUTPUT
EXECUTING

@@ -694,3 +702,6 @@

>>> tap_service = vo.dal.TAPService("http://dc.g-vo.org/tap")
>>> resultset = tap_service.search("SELECT TOP 10 * FROM ivoa.obscore")
>>> resultset = tap_service.search("SELECT * FROM ivoa.obscore"
... " WHERE obs_collection='CALIFA' AND"
... " 1=CONTAINS(s_region, CIRCLE(23, 42, 5))"
... " ORDER BY obs_publisher_did")
>>> print(resultset.fieldnames)

@@ -732,12 +743,11 @@ ('dataproduct_type', 'dataproduct_subtype', 'calib_level',

... print(row['s_fov'])
0.05027778
0.05027778
0.05027778
0.05027778
0.05027778
0.05027778
0.06527778
0.06527778
0.06527778
0.06527778
0.01
0.01
0.01
0.01
0.01
0.01
0.01
0.01
0.01

@@ -749,3 +759,3 @@ The total number of rows in the answer is available as its ``len()``:

>>> print(len(resultset))
10
9

@@ -757,8 +767,8 @@ If the row contains datasets, they are exposed by several retrieval methods:

>>> url = row.getdataurl()
>>> fileobj = row.getdataset()
>>> obj = row.getdataobj()
>>> row.getdataurl()
'http://dc.zah.uni-heidelberg.de/getproduct/califa/datadr3/V500/NGC0551.V500.rscube.fits'
>>> type(row.getdataset())
<class 'urllib3.response.HTTPResponse'>
Returning the access url, the file-like object or the appropriate python object
to further work on.
Returning the access url or a file-like object to further work on.

@@ -801,22 +811,58 @@ As with general numpy arrays, accessing individual columns via names gives an

Multiple datasets
-----------------
PyVO supports multiple datasets exposed on record level through the datalink.
To get an iterator yielding specific datasets, call
:py:meth:`pyvo.dal.adhoc.DatalinkResults.bysemantics` with the identifier
identifying the dataset you want it to return.
Datalink
--------
.. remove skip once https://github.com/astropy/pyvo/issues/361 is fixed
.. doctest-skip::
Datalink lets operators associate multiple artefacts with a dataset.
Examples include linking raw data, applicable or applied calibration
data, derived datasets such as extracted sources, extra documentation,
and much more.
>>> preview = next(row.getdatalink().bysemantics('#preview')).getdataset()
Datalink can be used both on result rows of queries and on
datalink-valued URLs. The typical use is to call ``iter_datalinks()``
on some DAL result; this iterates over all datalinks pyVO finds in a
document and yields :py:class:`pyvo.dal.adhoc.DatalinkResults` instances
for them. In those, you can, for instance, pick out items by semantics;
the standard vocabulary used by datalink documents is documented at
http://www.ivoa.net/rdf/datalink/core. Here is how to find URLs for
previews:
.. note::
    Since the creation of datalink objects requires a network roundtrip,
    it is recommended to call ``getdatalink`` only once.
.. doctest-remote-data::
>>> rows = vo.dal.TAPService("http://dc.g-vo.org/tap"
... ).run_sync("select top 5 * from califadr3.cubes order by califaid")
>>> for dl in rows.iter_datalinks(): # doctest: +IGNORE_WARNINGS
... print(next(dl.bysemantics("#preview"))["access_url"])
http://dc.zah.uni-heidelberg.de/getproduct/califa/datadr3/V1200/IC5376.V1200.rscube.fits?preview=True
http://dc.zah.uni-heidelberg.de/getproduct/califa/datadr3/COMB/IC5376.COMB.rscube.fits?preview=True
http://dc.zah.uni-heidelberg.de/getproduct/califa/datadr3/V500/IC5376.V500.rscube.fits?preview=True
http://dc.zah.uni-heidelberg.de/getproduct/califa/datadr3/COMB/UGC00005.COMB.rscube.fits?preview=True
http://dc.zah.uni-heidelberg.de/getproduct/califa/datadr3/V1200/UGC00005.V1200.rscube.fits?preview=True
Of course one can also build a datalink object from its url.
The call to ``next`` in this example picks the first link marked
*preview*. For previews, this may be enough, but in general there can
be multiple links for a given semantics value for one dataset.
It is sometimes useful to go back to the original row the datalink was
generated from; use the ``original_row`` attribute for that (which may
be None if pyvo does not know what row the datalink came from):
.. doctest-remote-data::
>>> dl.original_row["obs_title"]
'CALIFA V1200 UGC00005'
Consider ``original_row`` read-only; what happens when you modify it is
undefined.
Rows from tables supporting datalink also have a ``getdatalink()``
method that returns a ``DatalinkResults`` instance. In general, this is
less flexible than using ``iter_datalinks``, and it may also cause more
network traffic because each such call will cause a network request.
When one has a link to a Datalink document (for instance, from an
obscore or SIAP service, where the media type is
``application/x-votable;content=datalink``), one can build a
DatalinkResults using
:py:meth:`~pyvo.dal.adhoc.DatalinkResults.from_result_url`:
.. doctest-remote-data::
>>> from pyvo.dal.adhoc import DatalinkResults

@@ -826,3 +872,6 @@ >>> # In this example you know the URL from somewhere

>>> datalink = DatalinkResults.from_result_url(url)
>>> next(datalink.bysemantics("#this")).content_type
'application/fits'
Server-side processing

@@ -834,4 +883,4 @@ ----------------------

Datalink
^^^^^^^^
Generic Datalink Processing Service
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Generic access to processing services is provided through the datalink

@@ -846,4 +895,4 @@ interface.

.. note::
most times there is only one processing service per result, and thats all you
need.
Most datalink documents only have one processing service per dataset,
which is why there is the ``get_first_proc`` shortcut mentioned below.

@@ -850,0 +899,0 @@

@@ -44,3 +44,3 @@ PyVO

git clone http://github.com/pyvirtobs/pyvo
git clone http://github.com/astropy/pyvo
cd pyvo

@@ -138,3 +138,4 @@ python setup.py install

auth/index
samp
utils/index
utils/prototypes

@@ -37,4 +37,6 @@ .. _pyvo-registry:

The main interface for the module is :py:meth:`pyvo.registry.search`;
the examples below assume::
the examples below assume:
.. doctest::
>>> from pyvo import registry

@@ -50,3 +52,3 @@

* :py:class:`~pyvo.registry.Servicetype` (``servicetype``): constrain to
one of tap, ssa, sia, conesearch (or full ivoids for other service
one of tap, ssa, sia1, sia2, conesearch (or full ivoids for other service
types). This is the constraint you want

@@ -106,3 +108,4 @@ to use for service discovery.

>>> resources = registry.search(registry.Waveband("Radio", "Millimeter"))
>>> resources = registry.search(registry.Waveband("Radio", "Millimeter"),
... registry.Author("%Miller%"))

@@ -113,3 +116,4 @@ is equivalent to:

>>> resources = registry.search(waveband=["Radio", "Millimeter"])
>>> resources = registry.search(waveband=["Radio", "Millimeter"],
... author='%Miller%')

@@ -134,3 +138,3 @@ There is also :py:meth:`~pyvo.registry.get_RegTAP_query`, accepting the

>>> resources = registry.search(registry.UCD("src.redshift"),
... registry.Freetext("supernova"))
... registry.Freetext("AGB"))

@@ -145,9 +149,11 @@ After that, ``resources`` is an instance of

>>> resources.to_table() # doctest: +IGNORE_OUTPUT
<Table length=158>
title ... interfaces
str67 ... str24
--------------------------------------------------------------- ... ------------------------
Asiago Supernova Catalogue (Barbon et al., 1999-) ... conesearch, tap#aux, web
Asiago Supernova Catalogue (Version 2008-Mar) ... conesearch, tap#aux, web
Sloan Digital Sky Survey-II Supernova Survey (Sako+, 2018) ... conesearch, tap#aux, web
<Table length=9>
ivoid ...
...
object ...
--------------------------------- ...
ivo://cds.vizier/j/a+a/392/1 ...
ivo://cds.vizier/j/a+a/566/a95 ...
ivo://cds.vizier/j/aj/151/146 ...
ivo://cds.vizier/j/apj/727/14 ...
...

@@ -160,12 +166,28 @@

>>> from astropy.coordinates import SkyCoord
>>> registry.search(registry.Servicetype("tap"),
... registry.Spatial((SkyCoord("23d +3d"), 3), intersect="enclosed"),
... includeaux=True) # doctest: +IGNORE_OUTPUT
<DALResultsTable length=1>
ivoid res_type short_name res_title ... intf_types intf_roles alt_identifier
...
object object object object ... object object object
------------------------------ ----------------- ------------- ------------------------------------------- ... ------------ ---------- --------------------------------
ivo://cds.vizier/j/apj/835/123 vs:catalogservice J/ApJ/835/123 Globular clusters in NGC 474 from CFHT obs. ... vs:paramhttp std doi:10.26093/cds/vizier.18350123
>>> registry.search(registry.Freetext("Wolf-Rayet"),
... registry.Spatial((SkyCoord("23d +3d"), 3), intersect="enclosed"))
<DALResultsTable length=2>
ivoid ...
...
object ...
------------------------------- ...
ivo://cds.vizier/j/a+a/688/a104 ...
ivo://cds.vizier/j/apj/938/73 ...
Astropy Quantities are also supported for the radius angle of a SkyCoord-defined circular region:
.. doctest-remote-data::
>>> from astropy.coordinates import SkyCoord
>>> from astropy import units as u
>>> registry.search(registry.Freetext("Wolf-Rayet"),
... registry.Spatial((SkyCoord("23d +3d"), 180*u.Unit('arcmin')), intersect="enclosed"))
<DALResultsTable length=2>
ivoid ...
...
object ...
------------------------------- ...
ivo://cds.vizier/j/a+a/688/a104 ...
ivo://cds.vizier/j/apj/938/73 ...
Where ``intersect`` can take the following values:

@@ -200,11 +222,11 @@ * 'covers' is the default and returns resources that cover the geometry provided,

>>> resources["II/283"].get_service("conesearch").search(pos=(120, 73), sr=1)
>>> voresource = resources["J/ApJ/727/14"]
>>> voresource.get_service(service_type="conesearch").search(pos=(257.41, 64.345), sr=0.01)
<DALResultsTable length=1>
_RAJ2000 _DEJ2000 _r recno ... NED RAJ2000 DEJ2000
deg deg ...
float64 float64 float64 int32 ... str3 str12 str12
------------ ------------ -------- ----- ... ---- ------------ ------------
117.98645833 73.00961111 0.588592 986 ... NED 07 51 56.750 +73 00 34.60
_r recno f_ID ID RAJ2000 ... SED DR7 Sloan Simbad
deg ...
float64 int32 str1 str18 float64 ... str3 str3 str5 str6
-------- ----- ---- ------------------ --------- ... ---- ---- ----- ------
0.000618 1 P 170938.52+642044.1 257.41049 ... SED DR7 Sloan Simbad
To operate TAP services, you need to know what tables make up a

@@ -217,8 +239,9 @@ resource; you could construct a TAP service and access its ``tables``

>>> tables = resources["II/283"].get_tables() # doctest: +IGNORE_WARNINGS
>>> tables = resources["J/ApJ/727/14"].get_tables() # doctest: +IGNORE_WARNINGS
>>> list(tables.keys())
['II/283/sncat']
>>> sorted(c.name for c in tables['II/283/sncat'].columns)
['band', 'bmag', 'deg', 'dej2000', 'disc', 'epmax', 'galaxy', 'hrv', 'i', 'logd25', 'maxmag', 'mtype', 'n_bmag', 'n_sn', 'n_x', 'n_y', 'ned', 'pa', 'rag', 'raj2000', 'recno', 'simbad', 'sn', 't', 'type', 'u_epmax', 'u_maxmag', 'u_sn', 'u_y', 'u_z', 'x', 'y', 'z']
['J/ApJ/727/14/table2']
>>> sorted(c.name for c in tables["J/ApJ/727/14/table2"].columns)
['[24]', '[70]', 'dej2000', 'dr7', 'e_[24]', 'e_[70]', 'e_l15', 'e_l24', 'e_n3', 'e_n4', 'e_s11', 'e_s7', 'f_id', 'gmag', 'id', 'imag', 'l15', 'l24', 'n3', 'n4', 'raj2000', 'recno', 'rmag', 's11', 's7', 'sed', 'simbad', 'sloan', 'umag', 'y03', 'z', 'zmag']
In this case, this is a table with one of VizieR's somewhat funky names.

@@ -229,12 +252,9 @@ To run a TAP query based on this metadata, do something like:

>>> resources["II/283"].get_service("tap#aux").run_sync(
... 'SELECT sn, z FROM "J/A+A/437/789/table2" WHERE z>0.04')
<DALResultsTable length=4>
SN z
object float64
------ -------
1992bh 0.045
1992bp 0.079
1993ag 0.049
1993O 0.051
>>> resources["J/ApJ/727/14"].get_service(service_type="tap#aux").run_sync(
... 'SELECT id, z FROM "J/ApJ/727/14/table2" WHERE z>0.09 and umag<18')
<DALResultsTable length=1>
ID z
object float64
------------------ -------
171319.90+635428.0 0.09043

@@ -249,3 +269,3 @@ A special sort of access mode is ``web``, which represents some facility related

>>> resources["II/283"].get_service("web").search() # doctest: +IGNORE_OUTPUT
>>> resources["J/ApJ/727/14"].get_service(service_type="web").search() # doctest: +IGNORE_OUTPUT

@@ -334,3 +354,3 @@ Note that for interactive data discovery in the VO Registry, you may

>>>
>>> archives = vo.regsearch(servicetype='image', waveband='x-ray')
>>> archives = vo.regsearch(servicetype='sia1', waveband='x-ray')
>>> pos = SkyCoord.from_name('Cas A')

@@ -382,3 +402,3 @@ >>> len(archives) # doctest: +IGNORE_OUTPUT

>>> colls = vo.regsearch(keywords=['NVSS'], servicetype='sia')
>>> colls = vo.regsearch(keywords=['NVSS'], servicetype='sia1')
>>> for coll in colls:

@@ -407,3 +427,3 @@ ... print(coll.res_title, coll.access_url)

>>> import pyvo as vo
>>> colls = vo.regsearch(keywords=["NVSS"], servicetype='sia')
>>> colls = vo.regsearch(keywords=["NVSS"], servicetype='sia1')
>>> nvss = colls["NVSS"]

@@ -472,3 +492,3 @@ >>> nvss.res_title

>>> colls = vo.regsearch(keywords=["NVSS"], servicetype='sia')
>>> colls = vo.regsearch(keywords=["NVSS"], servicetype='sia1')
>>> for coll in colls:

@@ -484,3 +504,3 @@ ... print(coll.ivoid)

>>> nvss = vo.registry.search(ivoid='ivo://nasa.heasarc/skyview/nvss')[0].get_service('sia')
>>> nvss = vo.registry.search(ivoid='ivo://nasa.heasarc/skyview/nvss')[0].get_service('sia1')
>>> nvss.search(pos=(350.85, 58.815),size=0.25,format="image/fits")

@@ -518,3 +538,3 @@ <DALResultsTable length=1>

>>> res.get_contact()
'GAVO Data Center Team (++49 6221 54 1837) <gavo@ari.uni-heidelberg.de>'
'GAVO Data Centre Team (+49 6221 54 1837) <gavo@ari.uni-heidelberg.de>'

@@ -613,3 +633,3 @@ Finally, the registry has an idea of what kind of tables are published

... pass # svc_rec.ivoid, msg, svc_rec.get_contact()))
... if i == 5:
... if i == 2:
... break

@@ -616,0 +636,0 @@ >>> total_result = vstack(results) # doctest: +IGNORE_WARNINGS

Metadata-Version: 2.1
Name: pyvo
Version: 1.5.2
Version: 1.5.3
Summary: Astropy affiliated package for accessing Virtual Observatory data and services

@@ -5,0 +5,0 @@ Author: the PyVO Developers


@@ -15,2 +15,3 @@ .gitignore

tox.ini
.github/dependabot.yml
.github/workflows/changelog.yml

@@ -25,2 +26,3 @@ .github/workflows/ci_devtests.yml

docs/requirements.txt
docs/samp.rst
docs/_templates/autosummary/base.rst

@@ -42,3 +44,2 @@ docs/_templates/autosummary/class.rst

pyvo/_astropy_init.py
pyvo/conftest.py
pyvo/samp.py

@@ -209,2 +210,3 @@ pyvo/version.py

pyvo/utils/prototype.py
pyvo/utils/testing.py
pyvo/utils/url.py

@@ -211,0 +213,0 @@ pyvo/utils/vocabularies.py

# Licensed under a 3-clause BSD style license - see LICENSE.rst
import os
__all__ = ['__version__', 'test']
__all__ = ['__version__']

@@ -10,5 +9,1 @@ try:

__version__ = ''
# Create the test function for self test
from astropy.tests.runner import TestRunner
test = TestRunner.make_test_runner_in(os.path.dirname(__file__))

@@ -107,3 +107,3 @@ # Licensed under a 3-clause BSD style license - see LICENSE.rst

"""
Mixin for adhoc:service functionallity for results classes.
Mixin for adhoc:service functionality for results classes.
"""

@@ -173,3 +173,110 @@

"""
    def _iter_datalinks_from_dlblock(self, datalink_service):
        """yields datalinks from the current rows using a datalink
        service RESOURCE.
        """
        remaining_ids = []    # remaining IDs to be processed
        current_batch = None  # retrieved but not returned yet
        current_ids = []      # retrieved but not returned
        processed_ids = []    # retrieved and returned IDs
        batch_size = None     # size of the batch

        for row in self:
            if not current_ids:
                if batch_size is None:
                    # first call.
                    self.query = DatalinkQuery.from_resource(
                        [_ for _ in self],
                        self._datalink,
                        session=self._session,
                        original_row=row)
                    remaining_ids = self.query['ID']
                if not remaining_ids:
                    # we are done
                    return
                if batch_size:
                    # subsequent calls are limited to batch size
                    self.query['ID'] = remaining_ids[:batch_size]
                current_batch = self.query.execute(post=True)
                current_ids = list(OrderedDict.fromkeys(
                    [_ for _ in current_batch.to_table()['ID']]))
                if not current_ids:
                    raise DALServiceError(
                        'Could not retrieve datalinks for: {}'.format(
                            ', '.join([_ for _ in remaining_ids])))
                batch_size = len(current_ids)
            id1 = current_ids.pop(0)
            processed_ids.append(id1)
            remaining_ids.remove(id1)
            yield current_batch.clone_byid(
                id1,
                original_row=row)

    @staticmethod
    def _guess_access_format(row):
        """returns a guess for the format of what _guess_access_url will
        return.

        This tries a few heuristics based on how obscore or SIA records
        might be marked up.  It will return None if row does not look as
        if it contained an access format.  Note that the heuristics are
        tried in sequence; for now, we do not define the sequence of
        heuristics.
        """
        if hasattr(row, "access_format"):
            return row.access_format
        if "access_format" in row:
            return row["access_format"]

        access_format = row.getbyutype("obscore:access.format"
            ) or row.getbyutype("ssa:Access.Format")
        if access_format:
            return access_format

        access_format = row.getbyucd("meta.code.mime"
            ) or row.getbyucd("VOX:Image_Format")
        if access_format:
            return access_format

    @staticmethod
    def _guess_access_url(row):
        """returns a guess for a URI to a data product in row.

        This tries a few heuristics based on how obscore or SIA records
        might be marked up.  It will return None if row does not look as
        if it contained a product access url.  Note that the heuristics
        are tried in sequence; for now, we do not define the sequence of
        heuristics.
        """
        if hasattr(row, "access_url"):
            return row.access_url
        if "access_url" in row:
            return row["access_url"]

        access_url = row.getbyutype("obscore:access.reference"
            ) or row.getbyutype("ssa:Access.Reference")
        if access_url:
            return access_url

        access_url = row.getbyucd("meta.ref.url"
            ) or row.getbyucd("VOX:Image_AccessReference")
        if access_url:
            return access_url

    def _iter_datalinks_from_product_rows(self):
        """yield datalinks from self's rows if they describe
        datalink-valued products.
        """
        for row in self:
            # TODO: we should be more careful about whitespace, case
            # and perhaps more parameters in the following comparison
            if self._guess_access_format(row) == DATALINK_MIME_TYPE:
                access_url = self._guess_access_url(row)
                if access_url is not None:
                    yield DatalinkResults.from_result_url(
                        access_url,
                        original_row=row)
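The attribute-then-column-then-UCD fallback used by these heuristics can be sketched with a minimal stand-in row; ``FakeRow`` and its ``_ucds`` mapping are hypothetical illustrations, not pyvo classes:

```python
class FakeRow(dict):
    """Hypothetical stand-in for a DAL result row with UCD lookup."""

    def getbyucd(self, ucd):
        return self.get("_ucds", {}).get(ucd)


def guess_access_url(row):
    # Prefer an explicit access_url column, then fall back to
    # UCD-based heuristics; return None if nothing matches.
    if "access_url" in row:
        return row["access_url"]
    return (row.getbyucd("meta.ref.url")
            or row.getbyucd("VOX:Image_AccessReference"))


obscore_row = FakeRow(access_url="http://example.org/a.fits")
sia1_row = FakeRow(
    _ucds={"VOX:Image_AccessReference": "http://example.org/b.fits"})
```

The same layered lookup order applies to the access format heuristic.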
def iter_datalinks(self):

@@ -191,40 +298,10 @@ """

        self._datalink = None
        remaining_ids = []    # remaining IDs to be processed
        current_batch = None  # retrieved but not returned yet
        current_ids = []      # retrieved but not returned
        processed_ids = []    # retrieved and returned IDs
        batch_size = None     # size of the batch
        for row in self:
            if self._datalink:
                if not current_ids:
                    if batch_size is None:
                        # first call.
                        self.query = DatalinkQuery.from_resource(
                            [_ for _ in self], self._datalink,
                            session=self._session)
                        remaining_ids = self.query['ID']
                    if not remaining_ids:
                        # we are done
                        return
                    if batch_size:
                        # subsequent calls are limited to batch size
                        self.query['ID'] = remaining_ids[:batch_size]
                    current_batch = self.query.execute(post=True)
                    current_ids = list(OrderedDict.fromkeys(
                        [_ for _ in current_batch.to_table()['ID']]))
                    if not current_ids:
                        raise DALServiceError(
                            'Could not retrieve datalinks for: {}'.format(
                                ', '.join([_ for _ in remaining_ids])))
                    batch_size = len(current_ids)
                id1 = current_ids.pop(0)
                processed_ids.append(id1)
                remaining_ids.remove(id1)
                yield current_batch.clone_byid(id1)
            elif row.access_format == DATALINK_MIME_TYPE:
                yield DatalinkResults.from_result_url(row.getdataurl())
            else:
                yield None
        if self._datalink is None:
            yield from self._iter_datalinks_from_product_rows()
        else:
            yield from self._iter_datalinks_from_dlblock(self._datalink)
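The batching above deduplicates IDs while preserving row order; the core of that bookkeeping can be sketched independently of pyvo (``unique_in_order`` and ``batches`` are illustrative helpers, not pyvo API):

```python
from collections import OrderedDict


def unique_in_order(ids):
    # Drop duplicate IDs while keeping first-seen order, as
    # iter_datalinks does with the IDs returned by a datalink batch.
    return list(OrderedDict.fromkeys(ids))


def batches(remaining_ids, batch_size):
    # Yield service-sized chunks of the remaining IDs.
    for start in range(0, len(remaining_ids), batch_size):
        yield remaining_ids[start:start + batch_size]
```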
class DatalinkRecordMixin:

@@ -372,2 +449,4 @@ """

"""
original_row = kwargs.pop("original_row", None)
input_params = _get_input_params_from_resource(resource)

@@ -409,3 +488,7 @@ # get params outside of any group

return cls(accessurl, session=session, **query_params)
        return cls(
            accessurl,
            session=session,
            original_row=original_row,
            **query_params)

@@ -428,2 +511,4 @@ def __init__(

"""
self.original_row = keywords.pop("original_row", None)
super().__init__(baseurl, session=session, **keywords)

@@ -450,4 +535,7 @@

"""
        return DatalinkResults(self.execute_votable(post=post),
                               url=self.queryurl, session=self._session)
        return DatalinkResults(
            self.execute_votable(post=post),
            url=self.queryurl,
            original_row=self.original_row,
            session=self._session)

@@ -498,2 +586,6 @@

    def __init__(self, *args, **kwargs):
        self.original_row = kwargs.pop("original_row", None)

        super().__init__(*args, **kwargs)
def getrecord(self, index):

@@ -514,3 +606,3 @@ """

-------
REc
Rec
a dictionary-like wrapper containing the result record metadata.

@@ -581,6 +673,6 @@

def clone_byid(self, id):
def clone_byid(self, id, *, original_row=None):
"""
return a clone of the object with results and corresponding
resources matching a given id
resources matching a given id

@@ -610,3 +702,3 @@ Returns

copy_tb.resources.remove(x)
return DatalinkResults(copy_tb)
return DatalinkResults(copy_tb, original_row=original_row)

@@ -643,3 +735,9 @@ def getdataset(self, timeout=None):

@classmethod
def from_result_url(cls, result_url, *, session=None, original_row=None):
res = super().from_result_url(result_url, session=session)
res.original_row = original_row
return res
class SodaRecordMixin:

@@ -646,0 +744,0 @@ """

@@ -28,4 +28,7 @@ # Licensed under a 3-clause BSD style license - see LICENSE.rst

>>> mime2extension('application/fits')
'fits'
>>> mime2extension('image/jpeg')
'jpg'
>>> mime2extension('application/x-zed', 'dat')
'dat'

@@ -50,9 +53,12 @@ Parameters

if isinstance(mimetype, str):
mimetype = mimetype.encode('utf-8')
if isinstance(mimetype, bytes):
mimetype = mimetype.decode('utf-8')
ext = mimetypes.guess_extension(mimetype, strict=False)
return ext
if ext is None:
return default
return ext.lstrip(".")
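The reworked ``mime2extension`` above decodes bytes input, guesses non-strictly, falls back to a caller-supplied default, and strips the leading dot so callers can build ``base + "." + ext`` filenames. A simplified stdlib sketch of that flow (it omits pyvo's special-casing of astronomy types such as ``application/fits``; the function name is illustrative):

```python
import mimetypes

def mime2extension_sketch(mimetype, default=None):
    # Normalize bytes input to str, as the patched helper now does.
    if isinstance(mimetype, bytes):
        mimetype = mimetype.decode('utf-8')
    # strict=False also consults the non-IANA mapping.
    ext = mimetypes.guess_extension(mimetype, strict=False)
    if ext is None:
        # Unknown media type: hand back the caller's default verbatim.
        return default
    # guess_extension returns e.g. ".png"; drop the dot for filename use.
    return ext.lstrip(".")
```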
def mime_object_maker(url, mimetype, session=None):

@@ -59,0 +65,0 @@ """

@@ -519,6 +519,7 @@ # Licensed under a 3-clause BSD style license - see LICENSE.rst

"""
utype = utype.lower()
try:
iterchain = (
self.getdesc(fieldname) for fieldname in self.fieldnames)
iterchain = (field for field in iterchain if field.utype == utype)
iterchain = (field for field in iterchain if (field.utype or "").lower() == utype)
return next(iterchain).name
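The hunk above makes the utype lookup case-insensitive and tolerant of fields that declare no utype at all. The same comparison can be sketched over plain ``(name, utype)`` pairs (a stand-in for pyvo's field descriptors, not its actual API):

```python
def fieldname_with_utype(fields, utype):
    """Return the name of the first field whose utype matches,
    comparing case-insensitively and treating a missing utype as ""."""
    utype = utype.lower()
    for name, field_utype in fields:
        # (field_utype or "") guards against None, as in the patch.
        if (field_utype or "").lower() == utype:
            return name
    return None
```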

@@ -884,2 +885,5 @@ except StopIteration:

base = base.replace("/", "_"
).replace("\\", "_")
# be efficient when writing a bunch of files into the same directory
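The ``replace`` chain above sanitizes a dataset title before it becomes a filename, so that path separators cannot create or escape into subdirectories. As a tiny standalone sketch (helper name illustrative):

```python
def sanitize_base(base):
    # Both separator styles are neutralized, since a title like
    # "ogsa/dai output" would otherwise be written into a subdirectory.
    return base.replace("/", "_").replace("\\", "_")
```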

@@ -886,0 +890,0 @@ # in succession

@@ -259,3 +259,3 @@ # Licensed under a 3-clause BSD style license - see LICENSE.rst

"""
a class for preparing an query to a Cone Search service. Query constraints
a class for preparing a query to a Cone Search service. Query constraints
are added via its service type-specific methods. The various execute()

@@ -262,0 +262,0 @@ functions will submit the query and return the results.

@@ -332,3 +332,3 @@ # Licensed under a 3-clause BSD style license - see LICENSE.rst

"""
a class for preparing an query to an SIA service. Query constraints
a class for preparing a query to an SIA service. Query constraints
are added via its service type-specific methods. The various execute()

@@ -919,3 +919,3 @@ functions will submit the query and return the results.

def suggest_extension(self, default=None):
def suggest_extension(self, default='dat'):
"""

@@ -922,0 +922,0 @@ returns a recommended filename extension for the dataset described

@@ -226,3 +226,3 @@ # Licensed under a 3-clause BSD style license - see LICENSE.rst

"""
a class for preparing an query to an SLA service. Query constraints
a class for preparing a query to an SLA service. Query constraints
are added via its service type-specific methods. The various execute()

@@ -229,0 +229,0 @@ functions will submit the query and return the results.

@@ -295,3 +295,3 @@ # Licensed under a 3-clause BSD style license - see LICENSE.rst

"""
a class for preparing an query to an SSA service. Query constraints
a class for preparing a query to an SSA service. Query constraints
are added via its service type-specific properties and methods. Once

@@ -416,3 +416,3 @@ all the constraints are set, one of the various execute() functions

valerr = ValueError(
'Radius must be exactly one value, expressing degrees')
'Diameter must be exactly one value, expressing degrees')

@@ -419,0 +419,0 @@ try:

@@ -1014,4 +1014,4 @@ # Licensed under a 3-clause BSD style license - see LICENSE.rst

"""
a class for preparing an query to an TAP service. Query constraints
are added via its service type-specific methods. The various execute()
a class for preparing a query to a TAP service. Query constraints
are added via service type-specific methods. The various execute()
functions will submit the query and return the results.

@@ -1018,0 +1018,0 @@

@@ -12,3 +12,5 @@ #!/usr/bin/env python

from pyvo.dal.adhoc import DatalinkResults
from pyvo.utils import vocabularies
from pyvo.dal.sia2 import SIA2Results
from pyvo.dal.tap import TAPResults
from pyvo.utils import testing, vocabularies

@@ -44,2 +46,13 @@ from astropy.utils.data import get_pkg_data_contents, get_pkg_data_filename

@pytest.fixture()
def datalink_product(mocker):
def callback(request, context):
return get_pkg_data_contents('data/datalink/datalink.xml')
with mocker.register_uri(
'GET', 'http://example.com/datalink.xml', content=callback
) as matcher:
yield matcher
@pytest.fixture()
def obscore_datalink(mocker):

@@ -115,2 +128,4 @@ def callback(request, context):

assert datalinks.original_row["accsize"] == 100800
row = datalinks[0]

@@ -138,3 +153,5 @@ assert row.semantics == "#progenitor"

assert len([_ for _ in results.iter_datalinks()]) == 3
dls = list(results.iter_datalinks())
assert len(dls) == 3
assert dls[0].original_row["obs_collection"] == "MACHO"

@@ -150,2 +167,4 @@

datalinks = DatalinkResults.from_result_url('http://example.com/proc')
assert datalinks.original_row is None
res = [r["access_url"] for r in datalinks.bysemantics("#this")]

@@ -200,1 +219,97 @@ assert len(res) == 1

assert res[3].endswith("when-will-it-be-back")
@pytest.mark.filterwarnings("ignore::astropy.io.votable.exceptions.E02")
@pytest.mark.usefixtures('datalink_product', 'datalink_vocabulary')
class TestIterDatalinksProducts:
"""Tests for producing datalinks from tables containing links to
datalink documents.
"""
def test_no_access_format(self):
res = testing.create_dalresults([
{"name": "access_url", "datatype": "char", "arraysize": "*",
"utype": "obscore:access.reference"}],
[("http://foo.bar/baz.jpeg",)],
resultsClass=TAPResults)
assert list(res.iter_datalinks()) == []
def test_obscore_utype(self):
res = testing.create_dalresults([
{"name": "data_product", "datatype": "char", "arraysize": "*",
"utype": "obscore:access.reference"},
{"name": "content_type", "datatype": "char", "arraysize": "*",
"utype": "obscore:access.format"},],
[("http://example.com/datalink.xml",
"application/x-votable+xml;content=datalink")],
resultsClass=TAPResults)
links = list(res.iter_datalinks())
assert len(links) == 1
assert (next(links[0].bysemantics("#this"))["access_url"]
== "http://dc.zah.uni-heidelberg.de/getproduct/flashheros/data/ca90/f0011.mt")
def test_sia2_record(self):
res = testing.create_dalresults([
{"name": "access_url", "datatype": "char", "arraysize": "*",
"utype": "obscore:access.reference"},
{"name": "access_format", "datatype": "char", "arraysize": "*",
"utype": "obscore:access.format"},],
[("http://example.com/datalink.xml",
"application/x-votable+xml;content=datalink")],
resultsClass=SIA2Results)
links = list(res.iter_datalinks())
assert len(links) == 1
assert (next(links[0].bysemantics("#this"))["access_url"]
== "http://dc.zah.uni-heidelberg.de/getproduct/flashheros/data/ca90/f0011.mt")
def test_sia1_record(self):
res = testing.create_dalresults([
{"name": "product", "datatype": "char", "arraysize": "*",
"ucd": "VOX:Image_AccessReference"},
{"name": "mime", "datatype": "char", "arraysize": "*",
"ucd": "VOX:Image_Format"},],
[("http://example.com/datalink.xml",
"application/x-votable+xml;content=datalink")],
resultsClass=TAPResults)
links = list(res.iter_datalinks())
assert len(links) == 1
assert (next(links[0].bysemantics("#this"))["access_url"]
== "http://dc.zah.uni-heidelberg.de/getproduct/flashheros/data/ca90/f0011.mt")
def test_ssap_record(self):
res = testing.create_dalresults([
{"name": "product", "datatype": "char", "arraysize": "*",
"utype": "ssa:access.reference"},
{"name": "mime", "datatype": "char", "arraysize": "*",
"utype": "ssa:access.format"},],
[("http://example.com/datalink.xml",
"application/x-votable+xml;content=datalink")],
resultsClass=TAPResults)
links = list(res.iter_datalinks())
assert len(links) == 1
assert (next(links[0].bysemantics("#this"))["access_url"]
== "http://dc.zah.uni-heidelberg.de/getproduct/flashheros/data/ca90/f0011.mt")
def test_generic_record(self):
# The meta.code.mime and meta.ref.url UCDs are perhaps too
# generic. To ensure a somewhat predictable behaviour,
# we at least make sure we pick the first of possibly multiple
# pairs (not that this would preclude arbitrary amounts of
# chaos).
res = testing.create_dalresults([
{"name": "access_url", "datatype": "char", "arraysize": "*",
"ucd": "meta.ref.url"},
{"name": "access_format", "datatype": "char", "arraysize": "*",
"utype": "meta.code.mime"},
{"name": "alt_access_url", "datatype": "char", "arraysize": "*",
"ucd": "meta.ref.url"},
{"name": "alt_access_format", "datatype": "char", "arraysize": "*",
"utype": "meta.code.mime"},],
[("http://example.com/datalink.xml",
"application/x-votable+xml;content=datalink",
"http://example.com/bad-pick.xml",
"application/x-votable+xml;content=datalink",)],
resultsClass=TAPResults)
links = list(res.iter_datalinks())
assert len(links) == 1
assert (next(links[0].bysemantics("#this"))["access_url"]
== "http://dc.zah.uni-heidelberg.de/getproduct/flashheros/data/ca90/f0011.mt")

@@ -7,2 +7,3 @@ #!/usr/bin/env python

from functools import partial
import os
import re

@@ -92,1 +93,19 @@

assert service["FORMAT"] == "Unsupported"
@pytest.mark.usefixtures('sia')
class TestNameMaking:
def test_slash_replace(self):
res = SIAService('http://example.com/sia').search(pos=(288, 15))
res["imageTitle"][0] = "ogsa/dai output"
assert res[0].make_dataset_filename() == os.path.join(".", "ogsa_dai_output.fits")
def test_default_for_broken_media_type(self):
res = SIAService('http://example.com/sia').search(pos=(288, 15))
res["mime"][0] = "application/x-youcannotknowthis"
assert res[0].make_dataset_filename() == os.path.join(".", "Test_Observation.dat")
def test_default_media_type_adaption(self):
res = SIAService('http://example.com/sia').search(pos=(288, 15))
res["mime"][0] = "image/png"
assert res[0].make_dataset_filename() == os.path.join(".", "Test_Observation.png")

@@ -171,3 +171,3 @@ #!/usr/bin/env python

def test_fov(self):
results = search(CADC_SIA_URL, field_of_view=(10, 20), maxrec=5)
results = search(CADC_SIA_URL, field_of_view=(10, 20), collection='TESS', maxrec=5)
assert len(results) == 5

@@ -178,3 +178,3 @@ # how to test values

def test_spatial_res(self):
results = search(CADC_SIA_URL, spatial_resolution=(1, 2), maxrec=5)
results = search(CADC_SIA_URL, spatial_resolution=(1, 2), collection='MACHO', maxrec=5)
assert len(results) == 5

@@ -186,3 +186,3 @@ for rr in results:

def test_spec_resp(self):
results = search(CADC_SIA_URL, spectral_resolving_power=(1, 2), maxrec=5)
results = search(CADC_SIA_URL, spectral_resolving_power=(1, 2), collection='NEOSSAT', maxrec=5)
assert len(results) == 5

@@ -194,4 +194,3 @@ for rr in results:

def test_exptime(self):
results = search(CADC_SIA_URL, exptime=(1, 2),
maxrec=5)
results = search(CADC_SIA_URL, exptime=(1, 2), collection='GEMINI', maxrec=5)
assert len(results) == 5

@@ -226,13 +225,13 @@ for rr in results:

def test_collection(self):
results = search(CADC_SIA_URL, collection='CFHT', maxrec=5)
results = search(CADC_SIA_URL, collection='IRIS', maxrec=5)
assert len(results) == 5
for rr in results:
assert rr.obs_collection == 'CFHT'
assert rr.obs_collection == 'IRIS'
@pytest.mark.filterwarnings("ignore::pyvo.dal.exceptions.DALOverflowWarning")
def test_instrument(self):
results = search(CADC_SIA_URL, instrument='SCUBA-2', maxrec=5)
results = search(CADC_SIA_URL, instrument='IRAS', maxrec=5)
assert len(results) == 5
for rr in results:
assert rr.instrument_name == 'SCUBA-2'
assert rr.instrument_name == 'IRAS'

@@ -248,6 +247,6 @@ @pytest.mark.filterwarnings("ignore::pyvo.dal.exceptions.DALOverflowWarning")

def test_target_name(self):
results = search(CADC_SIA_URL, target_name='OGF:t028', maxrec=5)
results = search(CADC_SIA_URL, target_name='BarnardStar', collection='NEOSSAT', maxrec=5)
assert len(results) == 5
for rr in results:
assert rr.target_name == 'OGF:t028'
assert rr.target_name == 'BarnardStar'

@@ -254,0 +253,0 @@ @pytest.mark.filterwarnings("ignore::pyvo.dal.exceptions.DALOverflowWarning")

@@ -789,1 +789,13 @@ # Licensed under a 3-clause BSD style license - see LICENSE.rst

assert func.form == "ivo_hasword(haystack TEXT, needle TEXT) -> INTEGER"
def test_get_endpoint_candidates():
# Directly instantiate the TAPService with a known base URL
svc = TAPService("http://astroweb.projects.phys.ucl.ac.uk:8000/tap")
# Check if the correct endpoint candidates are generated
expected_urls = [
"http://astroweb.projects.phys.ucl.ac.uk:8000/tap/capabilities",
"http://astroweb.projects.phys.ucl.ac.uk:8000/capabilities"
]
assert svc._get_endpoint_candidates("capabilities") == expected_urls

@@ -21,20 +21,17 @@ # Licensed under a 3-clause BSD style license - see LICENSE.rst

class EndpointMixin():
def _get_endpoint(self, endpoint):
# finds the endpoint relative to the base url or its parent
# and returns its content in raw format
def _get_endpoint_candidates(self, endpoint):
urlcomp = urlparse(self.baseurl)
# Include the port number if present
netloc = urlcomp.hostname
if urlcomp.port:
netloc += f':{urlcomp.port}'
curated_baseurl = f'{urlcomp.scheme}://{netloc}{urlcomp.path}'
# do not trust baseurl as it might contain query or fragments
urlcomp = urlparse(self.baseurl)
curated_baseurl = '{}://{}{}'.format(urlcomp.scheme,
urlcomp.hostname,
urlcomp.path)
if not endpoint:
raise AttributeError('endpoint required')
ep_urls = [
'{baseurl}/{endpoint}'.format(baseurl=curated_baseurl,
endpoint=endpoint),
url_sibling(curated_baseurl, endpoint)
]
for ep_url in ep_urls:
return [f'{curated_baseurl}/{endpoint}', url_sibling(curated_baseurl, endpoint)]
def _get_endpoint(self, endpoint):
for ep_url in self._get_endpoint_candidates(endpoint):
try:

@@ -47,5 +44,3 @@ response = self._session.get(ep_url, stream=True)

else:
raise DALServiceError(
"No working {endpoint} endpoint provided".format(
endpoint=endpoint))
raise DALServiceError(f"No working {endpoint} endpoint provided")

@@ -52,0 +47,0 @@ return response.raw
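The refactored ``EndpointMixin`` above first curates the base URL (dropping query and fragment but keeping any explicit port), then tries the endpoint below the base URL and as a sibling of its last path component. A stdlib sketch of that candidate generation, with ``urljoin`` standing in for pyvo's ``url_sibling``:

```python
from urllib.parse import urlparse, urljoin

def get_endpoint_candidates(baseurl, endpoint):
    parts = urlparse(baseurl)
    # Rebuild the netloc so a non-default port survives curation.
    netloc = parts.hostname or ""
    if parts.port:
        netloc += f":{parts.port}"
    # Query strings and fragments in baseurl are deliberately dropped.
    curated = f"{parts.scheme}://{netloc}{parts.path}"
    # Candidate 1: endpoint below the base URL.
    # Candidate 2: endpoint replacing the last path segment
    # (relative-reference resolution does exactly that).
    return [f"{curated}/{endpoint}", urljoin(curated, endpoint)]
```

For ``http://astroweb.projects.phys.ucl.ac.uk:8000/tap`` this reproduces the two URLs asserted in ``test_get_endpoint_candidates`` above.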

@@ -400,3 +400,3 @@ # Licensed under a 3-clause BSD style license - see LICENSE.rst

def to_service(self):
def to_service(self, *, session=None):
if self.type == "vr:webbrowser":

@@ -416,5 +416,5 @@ return _BrowserService(self.access_url)

if service_class == sia2.SIA2Service:
return service_class(self.access_url, check_baseurl=False)
return service_class(self.access_url, check_baseurl=False, session=session)
else:
return service_class(self.access_url)
return service_class(self.access_url, session=session)

@@ -773,3 +773,4 @@ def supports(self, standard_id):

service_type: str = None,
lax: bool = True):
lax: bool = True,
session: object = None):
"""

@@ -809,2 +810,6 @@ return an appropriate DALService subclass for this resource that

session : object
optional requests session to use to communicate with the
constructed service.

Returns

@@ -820,3 +825,3 @@ -------

return self.get_interface(service_type, lax, std_only=True
).to_service()
).to_service(session=session)

@@ -851,11 +856,12 @@ @property

============ =========================================
============ ===========================================
Service type Use the argument syntax for
============ =========================================
============ ===========================================
catalog :py:meth:`pyvo.dal.scs.SCSService.search`
image :py:meth:`pyvo.dal.sia.SIAService.search`
spectrum :py:meth:`pyvo.dal.ssa.SSAService.search`
sia :py:meth:`pyvo.dal.sia.SIAService.search`
sia2 :py:meth:`pyvo.dal.sia2.SIA2Service.search`
ssa :py:meth:`pyvo.dal.ssa.SSAService.search`
line :py:meth:`pyvo.dal.sla.SLAService.search`
database *not yet supported*
============ =========================================
============ ===========================================

@@ -1004,2 +1010,3 @@ Raises

res.description = table_row["table_description"]
res.utype = table_row["table_utype"]
res._columns = [

@@ -1032,3 +1039,4 @@ self._build_vosi_column(row)

tables = svc.run_sync(
"""SELECT table_name, table_description, table_index, table_title
"""SELECT table_name, table_description, table_index, table_title,
table_utype
FROM rr.res_table

@@ -1035,0 +1043,0 @@ WHERE ivoid={}""".format(

@@ -37,2 +37,3 @@ # Licensed under a 3-clause BSD style license - see LICENSE.rst

("sia", "sia"),
("sia1", "sia"),
# SIA2 is irregular

@@ -211,3 +212,3 @@ # funky scheme used by SIA2 without breaking everything else

"""
A contraint using plain text to match against title, description,
A constraint using plain text to match against title, description,
subjects, and person names.

@@ -222,3 +223,4 @@ """

----------
*words : tuple of str
*words : str
One or more string arguments.
It is recommended to pass multiple words in multiple strings

@@ -228,2 +230,7 @@ arguments. You can pass in phrases (i.e., multiple words

significantly between different registries.
Examples
--------
>>> from pyvo import registry
>>> registry.Freetext("Gamma", "Ray", "Burst") # doctest: +IGNORE_OUTPUT
"""

@@ -323,5 +330,5 @@ self.words = words

* ``image`` (image services; at this point equivalent to sia, but
scheduled to include sia2, too)
* ``sia`` (SIAP version 1 services)
* ``sia``, ``sia1`` (SIAP version 1 services; prefer ``sia1`` for symmetry,
although ``sia`` will be kept as the official IVOA short name for SIA1)
* ``sia2`` (SIAP version 2 services)

@@ -334,2 +341,4 @@ * ``spectrum``, ``ssa``, ``ssap`` (all synonymous for spectral

* ``tap``, ``table`` (synonymous for TAP services, prefer ``tap``)
* ``image`` (a to-be-deprecated alias for sia1)
* ``spectrum`` (a to-be-deprecated alias for ssap)

@@ -362,3 +371,3 @@ You can also pass in the standards' ivoid (which

----------
*stds : tuple of str
*stds : str
one or more standards identifiers. The constraint will

@@ -438,3 +447,3 @@ match records that have any of them.

----------
*bands : tuple of strings
*bands : strings
One or more of the terms given in http://www.ivoa.net/rdf/messenger.

@@ -536,2 +545,3 @@ The constraint matches when a resource declares at least

ivoid : string
One or more string arguments.
The IVOA identifier of the resource to match. As RegTAP

@@ -558,3 +568,3 @@ requires lowercasing ivoids on ingestion, the constraint

patterns : tuple of strings
patterns : strings
SQL patterns (i.e., ``%`` is 0 or more characters) for

@@ -618,2 +628,6 @@ UCDs. The constraint will match when a resource has

>>> resources = registry.Spatial((SkyCoord("23d +3d"), 3))
Or you can provide the radius angle as an Astropy Quantity:
>>> resources = registry.Spatial((SkyCoord("23d +3d"), 1*u.rad))
"""

@@ -635,3 +649,3 @@ _keyword = "spatial"

as ASCII MOCs, SkyCoords as points, and a pair of a
SkyCoord and a float as a circle. Other types (proper
SkyCoord and a float or Quantity as a circle. Other types (proper
geometries or MOCPy objects) might be supported in the

@@ -666,5 +680,12 @@ future.

if isinstance(geom_spec[0], SkyCoord):
# If radius given is astropy quantity, then convert to degrees
if isinstance(geom_spec[1], u.Quantity):
if geom_spec[1].unit.physical_type != 'angle':
raise ValueError("Radius quantity is not of type angle.")
radius = geom_spec[1].to(u.deg).value
else:
radius = geom_spec[1]
geom = tomoc(format_function_call("CIRCLE",
[geom_spec[0].ra.value, geom_spec[0].dec.value,
geom_spec[1]]))
radius]))
else:
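The new branch above accepts an astropy ``Quantity`` radius, rejecting non-angular units and converting angles to degrees before building the ``CIRCLE``. The conversion can be sketched without astropy using a small unit table (a stand-in for ``Quantity.to(u.deg)``; unit names and the helper are illustrative only):

```python
import math

# Degrees per unit for common angular units.
ANGLE_TO_DEG = {
    "deg": 1.0,
    "arcmin": 1.0 / 60.0,
    "arcsec": 1.0 / 3600.0,
    "rad": 180.0 / math.pi,
}

def radius_in_degrees(value, unit=None):
    # A bare number is taken as degrees, matching the constraint's
    # documented behaviour for a plain float radius.
    if unit is None:
        return float(value)
    if unit not in ANGLE_TO_DEG:
        # Mirrors the patch's rejection of e.g. a radius in metres.
        raise ValueError("Radius quantity is not of type angle.")
    return value * ANGLE_TO_DEG[unit]
```

All three radii exercised in ``test_SkyCoord_Circle_RadiusQuantity`` (3 deg, 180 arcmin, 10800 arcsec) collapse to the same 3-degree circle.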

@@ -924,4 +945,3 @@ geom = tomoc(format_function_call("POINT", geom_spec))

joined_tables = ["rr.resource", "rr.capability", "rr.interface",
"rr.alt_identifier"
joined_tables = ["rr.resource", "rr.capability", "rr.interface"
] + list(extra_tables)

@@ -928,0 +948,0 @@

@@ -34,3 +34,3 @@ #!/usr/bin/env python

@pytest.fixture(name='capabilities')
def _capabilities(mocker):
def _capabilities(mocker, scope="session"):
def callback(request, context):

@@ -56,3 +56,3 @@ return get_pkg_data_contents('data/capabilities.xml')

@pytest.fixture(name='regtap_pulsar_distance_response')
def _regtap_pulsar_distance_response(mocker):
def _regtap_pulsar_distance_response(mocker, scope="session"):
with mocker.register_uri(

@@ -65,3 +65,3 @@ 'POST', REGISTRY_BASEURL + '/sync',

@pytest.fixture()
def keywords_fixture(mocker):
def keywords_fixture(mocker, scope="session"):
def keywordstest_callback(request, context):

@@ -89,3 +89,3 @@ data = dict(parse_qsl(request.body))

@pytest.fixture()
def single_keyword_fixture(mocker):
def single_keyword_fixture(mocker, scope="session"):
def keywordstest_callback(request, context):

@@ -109,3 +109,3 @@ data = dict(parse_qsl(request.body))

@pytest.fixture()
def servicetype_fixture(mocker):
def servicetype_fixture(mocker, scope="session"):
def servicetypetest_callback(request, context):

@@ -131,3 +131,3 @@ data = dict(parse_qsl(request.body))

@pytest.fixture()
def waveband_fixture(mocker):
def waveband_fixture(mocker, scope="session"):
def wavebandtest_callback(request, content):

@@ -149,3 +149,3 @@ data = dict(parse_qsl(request.body))

@pytest.fixture()
def datamodel_fixture(mocker):
def datamodel_fixture(mocker, scope="session"):
def datamodeltest_callback(request, content):

@@ -172,3 +172,3 @@ data = dict(parse_qsl(request.body))

@pytest.fixture()
def aux_fixture(mocker):
def aux_fixture(mocker, scope="session"):
def auxtest_callback(request, context):

@@ -190,3 +190,3 @@ data = dict(parse_qsl(request.body))

@pytest.fixture(name='multi_interface_fixture')
def _multi_interface_fixture(mocker):
def _multi_interface_fixture(mocker, scope="session"):
# to update this, run

@@ -209,3 +209,3 @@ # import requests

@pytest.fixture(name='flash_service')
def _flash_service(multi_interface_fixture):
def _flash_service(multi_interface_fixture, scope="session"):
return regtap.search(

@@ -700,3 +700,3 @@ ivoid="ivo://org.gavo.dc/flashheros/q/ssa")[0]

assert (rsc.get_contact()
== "GAVO Data Center Team (++49 6221 54 1837)"
== "GAVO Data Centre Team (+49 6221 54 1837)"
" <gavo@ari.uni-heidelberg.de>")

@@ -833,10 +833,10 @@

# the stuff might change upstream any time and then break our unit tests.
@pytest.fixture(name='flash_tables')
def _flash_tables():
@pytest.fixture(name='obscore_tables')
def _obscore_tables(scope="session"):
rsc = _makeRegistryRecord(
ivoid="ivo://org.gavo.dc/flashheros/q/ssa")
ivoid="ivo://org.gavo.dc/__system__/obscore/obscore")
return rsc.get_tables()
@pytest.mark.usefixtures("flash_tables")
@pytest.mark.usefixtures("obscore_tables")
class TestGetTables:

@@ -854,23 +854,22 @@ @pytest.mark.remote_data

@pytest.mark.remote_data
def test_get_tables_names(self, flash_tables):
assert (list(sorted(flash_tables.keys()))
== ["flashheros.data", "ivoa.obscore"])
def test_get_tables_names(self, obscore_tables):
assert (list(sorted(obscore_tables.keys()))
== ["ivoa.obscore"])
@pytest.mark.remote_data
def test_get_tables_table_instance(self, flash_tables):
assert (flash_tables["ivoa.obscore"].name
def test_get_tables_table_instance(self, obscore_tables):
assert (obscore_tables["ivoa.obscore"].name
== "ivoa.obscore")
assert (flash_tables["ivoa.obscore"].description
== "This data collection is queryable in GAVO Data"
" Center's obscore table.")
assert (flash_tables["flashheros.data"].title
== "Flash/Heros SSA table")
assert (obscore_tables["ivoa.obscore"].description[:42]
== "The IVOA-defined obscore table, containing")
assert (obscore_tables["ivoa.obscore"].title
== "GAVO Data Center Obscore Table")
assert (flash_tables["flashheros.data"].origin.ivoid
== "ivo://org.gavo.dc/flashheros/q/ssa")
assert (obscore_tables["ivoa.obscore"].origin.ivoid
== "ivo://org.gavo.dc/__system__/obscore/obscore")
@pytest.mark.remote_data
def test_get_tables_column_meta(self, flash_tables):
def getflashcol(name):
for col in flash_tables["flashheros.data"].columns:
def test_get_tables_column_meta(self, obscore_tables):
def getcol(name):
for col in obscore_tables["ivoa.obscore"].columns:
if name == col.name:

@@ -880,18 +879,19 @@ return col

assert getflashcol("accref").datatype.content == "char"
assert getflashcol("accref").datatype.arraysize == "*"
assert getcol("access_url").datatype.content == "char"
assert getcol("access_url").datatype.arraysize == "*"
# TODO: upstream bug: the following needs to fixed in DaCHS before
# the assertion passes
# assert getflashcol("ssa_region").datatype._extendedtype == "point"
assert getcol("access_format").ucd == 'meta.code.mime'
assert getflashcol("mime").ucd == 'meta.code.mime'
assert getcol("em_min").unit == "m"
assert getflashcol("ssa_specend").unit == "m"
assert (getcol("t_max").utype
== "obscore:char.timeaxis.coverage.bounds.limits.stoptime")
assert (getflashcol("ssa_specend").utype
== "ssa:char.spectralaxis.coverage.bounds.stop")
assert (getcol("t_exptime").description
== "Total exposure time")
assert (getflashcol("ssa_fluxcalib").description
== "Type of flux calibration")
@pytest.mark.remote_data
def test_get_tables_utype(self, obscore_tables):
assert (obscore_tables["ivoa.obscore"].utype
== "ivo://ivoa.net/std/obscore#table-1.1")

@@ -898,0 +898,0 @@

@@ -131,3 +131,3 @@ #!/usr/bin/env python

def test_includeaux(self):
assert (rtcons.Servicetype("http://extstandards/invention", "image"
assert (rtcons.Servicetype("http://extstandards/invention", "sia1"
).include_auxiliary_services().get_search_condition(FAKE_GAVO)

@@ -144,3 +144,3 @@ == "standard_id IN ('http://extstandards/invention',"

" a full standard URI nor one of the bespoke identifiers"
" image, sia, spectrum, ssap, ssa, scs, conesearch, line, slap,"
" image, sia, sia1, spectrum, ssap, ssa, scs, conesearch, line, slap,"
" table, tap, sia2")

@@ -258,2 +258,11 @@

def test_SkyCoord_Circle_RadiusQuantity(self):
for radius in [3*u.deg, 180*u.Unit('arcmin'), 10800*u.Unit('arcsec')]:
cons = registry.Spatial((SkyCoord(3 * u.deg, -30 * u.deg), radius))
assert cons.get_search_condition(FAKE_GAVO) == (
"1 = CONTAINS(MOC(6, CIRCLE(3.0, -30.0, 3.0)), coverage)")
with pytest.raises(ValueError, match="is not of type angle."):
cons = registry.Spatial((SkyCoord(3 * u.deg, -30 * u.deg), (1 * u.m)))
def test_enclosed(self):

@@ -260,0 +269,0 @@ cons = registry.Spatial("0/1-3", intersect="enclosed")

@@ -8,2 +8,2 @@ # Note that we need to fall back to the hard-coded version if either

except Exception:
version = '1.5.2'
version = '1.5.3'

@@ -6,3 +6,3 @@ [tox]

envlist =
py{38,39,310,311,312}-test{,-alldeps,-oldestdeps,-devdeps}{,-online}{,-cov}
py{38,39,310,311,312,313}-test{,-alldeps,-oldestdeps,-devdeps}{,-online}{,-cov}
linkcheck

@@ -30,2 +30,4 @@ codestyle

devdeps: PIP_EXTRA_INDEX_URL = https://pypi.anaconda.org/scientific-python-nightly-wheels/simple https://pypi.anaconda.org/liberfa/simple https://pypi.anaconda.org/astropy/simple
# astropy doesn't yet have a 3.13 compatible release
py313: PIP_EXTRA_INDEX_URL = https://pypi.anaconda.org/liberfa/simple https://pypi.anaconda.org/astropy/simple

@@ -39,6 +41,9 @@ deps =

# astropy doesn't yet have a 3.13 compatible release
py313: astropy>=0.0dev0
oldestdeps: astropy==4.1
# We set a suitably old numpy along with an old astropy, no need to pick up
# deprecations and errors due to their unmatching versions
oldestdeps: numpy==1.16
oldestdeps: numpy==1.19

@@ -45,0 +50,0 @@ online: pytest-rerunfailures

"""Configure Test Suite.
This file is used to configure the behavior of pytest when using the Astropy
test infrastructure. It needs to live inside the package in order for it to
get picked up when running the tests inside an interpreter using
`pyvo.test()`.
"""
import numpy as np
from astropy.utils import minversion
try:
from pytest_astropy_header.display import PYTEST_HEADER_MODULES, TESTED_VERSIONS
ASTROPY_HEADER = True
except ImportError:
ASTROPY_HEADER = False
# Keep this until we require numpy to be >=2.0
if minversion(np, "2.0.0.dev0+git20230726"):
np.set_printoptions(legacy="1.25")
def pytest_configure(config):
"""Configure Pytest with Astropy.
Parameters
----------
config : pytest configuration
"""
if ASTROPY_HEADER:
config.option.astropy_header = True
# Customize the following lines to add/remove entries from the list of
# packages for which version numbers are displayed when running the tests.
PYTEST_HEADER_MODULES['Astropy'] = 'astropy' # noqa
PYTEST_HEADER_MODULES['requests'] = 'requests' # noqa
PYTEST_HEADER_MODULES.pop('Pandas', None)
PYTEST_HEADER_MODULES.pop('h5py', None)
PYTEST_HEADER_MODULES.pop('Scipy', None)
PYTEST_HEADER_MODULES.pop('Matplotlib', None)
from . import __version__
TESTED_VERSIONS['pyvo'] = __version__