Hubble Source Catalog SWEEPS Proper Motion Notebook#

2019 - 2022, Steve Lubow, Rick White, Trenton McKinney#

A new MAST interface supports queries to the current and previous versions of the Hubble Source Catalog. It allows searches of the summary table (with multi-filter mean photometry) and the detailed table (with all the multi-epoch measurements). It also has an associated API, which is used in this notebook.

The web-based user interface to the HSC does not currently include access to the new proper motions available for the SWEEPS field in version 3.1 of the Hubble Source Catalog. However those tables are accessible via the API. This notebook shows how to use them.

This notebook is similar to the previously released version that uses CasJobs rather than the new API. The CasJobs interface is more powerful and flexible, but the API is simpler for users who do not already have CasJobs experience. Currently the API does not include easy access to the colors and magnitudes of the SWEEPS objects, but they will be added soon.

Additional information is available on the SWEEPS Proper Motions help page.

A notebook that provides a quick introduction to the HSC API is also available. Another notebook generates a color-magnitude diagram for the Small Magellanic Cloud in only a couple of minutes.

Instructions:#

  • Complete the initialization steps described below.

  • Run the notebook to completion.

  • Modify and rerun any of the sections listed in the Table of Contents below.

Running the notebook from top to bottom takes about 4 minutes (depending on the speed of your computer).

Table of Contents#

  • Initialization

  • Properties of Full Catalog

    • Sky Coverage

    • Proper Motion Distributions

    • Visit Distribution

    • Time Coverage Distributions

    • Detection Positions

    • Positions for a Sample With Good PMs

  • Science Applications

    • High Proper Motion Objects

    • HLA Cutout Images for Selected Objects

Initialization #

Install Python modules#

  1. This notebook requires the use of Python 3.

  2. Modules can be installed with conda (if you are using the Anaconda distribution of Python) or with pip.

    • If you are using conda, do not use pip to install, update, or remove a module that exists in a conda channel.

    • If a module is not available with conda, then it is okay to install it with pip (see the sketch after this list).
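
A minimal environment-setup sketch is shown below; the package names are inferred from the imports in the next cell, and the exact channels and versions are up to you.

# Environment setup sketch (run in a terminal, not in this notebook):
#   conda install numpy scipy matplotlib astropy pillow requests
#   pip install fastkde      # if fastkde is not available in your conda channels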

import astropy
import time
import sys
import os
import requests
import json
import numpy as np
import matplotlib.pyplot as plt
from matplotlib.colors import LogNorm
from pathlib import Path

## For handling ordinary astropy Tables
from astropy.table import Table
from astropy.io import fits, ascii

from PIL import Image
from io import BytesIO

from fastkde import fastKDE
from scipy.interpolate import RectBivariateSpline
from astropy.modeling import models, fitting

# There are a number of relatively unimportant warnings that 
# show up, so for now, suppress them:
import warnings
warnings.filterwarnings("ignore")

# set width for pprint
astropy.conf.max_width = 150
# set universal matplotlib parameters
plt.rcParams.update({'font.size': 16})

MAST API functions#

  • Execute HSC searches using the MAST catalogs API.

  • Here we define several interrelated functions for retrieving information from the MAST API (a short usage sketch follows the function definitions below).

    • The hsccone(ra, dec, radius [, keywords]) function searches the HSC catalog near a position.

    • The hscsearch() function performs general non-positional queries.

    • The hscmetadata() function returns information about the columns available in a table.

hscapiurl = "https://catalogs.mast.stsci.edu/api/v0.1/hsc"


def hsccone(ra, dec, radius, table="summary", release="v3", format="csv", magtype="magaper2",
            columns=None, baseurl=hscapiurl, verbose=False, **kw):
    """Do a cone search of the HSC catalog
    
    Parameters
    ----------
    ra (float): (degrees) J2000 Right Ascension
    dec (float): (degrees) J2000 Declination
    radius (float): (degrees) Search radius (<= 0.5 degrees)
    table (string): summary, detailed, propermotions, or sourcepositions
    release (string): v3 or v2
    magtype (string): magaper2 or magauto (only applies to summary table)
    format: csv, votable, json
    columns: list of column names to include (None means use defaults)
    baseurl: base URL for the request
    verbose: print info about request
    **kw: other parameters (e.g., 'numimages.gte':2)
    """
    
    data = kw.copy()
    data['ra'] = ra
    data['dec'] = dec
    data['radius'] = radius
    return hscsearch(table=table, release=release, format=format, magtype=magtype,
                     columns=columns, baseurl=baseurl, verbose=verbose, **data)


def hscsearch(table="summary", release="v3", magtype="magaper2", format="csv",
              columns=None, baseurl=hscapiurl, verbose=False, **kw):
    """Do a general search of the HSC catalog (possibly without ra/dec/radius)
    
    Parameters
    ----------
    table (string): summary, detailed, propermotions, or sourcepositions
    release (string): v3 or v2
    magtype (string): magaper2 or magauto (only applies to summary table)
    format: csv, votable, json
    columns: list of column names to include (None means use defaults)
    baseurl: base URL for the request
    verbose: print info about request
    **kw: other parameters (e.g., 'numimages.gte':2). Note that at least one search parameter is required.
    """
    
    data = kw.copy()
    if not data:
        raise ValueError("You must specify some parameters for search")
    if format not in ("csv", "votable", "json"):
        raise ValueError("Bad value for format")
    url = "{}.{}".format(cat2url(table, release, magtype, baseurl=baseurl), format)
    if columns:
        # check that column values are legal
        # create a dictionary to speed this up
        dcols = {}
        for col in hscmetadata(table, release, magtype)['name']:
            dcols[col.lower()] = 1
        badcols = []
        for col in columns:
            if col.lower().strip() not in dcols:
                badcols.append(col)
        if badcols:
            raise ValueError(f"Some columns not found in table: {', '.join(badcols)}")
        # two different ways to specify a list of column values in the API
        # data['columns'] = columns
        data['columns'] = f"[{','.join(columns)}]"

    # either get or post works
    # r = requests.post(url, data=data)
    r = requests.get(url, params=data)

    if verbose:
        print(r.url)
    r.raise_for_status()
    if format == "json":
        return r.json()
    else:
        return r.text


def hscmetadata(table="summary", release="v3", magtype="magaper2", baseurl=hscapiurl):
    """Return metadata for the specified catalog and table
    
    Parameters
    ----------
    table (string): summary, detailed, propermotions, or sourcepositions
    release (string): v3 or v2
    magtype (string): magaper2 or magauto (only applies to summary table)
    baseurl: base URL for the request
    
    Returns an astropy table with columns name, type, description
    """
    url = "{}/metadata".format(cat2url(table, release, magtype, baseurl=baseurl))
    r = requests.get(url)
    r.raise_for_status()
    v = r.json()
    # convert to astropy table
    tab = Table(rows=[(x['name'], x['type'], x['description']) for x in v],
                names=('name', 'type', 'description'))
    return tab


def cat2url(table="summary", release="v3", magtype="magaper2", baseurl=hscapiurl):
    """Return URL for the specified catalog and table
    
    Parameters
    ----------
    table (string): summary, detailed, propermotions, or sourcepositions
    release (string): v3 or v2
    magtype (string): magaper2 or magauto (only applies to summary table)
    baseurl: base URL for the request
    
    Returns a string with the base URL for this request
    """
    checklegal(table, release, magtype)
    if table == "summary":
        url = f"{baseurl}/{release}/{table}/{magtype}"
    else:
        url = f"{baseurl}/{release}/{table}"
    return url


def checklegal(table, release, magtype):
    """Checks if this combination of table, release and magtype is acceptable
    
    Raises a ValueError exception if there is problem
    """
    
    releaselist = ("v2", "v3")
    if release not in releaselist:
        raise ValueError(f"Bad value for release (must be one of {', '.join(releaselist)})")
    if release == "v2":
        tablelist = ("summary", "detailed")
    else:
        tablelist = ("summary", "detailed", "propermotions", "sourcepositions")
    if table not in tablelist:
        raise ValueError(f"Bad value for table (for {release} must be one of {', '.join(tablelist)})")
    if table == "summary":
        magtypelist = ("magaper2", "magauto")
        if magtype not in magtypelist:
            raise ValueError(f"Bad value for magtype (must be one of {', '.join(magtypelist)})")

Use metadata query to get information on available columns#

This query works for any of the tables in the API (summary, detailed, propermotions, sourcepositions).

meta = hscmetadata("propermotions")
print(' '.join(meta['name']))
meta[:5]
objID numSources raMean decMean lonMean latMean raMeanErr decMeanErr lonMeanErr latMeanErr pmRA pmDec pmLon pmLat pmRAErr pmDecErr pmLonErr pmLatErr pmRADev pmDecDev pmLonDev pmLatDev epochStart epochEnd epochMean DSigma NumFilters NumVisits NumImages CI CI_Sigma KronRadius KronRadius_Sigma HTMID X Y Z
Table length=5

name        type   description
----------  -----  -------------------------
objID       long   objID_descriptionTBD
numSources  int    numSources_descriptionTBD
raMean      float  raMean_descriptionTBD
decMean     float  decMean_descriptionTBD
lonMean     float  lonMean_descriptionTBD
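
The same metadata call works for the other tables; for example (a quick sketch), the sourcepositions table used later in this notebook:

# column metadata for the sourcepositions table (same query, different table name)
meta_pos = hscmetadata("sourcepositions")
print(' '.join(meta_pos['name']))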

Retrieve data on selected SWEEPS objects#

This makes a single large request to the HSC search interface to get the contents of the proper motions table. Despite its large size (460K rows), the query is relatively efficient: it takes about 25 seconds to retrieve the results from the server and then another 20 seconds to convert them to an astropy table. The speed of the table conversion will depend on your computer.

Note that the query restricts the sample to objects detected in at least 2 filters, with at least 20 images total spread over at least 10 different visits. These constraints can be modified depending on your science goals.
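
For example, a stricter selection (an illustrative sketch with hypothetical thresholds, not used below) might require at least 40 images spread over at least 20 visits:

# hypothetical stricter constraints: >1 filter, >=20 visits, >=40 images
constraints_strict = {'NumFilters.gt': 1, 'NumVisits.gt': 19, 'NumImages.gt': 39}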

columns = """ObjID,raMean,decMean,raMeanErr,decMeanErr,NumFilters,NumVisits,
    pmLat,pmLon,pmLatErr,pmLonErr,pmLatDev,pmLonDev,epochMean,epochStart,epochEnd""".split(",")
columns = [x.strip() for x in columns]
columns = [x for x in columns if x and not x.startswith('#')]

# magnitude columns (MagMed, n, MagMAD) are not available via the current API
# (see the introduction above)

constraints = {'NumFilters.gt': 1, 'NumVisits.gt': 9, 'NumImages.gt': 19}

# note the pagesize parameter, which allows retrieving very large results
# the default pagesize is 50000 rows

t0 = time.time()
results = hscsearch(table="propermotions", release='v3', columns=columns, verbose=True, pagesize=500000, **constraints)
print("{:.1f} s: retrieved data".format(time.time()-t0))
tab = ascii.read(results)
print(f"{(time.time()-t0):.1f} s: converted to astropy table")

# change some column names for consistency with the CasJobs version of this notebook
tab.rename_column("raMean", "RA")
tab.rename_column("decMean", "Dec")
tab.rename_column("raMeanErr", "RAerr")
tab.rename_column("decMeanErr", "Decerr")
tab.rename_column("pmLat", "bpm")
tab.rename_column("pmLon", "lpm")
tab.rename_column("pmLatErr", "bpmerr")
tab.rename_column("pmLonErr", "lpmerr")

# compute some additional columns

tab['pmdev'] = np.sqrt(tab['pmLonDev']**2+tab['pmLatDev']**2)
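# epochs are Modified Julian Dates; MJD 47892 is 1990 Jan 1, so the lines below convert epochs to decimal years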
tab['yr'] = (tab['epochMean'] - 47892)/365.25+1990
tab['dT'] = (tab['epochEnd']-tab['epochStart'])/365.25
tab['yrStart'] = (tab['epochStart'] - 47892)/365.25+1990
tab['yrEnd'] = (tab['epochEnd'] - 47892)/365.25+1990

# delete some columns that are not needed after the computations
del tab['pmLonDev'], tab['pmLatDev'], tab['epochEnd'], tab['epochStart'], tab['epochMean']

tab
https://catalogs.mast.stsci.edu/api/v0.1/hsc/v3/propermotions.csv?pagesize=500000&NumFilters.gt=1&NumVisits.gt=9&NumImages.gt=19&columns=%5BObjID%2CraMean%2CdecMean%2CraMeanErr%2CdecMeanErr%2CNumFilters%2CNumVisits%2CpmLat%2CpmLon%2CpmLatErr%2CpmLonErr%2CpmLatDev%2CpmLonDev%2CepochMean%2CepochStart%2CepochEnd%5D
11.2 s: retrieved data
20.4 s: converted to astropy table
Table length=462899
ObjID  RA  Dec  RAerr  Decerr  NumFilters  NumVisits  bpm  lpm  bpmerr  lpmerr  pmdev  yr  dT  yrStart  yrEnd
[preview rows of the 462,899-row proper motions table not reproduced here]

Properties of Full Catalog #

Sky Coverage #

fig, ax = plt.subplots(figsize=(12, 10))
ax.scatter('RA', 'Dec', data=tab, s=1, alpha=0.1)
ax.set(xlabel='RA', ylabel='Dec', title=f'{len(tab)} stars in SWEEPS')
ax.invert_xaxis()

Proper Motion Histograms #

Proper motion histograms for lon and lat#

bins = 0.2
hrange = (-20, 20)
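# number of fixed-width histogram bins (width ~ `bins`) covering hrange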
bincount = int((hrange[1]-hrange[0])/bins + 0.5) + 1

fig, ax = plt.subplots(figsize=(12, 10))
for col, label in zip(['lpm', 'bpm'], ['Longitude', 'Latitude']):
    ax.hist(col, data=tab, range=hrange, bins=bincount, label=label, histtype='step', linewidth=2)
ax.set(xlabel='Proper motion [mas/yr]', ylabel=f'Number [in {bins:.2} mas bins]', title=f'{len(tab):,} stars in SWEEPS')
_ = ax.legend()

Proper motion error cumulative histogram for lon and lat#

bins = 0.01
hrange = (0, 2)
bincount = int((hrange[1]-hrange[0])/bins + 0.5) + 1

fig, ax = plt.subplots(figsize=(12, 10))
for col, label in zip(['lpmerr', 'bpmerr'], ['Longitude Error', 'Latitude Error']):
    ax.hist(col, data=tab, range=hrange, bins=bincount, label=label, histtype='step', cumulative=1, linewidth=2)
ax.set(xlabel='Proper motion error [mas/yr]', ylabel=f'Cumulative number [in {bins:0.2} mas bins]', title=f'{len(tab):,} stars in SWEEPS')
_ = ax.legend(loc='upper left')

Proper motion error log histogram for lon and lat#

bins = 0.01
hrange = (0, 6)
bincount = int((hrange[1]-hrange[0])/bins + 0.5) + 1

fig, ax = plt.subplots(figsize=(12, 10))
for col, label in zip(['lpmerr', 'bpmerr'], ['Longitude Error', 'Latitude Error']):
    ax.hist(col, data=tab, range=hrange, bins=bincount, label=label, histtype='step', linewidth=2)
ax.set(xlabel='Proper motion error [mas/yr]', ylabel=f'Number [in {bins:0.2} mas bins]', title=f'{len(tab):,} stars in SWEEPS', yscale='log')
_ = ax.legend(loc='upper right')

Proper motion error as a function of dT#

Exclude objects with dT near zero and, to improve the plotting, add a bit of random noise to spread out the quantized time values.

# restrict to sources with dT > 1 year
dtmin = 1.0
w = np.where(tab['dT'] > dtmin)[0]
if ('rw' not in locals()) or len(rw) != len(w):
    rw = np.random.random(len(w))
x = np.array(tab['dT'][w]) + 0.5*(rw-0.5)
y = np.log(np.array(tab['lpmerr'][w]))

# Calculate the point density
t0 = time.time()
myPDF, axes = fastKDE.pdf(x, y, numPoints=2**9+1)
print(f"kde took {(time.time()-t0):.1f} sec")

# interpolate to get z values at points
finterp = RectBivariateSpline(axes[1], axes[0], myPDF)
z = finterp(y, x, grid=False)

# Sort the points by density, so that the densest points are plotted last
idx = z.argsort()
xs, ys, zs = x[idx], y[idx], z[idx]

# select a random subset of points in the most crowded regions to speed up plotting
wran = np.where(np.random.random(len(zs))*zs < 0.05)[0]
print(f"Plotting {len(wran)} of {len(zs)} points")
xs = xs[wran]
ys = ys[wran]
zs = zs[wran]
kde took 1.6 sec
Plotting 190206 of 461199 points
fig, ax = plt.subplots(figsize=(12, 10))
sc = ax.scatter(xs, np.exp(ys), c=zs, s=2, edgecolors='none', cmap='plasma')
ax.set(xlabel='Date range [yr]', ylabel='Proper motion error [mas/yr]',
       title=f'{len(tab):,} stars in SWEEPS\nLongitude PM error', yscale='log')
_ = fig.colorbar(sc, ax=ax)

Proper motion error log histogram for lon and lat#

Divide the sample into points with at most 6 years of data and points with more than 6 years of data.

bins = 0.01
hrange = (0, 6)
bincount = int((hrange[1]-hrange[0])/bins + 0.5) + 1
tsplit = 6

# select the data to plot
mask = tab['dT'] <= tsplit
data1 = tab[mask]
data2 = tab[~mask]
fig, (ax1, ax2) = plt.subplots(nrows=2, ncols=1, figsize=(12, 12), sharey=True)

for ax, data in zip([ax1, ax2], [data1, data2]):
    for col, label in zip(['lpmerr', 'bpmerr'], ['Longitude Error', 'Latitude Error']):
        ax.hist(col, data=data, range=hrange, bins=bincount, label=label, histtype='step', linewidth=2)
    
ax1.set(xlabel='Proper motion error [mas/yr]', ylabel=f'Number [in {bins:0.2} mas bins]',
        title=f'{len(data1):,} stars in SWEEPS with dT < {tsplit} yrs', yscale='log')
ax2.set(xlabel='Proper motion error [mas/yr]', ylabel=f'Number [in {bins:0.2} mas bins]',
        title=f'{len(data2):,} stars in SWEEPS with dT > {tsplit} yrs', yscale='log')

ax1.legend()
ax2.legend()

_ = fig.tight_layout()

Number of Visits Histogram #

bins = 1
hrange = (0, 130)
bincount = int((hrange[1]-hrange[0])/bins + 0.5) + 1

fig, ax = plt.subplots(figsize=(12, 10))
ax.hist('NumVisits', data=tab, range=hrange, bins=bincount, label='Number of visits ', histtype='step', linewidth=2)
ax.set(xlabel='Number of visits', ylabel='Number of objects', title=f'{len(tab):,} stars in SWEEPS')
_ = ax.margins(x=0)

Time Histograms #

First plot histogram of observation dates.

bins = 1
hrange = (2000, 2020)
bincount = int((hrange[1]-hrange[0])/bins + 0.5) + 1

fig, ax = plt.subplots(figsize=(12, 10))
ax.hist('yr', data=tab, range=hrange, bins=bincount, label='year ', histtype='step', linewidth=2)
ax.set(xlabel='mean detection epoch (year)', ylabel='Number of objects', title=f'{len(tab):,} stars in SWEEPS')
ax.set_xticks(ticks=range(2000, 2021, 2))
_ = ax.margins(x=0)

Then plot histogram of observation duration for the objects.

bins = 0.25
hrange = (0, 15)
bincount = int((hrange[1]-hrange[0])/bins + 0.5) + 1

fig, ax = plt.subplots(figsize=(12, 10))
ax.hist('dT', data=tab, range=hrange, bins=bincount, label='year ', histtype='step', linewidth=2)
_ = ax.set(xlabel='time span (years)', ylabel='Number of objects', title=f'{len(tab):,} stars in SWEEPS', yscale='log')

Detection Positions #

Define a function to plot the PM fit for an object.

# define function
def positions(Obj):
    """
    input parameter Obj is the value of the ObjID 
    output plots change in (lon, lat) as a function of time
    overplots proper motion fit
    provides ObjID and proper motion information in labels
    """

    # get the measured positions as a function of time
    pos = ascii.read(hscsearch("sourcepositions", columns="dT,dLon,dLat".split(','), objid=Obj))
    pos.sort('dT')
    
    # get the PM fit parameters
    pm = ascii.read(hscsearch("propermotions", columns="pmlon,pmlonerr,pmlat,pmlaterr".split(','), objid=Obj))
    
    lpm = pm['pmlon'][0]
    bpm = pm['pmlat'][0]
    
    # get the intercept for the proper motion fit referenced to the start time
    # time between mean epoch and zero (ref) epoch (years)

    x = pos['dT']
    # xpm = np.linspace(0, max(x), 10)
    xpm = np.array([x.min(), x.max()])
        
    y1 = pos['dLon']
    ypm1 = lpm*xpm
    
    y2 = pos['dLat']
    ypm2 = bpm*xpm

    # plot figure 
    fig, (ax1, ax2) = plt.subplots(nrows=1, ncols=2, figsize=(6, 3), tight_layout=True)
    
    ax1.scatter(x, y1, s=10)
    ax1.plot(xpm, ypm1, '-r')
    
    ax2.scatter(x, y2, s=10)
    ax2.plot(xpm, ypm2, '-r')
    
    ax1.set_xlabel('dT (yrs)', fontsize=10)
    ax1.set_ylabel('dLon (mas)', fontsize=10)
    ax2.set_xlabel('dT (yrs)', fontsize=10)
    ax2.set_ylabel('dLat (mas)', fontsize=10)
    
    fig.suptitle(f'ObjID {Obj}'
                 f'\n{len(x)} detections,  (lpm, bpm) = ({lpm:.1f}, {bpm:.1f}) mas/yr', fontsize=10)
    
    plt.show()
    plt.close()
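
For a single object, the function can be called directly; a minimal example using the first ObjID from the table retrieved above:

# plot detections and the proper-motion fit for one object (illustrative)
positions(tab['ObjID'][0])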

Plot positions for a sample of objects with good proper motions #

This selects objects that are detected in more than 90 visits with a median absolute deviation from the fit of less than 1.5 mas and proper motion error less than 1.0 mas/yr.

n = tab['NumVisits']
dev = tab['pmdev']
objid = tab['ObjID']
lpmerr0 = np.array(tab['lpmerr'])
bpmerr0 = np.array(tab['bpmerr'])
wi = np.where((dev < 1.5) & (n > 90) & (np.sqrt(bpmerr0**2+lpmerr0**2) < 1.0))[0]
print(f"Plotting {len(wi)} objects")
for o in objid[wi]:
    positions(o)
Plotting 21 objects

Science Applications #

High Proper Motion Objects #

Get a list of objects with high, accurately measured proper motions. Proper motions are measured relative to the Galactic center (Sgr A*), whose proper motion is subtracted in the cell below.

lpm_sgra = -6.379 # +- 0.026
bpm_sgra = -0.202 # +- 0.019

lpm0 = np.array(tab['lpm'])
bpm0 = np.array(tab['bpm'])
lpmerr0 = np.array(tab['lpmerr'])
bpmerr0 = np.array(tab['bpmerr'])
pmtot0 = np.sqrt((bpm0-bpm_sgra)**2+(lpm0-lpm_sgra)**2)
pmerr0 = np.sqrt(bpmerr0**2+lpmerr0**2)
dev = tab['pmdev']

# sort samples by decreasing PM
wpmh = np.where((pmtot0 > 15) & (pmerr0 < 1.0) & (dev < 5))[0]
wpmh = wpmh[np.argsort(-pmtot0[wpmh])]

print(f"Plotting {len(wpmh)} objects")
for o in tab["ObjID"][wpmh]:
    positions(o)
Plotting 31 objects

Get HLA cutout images for selected objects #

Get HLA color cutout images for the high-PM objects. The query_hla function gets a table of all the color images that are available at a given position using the f814w+f606w filters. The get_image function reads a single cutout image (as a JPEG color image) and returns a PIL image object.

See the documentation on HLA VO services and the fitscut image cutout service for more information on the web services being used.

def query_hla(ra, dec, size=0.0, imagetype="color", inst="ACS", format="image/jpeg",
              spectral_elt=("f814w", "f606w"), autoscale=95.0, asinh=1, naxis=33):
    # convert a list of filters to a comma-separated string
    if not isinstance(spectral_elt, str):
        spectral_elt = ",".join(spectral_elt)
    siapurl = ("https://hla.stsci.edu/cgi-bin/hlaSIAP.cgi?"
               f"pos={ra},{dec}&size={size}&imagetype={imagetype}&inst={inst}"
               f"&format={format}&spectral_elt={spectral_elt}"
               f"&autoscale={autoscale}&asinh={asinh}"
               f"&naxis={naxis}")
    votable = Table.read(siapurl, format="votable")
    return votable


def get_image(url):
    """Get image from a URL"""
    r = requests.get(url)
    im = Image.open(BytesIO(r.content))
    return im
# display earliest and latest images side-by-side
# top 10 highest PM objects
wsel = wpmh[:10]
nim = len(wsel)
icols = 1        # objects per row
ncols = 2*icols  # two images for each object
nrows = (nim+icols-1)//icols

imsize = 33
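# short crosshair (pixel offsets from the image center) marking the catalog position in each cutout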
xcross = np.array([-1, 1, 0, 0, 0])*2 + imsize/2
ycross = np.array([0, 0, 0, -1, 1])*2 + imsize/2

# selected data from tab
sd = tab[['RA', 'Dec', 'ObjID']][wsel]

# create the figure
fig, axes = plt.subplots(nrows=nrows, ncols=ncols, figsize=(12, (12/ncols)*nrows))

# iterate each observation, and each set of axes for the first and last image
for (ax1, ax2), obj in zip(axes, sd):
    
    # get the image urls and observation datetime
    hlatab = query_hla(obj["RA"], obj["Dec"], naxis=imsize)[['URL', 'StartTime']]
    # sort the data by the observation datetime, and get the first and last observation url
    (url1, time1), (url2, time2) = hlatab[np.argsort(hlatab['StartTime'])][[0, -1]]
    
    # get the images
    im1 = get_image(url1)
    im2 = get_image(url2)
    
    # plot the images
    ax1.imshow(im1, origin="upper")
    ax2.imshow(im2, origin="upper")
    
    # plot the center
    ax1.plot(xcross, ycross, 'g')
    ax2.plot(xcross, ycross, 'g')
    
    # labels and titles
    ax1.set(ylabel=f'ObjID {obj["ObjID"]}', title=time1)
    ax2.set_title(time2)

Look at the entire collection of images for the highest PM object#

i = wpmh[0]

# selected data
sd = tab['ObjID', 'RA', 'Dec', 'bpm', 'lpm', 'yr', 'dT'][i]
display(sd)

imsize = 33
# get the URL and StartTime data
hlatab = query_hla(sd['RA'], sd['Dec'], naxis=imsize)[['URL', 'StartTime']]
# sort the data
hlatab = hlatab[np.argsort(hlatab['StartTime'])]

nim = len(hlatab)
ncols = 8
nrows = (nim+ncols-1)//ncols

xcross = np.array([-1, 1, 0, 0, 0])*2 + imsize/2
ycross = np.array([0, 0, 0, -1, 1])*2 + imsize/2
Row index=193587

ObjID          RA                 Dec                  bpm                 lpm                 yr                 dT
4000711121911  269.7367256594695  -29.209699919117618  -18.43788346257518  -36.80145933087569  2004.194238143401  2.749260607770275
# get the images: takes about 90 seconds for 77 images
images = [get_image(url) for url in hlatab['URL']]
# create the figure
fig, axes = plt.subplots(nrows=nrows, ncols=ncols, figsize=(20, (20/ncols)*nrows))
# flatten the axes for easy iteration and zipping
axes = axes.flat

plt.rcParams.update({"font.size": 11})
for ax, time1, img in zip(axes, hlatab['StartTime'], images):
    # plot image
    ax.imshow(img, origin="upper")
    # plot the center
    ax.plot(xcross, ycross, 'g')
    # set the title
    ax.set_title(time1)
    
# remove the last 3 unused axes
for ax in axes[nim:]:
    ax.remove()

fig.suptitle(f"ObjectID: {sd['ObjID']}\nRA: {sd['RA']:0.2f} Dec: {sd['Dec']:0.2f}\nObservations: {nim}", y=1, fontsize=14)
fig.tight_layout()