HIRES wavelengths and a bit more #1628

Merged
43 commits merged on Aug 13, 2023
Changes from 25 commits

Commits
cb52d46
wip
Apr 24, 2023
f074e6d
Merge remote-tracking branch 'origin/sensfunc_blaze_jwst_lists' into …
Apr 24, 2023
3001b80
Merge remote-tracking branch 'origin/sensfunc_blaze_jwst_lists' into …
profxj May 23, 2023
25a3d36
redo a single order well
profxj May 29, 2023
9761588
fuss
profxj Jun 5, 2023
e039188
Merge remote-tracking branch 'origin/develop' into hires_waves_and_re…
profxj Jul 3, 2023
58871ca
Merge remote-tracking branch 'origin/sensfunc_blaze_jwst_lists' into …
profxj Jul 3, 2023
01c8702
wip
profxj Jul 3, 2023
e738f91
wip!
profxj Jul 3, 2023
85bc73b
semi-successful
profxj Jul 3, 2023
73a91fa
good as it gets for now
profxj Jul 5, 2023
9ef5fa0
wip
profxj Jul 6, 2023
bd38ce1
closer!
profxj Jul 6, 2023
db75b59
id fix (I hope)
profxj Jul 6, 2023
8a82176
mo
profxj Jul 7, 2023
69e32e7
Merge remote-tracking branch 'origin/sensfunc_blaze_jwst_lists' into …
profxj Jul 10, 2023
e9ebbf9
good progress
profxj Jul 10, 2023
7463cdd
polishing
profxj Jul 10, 2023
d8e314f
docs
profxj Jul 10, 2023
e1327b1
2d
profxj Jul 13, 2023
71f2cb3
nires fixes
profxj Jul 13, 2023
ba74766
Merge remote-tracking branch 'origin/sensfunc_blaze_jwst_lists' into …
profxj Jul 13, 2023
5c2ae30
debuggin
profxj Jul 13, 2023
97627dd
debuggin continues
profxj Jul 13, 2023
d50b53d
Merge remote-tracking branch 'origin/sensfunc_blaze_jwst_lists' into …
profxj Jul 20, 2023
35c3d3a
PR edits
profxj Jul 20, 2023
0939c93
fix
profxj Jul 21, 2023
c11a0b5
the fix is in
profxj Jul 21, 2023
a3fd75f
mo
profxj Jul 21, 2023
569a736
unknowns bug
profxj Jul 21, 2023
336f3a3
Merge remote-tracking branch 'origin/sensfunc_blaze_jwst_lists' into …
profxj Jul 21, 2023
c8479ad
xshooter doc
profxj Jul 24, 2023
969a930
Merge remote-tracking branch 'origin/sensfunc_blaze_jwst_lists' into …
profxj Jul 24, 2023
3d40489
bump up rms
profxj Jul 24, 2023
2db1699
identify fix
profxj Jul 24, 2023
1e513ef
Merge remote-tracking branch 'origin/develop' into hires_waves_and_re…
profxj Aug 3, 2023
74cd1a0
edits for PR
profxj Aug 4, 2023
f8a0ff8
RC comments
profxj Aug 6, 2023
bdc4296
Merge remote-tracking branch 'origin/develop' into hires_waves_and_re…
profxj Aug 6, 2023
ea55d54
docs
profxj Aug 11, 2023
3198a88
load_object
profxj Aug 11, 2023
c1eb574
Merge remote-tracking branch 'origin/develop' into hires_waves_and_re…
profxj Aug 11, 2023
f3fcde3
Merge remote-tracking branch 'origin/develop' into hires_waves_and_re…
profxj Aug 13, 2023
28 changes: 28 additions & 0 deletions doc/calibrations/wave_calib.rst
@@ -259,6 +259,34 @@ observations, long-slit observations where wavelengths
vary (*e.g.*, grating tilts). We are likely to implement
this for echelle observations (*e.g.*, HIRES).

.. _wvcalib-echelle:

Echelle Spectrographs
=====================

Echelle spectrographs are a special case for wavelength
solutions, primarily because the orders follow the
grating equation.

In general, the approach is:

1. Identify the arc lines in each order
2. Fit the arc lines in each order to a polynomial, individually
3. Fit a 2D solution to the lines using the order number as a basis
4. Reject orders where the RMS of the fit exceeds ``rms_threshold``
5. Attempt to recover the missing orders using the 2D fit and
a higher RMS threshold
6. Refit the 2D solution

One should always inspect the outputs, especially the 2D solution
(global and orders). One may then modify the ``rms_threshold``
and/or hand-fit a few of the orders to improve the solution.
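
For example, a minimal PypeIt file snippet that loosens the RMS rejection for
the wavelength fits could look like the following. This is only an illustrative
sketch; the value shown is a placeholder and should be tuned to your data:

.. code-block:: ini

    [calibrations]
        [[wavelengths]]
            rms_threshold = 0.4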

For echelle spectrographs with multiple detectors *per* camera
that are mosaiced (e.g., Keck/HIRES), we fit the 2D solutions on a *per*-detector
basis. This is because we have found the mosaic solutions to be
too difficult to make sufficiently accurate.

.. _wvcalib-byhand:

By-Hand Approach
18 changes: 18 additions & 0 deletions doc/spectrographs/keck_hires.rst
@@ -0,0 +1,18 @@
==========
Keck HIRES
==========

Overview
========

This file summarizes several instrument-specific settings related to the Keck/HIRES spectrograph.


Wavelengths
===========

See :ref:`wvcalib-echelle` for details on the wavelength calibration.

We also note that Order 45 is frequently flagged
as bad in the wavelength solution. This is due, in part,
to very bright ThAr line contamination.
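
If this order matters for your science, one option is to force PypeIt to redo
the wavelength solution for just that order via the ``redo_slits`` parameter,
possibly together with a more forgiving ``rms_threshold``. The snippet below is
only an illustrative sketch (it assumes ``redo_slits`` is keyed by echelle order
number, as in ``pypeit.core.wavecal.autoid.echelle_wvcalib``, and the threshold
value is a placeholder):

.. code-block:: ini

    [calibrations]
        [[wavelengths]]
            redo_slits = 45,
            rms_threshold = 0.5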
2 changes: 1 addition & 1 deletion pypeit/calibrations.py
@@ -837,7 +837,7 @@ def get_wv_calib(self):
self.wv_calib.chk_synced(self.slits)
self.slits.mask_wvcalib(self.wv_calib)
# Return
if self.par['wavelengths']['redo_slit'] is None:
if self.par['wavelengths']['redo_slits'] is None:
return self.wv_calib

# Determine lamp list to use for wavecalib
2 changes: 0 additions & 2 deletions pypeit/core/arc.py
@@ -18,8 +18,6 @@

from pypeit import msgs
from pypeit import utils
from pypeit.core.wavecal import wvutils
from pypeit.core.wavecal import wv_fitting
from pypeit.core import fitting
from IPython import embed

186 changes: 164 additions & 22 deletions pypeit/core/wavecal/autoid.py
@@ -376,12 +376,11 @@ def match_qa(arc_spec, tcent, line_list, IDs, scores, outfile = None, title=None
return





def reidentify(spec, spec_arxiv_in, wave_soln_arxiv_in, line_list, nreid_min, det_arxiv=None, detections=None, cc_thresh=0.8,cc_local_thresh = 0.8,
match_toler=2.0, nlocal_cc=11, nonlinear_counts=1e10,sigdetect=5.0,fwhm=4.0,
debug_xcorr=False, debug_reid=False, debug_peaks = False):
def reidentify(spec, spec_arxiv_in, wave_soln_arxiv_in, line_list,
nreid_min, det_arxiv=None,
detections=None, cc_thresh=0.8, cc_local_thresh=0.8,
match_toler=2.0, nlocal_cc=11, nonlinear_counts=1e10,
sigdetect=5.0, fwhm=4.0, debug_xcorr=False, debug_reid=False, debug_peaks = False):
""" Determine a wavelength solution for a set of spectra based on archival wavelength solutions

Parameters
@@ -445,6 +444,12 @@ def reidentify(spec, spec_arxiv_in, wave_soln_arxiv_in, line_list, nreid_min, de
debug_reid: bool, default = False
Show plots useful for debugging the line reidentification

sigdetect: float, default = 5.0
Threshold for detecting arc lines

fwhm: float, default = 4.0
Full width at half maximum for the arc lines

Returns
-------
(detections, spec_cont_sub, patt_dict)
@@ -534,6 +539,7 @@ def reidentify(spec, spec_arxiv_in, wave_soln_arxiv_in, line_list, nreid_min, de

wvc_arxiv = np.zeros(narxiv, dtype=float)
disp_arxiv = np.zeros(narxiv, dtype=float)

# Determine the central wavelength and dispersion of wavelength arxiv
for iarxiv in range(narxiv):
wvc_arxiv[iarxiv] = wave_soln_arxiv[nspec//2, iarxiv]
@@ -646,9 +652,10 @@ def reidentify(spec, spec_arxiv_in, wave_soln_arxiv_in, line_list, nreid_min, de
patt_dict_slit['sigdetect'] = sigdetect
return detections, spec_cont_sub, patt_dict_slit


# Finalize the best guess of each line
patt_dict_slit = patterns.solve_xcorr(detections, wvdata, det_indx, line_indx, line_cc,nreid_min=nreid_min,cc_local_thresh=cc_local_thresh)
patt_dict_slit = patterns.solve_xcorr(
detections, wvdata, det_indx, line_indx, line_cc,
nreid_min=nreid_min,cc_local_thresh=cc_local_thresh)
patt_dict_slit['sign'] = sign # This is not used anywhere
patt_dict_slit['bwv'] = np.median(wcen[wcen != 0.0])
patt_dict_slit['bdisp'] = np.median(disp[disp != 0.0])
@@ -687,6 +694,135 @@ def reidentify(spec, spec_arxiv_in, wave_soln_arxiv_in, line_list, nreid_min, de
return detections, spec_cont_sub, patt_dict_slit


def match_to_arxiv(lamps:list, spec:np.ndarray, wv_guess:np.ndarray,
spec_arxiv:np.ndarray, wave_arxiv:np.ndarray, nreid_min:int,
match_toler=2.0, nonlinear_counts=1e10, sigdetect=5.0, fwhm=4.0,
debug_peaks:bool=False, use_unknowns:bool=False):
""" Algorithm to match an input arc spectrum to an archival arc spectrum
using a set wavelength guess for the input

Used only for missing orders of echelle spectrographs (so far)

Args:
lamps (list): List of lamps used in the arc
spec (np.ndarray): Spectrum to match
wv_guess (np.ndarray): Wavelength guess for the input spectrum
spec_arxiv (np.ndarray): Spectrum to match to
wave_arxiv (np.ndarray): Wavelength solution for the archival spectrum
nreid_min (int):
Minimum number of times that a given candidate reidentified line must be properly matched with a line in the arxiv
to be considered a good reidentification. If there is a lot of duplication in the arxiv of the spectra in question
(i.e. multislit) set this to a number like 2-4. For echelle this depends on the number of solutions in the arxiv.
For fixed format echelle (ESI, X-SHOOTER, NIRES) set this 1. For an echelle with a tiltable grating, it will depend
on the number of solutions in the arxiv.
match_toler (float, optional): Defaults to 2.0.
Matching tolerance in pixels for a line reidentification. A good line match must match within this tolerance to
the shifted and stretched archive spectrum, and the archive wavelength solution at this match must be within
match_toler dispersion elements from the line in the line list.
nonlinear_counts (float, optional): Defaults to 1e10.
For arc line detection: Arc lines above this saturation
threshold are not used in wavelength solution fits because
they cannot be accurately centroided
sigdetect (float, optional): Defaults to 5.0.
Threshold for detecting arc lines
fwhm (float, optional): Defaults to 4.0.
Full width at half maximum for the arc lines
debug_peaks (bool, optional): Defaults to False.
If True, show diagnostic plots of the arc-line (peak) detection.
use_unknowns (bool, optional): Defaults to False.
If True, use the unknowns in the solution (not recommended)

Returns:
tuple: (tcent, spec_cont_sub, patt_dict_slit, tot_line_list), or None if no
arc lines are detected in either the input or the archival spectrum.
"""

# TODO -- The next 10 lines of code are duplicated from echelle_wvcalib
# Load the line lists
if 'ThAr' in lamps:
line_lists_all = waveio.load_line_lists(lamps)
line_lists = line_lists_all[np.where(line_lists_all['ion'] != 'UNKNWN')]
unknwns = line_lists_all[np.where(line_lists_all['ion'] == 'UNKNWN')]
else:
line_lists = waveio.load_line_lists(lamps)
unknwns = waveio.load_unknown_list(lamps)

tot_line_list = astropy.table.vstack([line_lists, unknwns]) if use_unknowns else line_lists

# Generate the wavelengths from the line list and sort
wvdata = np.array(tot_line_list['wave'].data) # Removes mask if any
wvdata.sort()

# Search for lines in the input arc
tcent, ecent, cut_tcent, icut, spec_cont_sub = wvutils.arc_lines_from_spec(
spec, sigdetect=sigdetect,
nonlinear_counts=nonlinear_counts,
fwhm=fwhm, debug = debug_peaks)
# If there are no lines in the input arc, return
if tcent.size == 0:
return None

# Search for lines in the arxiv arc
tcent_arxiv, ecent_arxiv, cut_tcent_arxiv, icut_arxiv, spec_cont_sub_now = wvutils.arc_lines_from_spec(
spec_arxiv, sigdetect=sigdetect,
nonlinear_counts=nonlinear_counts, fwhm = fwhm, debug = debug_peaks)
# If there are no lines in the arxiv arc, return
if tcent_arxiv.size == 0:
return None

# Interpolate the input wavelengths
fwv_guess = scipy.interpolate.interp1d(np.arange(len(wv_guess)), wv_guess,
kind='cubic', bounds_error=False,
fill_value='extrapolate')
# Interpolate the arxiv both ways (pixel -> wavelength and wavelength -> pixel)
fpix_arxiv = scipy.interpolate.interp1d(wave_arxiv, np.arange(len(wave_arxiv)),
kind='cubic', bounds_error=False,
fill_value='extrapolate')
fwv_arxiv = scipy.interpolate.interp1d(np.arange(len(wave_arxiv)), wave_arxiv,
kind='cubic', bounds_error=False,
fill_value='extrapolate')
# Find the wavelengths of the input arc lines and then the pixels
wv_cent = fwv_guess(tcent)
pix_arxiv = fpix_arxiv(wv_cent)

# Other bits
wvc_arxiv = wave_arxiv[wave_arxiv.size//2]
igood = wave_arxiv > 1.0
disp_arxiv = np.median(wave_arxiv[igood] - np.roll(wave_arxiv[igood], 1))

line_indx = np.array([], dtype=int)
det_indx = np.array([], dtype=int)
line_cc = np.array([], dtype=float)
#line_iarxiv = np.array([], dtype=int)

# Match with tolerance
for ss, ipix_arxiv in enumerate(pix_arxiv):
pdiff = np.abs(ipix_arxiv - tcent_arxiv)
bstpx = np.argmin(pdiff)
# If a match is found within 2 pixels, consider this a successful match
if pdiff[bstpx] < match_toler:
# Using the arxiv arc wavelength solution, search for the nearest line in the line list
bstwv = np.abs(wvdata - fwv_arxiv(tcent_arxiv[bstpx]))
# This is a good wavelength match if it is within match_toler dispersion elements
if bstwv[np.argmin(bstwv)] < match_toler*disp_arxiv:
line_indx = np.append(line_indx, np.argmin(bstwv)) # index in the line list array wvdata of this match
det_indx = np.append(det_indx, ss) # index of this line in the detected line array detections
#line_iarxiv = np.append(line_iarxiv,iarxiv)
line_cc = np.append(line_cc,1.) # Fakery

# Initialise the patterns dictionary, sigdetect not used anywhere
if (len(np.unique(line_indx)) < 3):
patt_dict_slit = patterns.empty_patt_dict(pix_arxiv.size)
patt_dict_slit['sigdetect'] = sigdetect
else:
# Finalize the best guess of each line
patt_dict_slit = patterns.solve_xcorr(
tcent, wvdata, det_indx, line_indx, line_cc,
nreid_min=nreid_min,cc_local_thresh=-1)
patt_dict_slit['bwv'] = wvc_arxiv
patt_dict_slit['bdisp'] = disp_arxiv
patt_dict_slit['sigdetect'] = sigdetect

return tcent, spec_cont_sub, patt_dict_slit, tot_line_list

def map_fwhm(image, gpm, slits_left, slits_right, slitmask, npixel=None, nsample=None, sigdetect=10., specord=1,
spatord=0, fwhm=5., box_rad=3.0, slit_bpm=None):
"""
@@ -1044,7 +1180,7 @@ def echelle_wvcalib(spec, orders, spec_arxiv, wave_arxiv, lamps, par,
ok_mask=None, use_unknowns=True, debug_all=False,
debug_peaks=False, debug_xcorr=False, debug_reid=False,
debug_fits=False, nonlinear_counts=1e10,
redo_slit:int=None):
redo_slits:list=None):
r"""
Algorithm to wavelength calibrate echelle data based on a predicted or archived wavelength solution

@@ -1094,9 +1230,9 @@ def echelle_wvcalib(spec, orders, spec_arxiv, wave_arxiv, lamps, par,
For arc line detection: Arc lines above this saturation
threshold are not used in wavelength solution fits because
they cannot be accurately centroided
redo_slit: int, optional
redo_slits: list, optional
If provided, only perform the wavelength calibration for the
given slit.
given slit(s).

Returns
-------
@@ -1149,7 +1285,7 @@ def echelle_wvcalib(spec, orders, spec_arxiv, wave_arxiv, lamps, par,
bad_orders = np.array([], dtype=int)
# Reidentify each slit, and perform a fit
for iord in range(norders):
if redo_slit is not None and orders[iord] != redo_slit:
if redo_slits is not None and orders[iord] not in redo_slits:
continue
# ToDO should we still be populating wave_calib with an empty dict here?
if iord not in ok_mask:
@@ -1169,18 +1305,24 @@ def echelle_wvcalib(spec, orders, spec_arxiv, wave_arxiv, lamps, par,
debug_peaks=(debug_peaks or debug_all),
debug_xcorr=(debug_xcorr or debug_all),
debug_reid=(debug_reid or debug_all))
# Perform the fit
if debug_fits or debug_all:
embed(header='1115 of autoid')
# Check if an acceptable reidentification solution was found
if not all_patt_dict[str(iord)]['acceptable']:
wv_calib[str(iord)] = None
bad_orders = np.append(bad_orders, iord)
continue

# Perform the fit
n_final = wvutils.parse_param(par, 'n_final', iord)
final_fit = wv_fitting.fit_slit(spec_cont_sub[:, iord], all_patt_dict[str(iord)], detections[str(iord)],
tot_line_list, match_toler=par['match_toler'], func=par['func'],
n_first=par['n_first'],
sigrej_first=par['sigrej_first'], n_final=n_final, sigrej_final=par['sigrej_final'])
final_fit = wv_fitting.fit_slit(
spec_cont_sub[:, iord], all_patt_dict[str(iord)],
detections[str(iord)], tot_line_list,
match_toler=par['match_toler'],
func=par['func'], n_first=par['n_first'],
sigrej_first=par['sigrej_first'],
n_final=n_final,
sigrej_final=par['sigrej_final'])

# Did the fit succeed?
if final_fit is None:
@@ -1206,14 +1348,14 @@ def echelle_wvcalib(spec, orders, spec_arxiv, wave_arxiv, lamps, par,
# Print the final report of all lines
report_final(norders, all_patt_dict, detections,
wv_calib, ok_mask, bad_orders,
redo_slit=redo_slit, orders=orders)
redo_slits=redo_slits, orders=orders)

return all_patt_dict, wv_calib


def report_final(nslits, all_patt_dict, detections,
wv_calib, ok_mask, bad_slits,
redo_slit:int=None,
redo_slits:list=None,
orders:np.ndarray=None):
"""
Print out the final report for wavelength calibration
@@ -1231,8 +1373,8 @@ def report_final(nslits, all_patt_dict, detections,
Mask of indices of good slits
bad_slits (ndarray, bool):
List of slits that are bad
redo_slit (int, optional):
Slit or order that was redone.
redo_slits (list, optional):
Report on only these slits
orders (ndarray, optional):
Array of echelle orders to be printed out during the report.
"""
@@ -1244,7 +1386,7 @@ def report_final(nslits, all_patt_dict, detections,
badmsg += f' Order: {orders[slit]}' + msgs.newline()
badmsg += ' Wavelength calibration not performed!'
# Redo?
if redo_slit is not None and orders[slit] != redo_slit:
if redo_slits is not None and orders[slit] not in redo_slits:
continue
if slit not in ok_mask or slit in bad_slits or all_patt_dict[str(slit)] is None:
msgs.warn(badmsg)
14 changes: 10 additions & 4 deletions pypeit/core/wavecal/echelle.py
@@ -208,7 +208,8 @@ def identify_ech_orders(arcspec, echangle, xdangle, dispname,

# Predict the echelle order coverage and wavelength solution
order_vec_guess, wave_soln_guess_pad, arcspec_guess_pad = predict_ech_arcspec(
angle_fits_file, composite_arc_file, echangle, xdangle, dispname, nspec, norders, pad=pad)
angle_fits_file, composite_arc_file, echangle, xdangle, dispname,
nspec, norders, pad=pad)
norders_guess = order_vec_guess.size

# Since we padded the guess we need to pad the data to the same size
@@ -222,15 +223,20 @@ def identify_ech_orders(arcspec, echangle, xdangle, dispname,
percent_ceil=50.0, sigdetect=5.0, sig_ceil=10.0, fwhm=4.0, debug=debug)

# Finish
x_ordr_shift = shift_cc / nspec
ordr_shift = int(np.round(shift_cc / nspec))
spec_shift = int(np.round(shift_cc - ordr_shift * nspec))
msgs.info('Shift in order number between prediction and reddest order: {:.3f}'.format(ordr_shift + pad))
msgs.info('Shift in order number between prediction and reddest order: {:.3f}'.format(
ordr_shift + pad))
msgs.info('Shift in spectral pixels between prediction and data: {:.3f}'.format(spec_shift))

order_vec = order_vec_guess[-1] - ordr_shift + np.arange(norders)[::-1]
# Assign
#order_vec = order_vec_guess[-1] - ordr_shift + np.arange(norders)[::-1]
order_vec = order_vec_guess[0] + ordr_shift - np.arange(norders)
ind = np.isin(order_vec_guess, order_vec, assume_unique=True)

if debug:
embed(header='identify_ech_orders 232 of echelle.py')

# Return
return order_vec, wave_soln_guess_pad[:, ind], arcspec_guess_pad[:, ind]
