
task #15637: Match RA and Dec catalog to X and Y catalog to find WCS

Submitter:  Mohammad Akhlaghi <makhlaghi>
Submitted:  Tue 12 May 2020 01:15:22 AM UTC
   
 
Should Start On:  Mon 11 May 2020 11:00:00 PM UTC
Should be Finished on:  Mon 11 May 2020 11:00:00 PM UTC
Category:  Astrometry
Priority:  5 - Normal
Item Group:  Enhancement
Status:  Done
Privacy:  Public
Assigned to:  ndanzanello
Percent Complete:  100%
Open/Closed:  Closed
Effort:  0.00


Thu 24 Mar 2022 08:50:45 AM UTC, comment #12: 

Since Natali's project on finding the linear WCS translation parameters has been completed, I am closing this task.

task #16136 has been defined for finding the distortions in images.

Mohammad Akhlaghi <makhlaghi>
Group administrator
Mon 23 Aug 2021 12:56:45 AM UTC, comment #11: 

This got solved here.

I also uploaded 'match_overview_complete.png', a new image complementing the old one.


Natáli Anzanello <ndanzanello>
Sun 20 Jun 2021 04:43:10 PM UTC, comment #10: 

It's not 'vertices_after_match_fixed.png'. It's '4stars_vertices_distances.png'. Sorry for that :)

comment #9:

> I think that the matching problem has been solved. Below I explain the steps:
> [...]
> Below is a figure showing these steps ('match_overview.png'). I also uploaded a fixed version of the figure from my last post, because the indices were wrong ('vertices_after_match_fixed.png').

Natáli Anzanello <ndanzanello>
Sun 20 Jun 2021 04:19:43 PM UTC, comment #9: 

I think that the matching problem has been solved. Below I explain the steps:

The first thing we needed to fix was how the vertices are found in each catalog. It is very important that all the vertices are labeled the same way in both catalogs. First, we label A and B as the two most separated vertices. In the query catalog this is just the Euclidean distance between the points, but in the reference catalog we have to use the angular distance between the points to get the same vertices. Previously, the code used the Euclidean distance in the reference catalog too, so the two catalogs could end up with different A and B vertices.
After that, we have to choose the C and D vertices. We first label the two remaining vertices as C and D arbitrarily, then compare the angles ACB and ADB (each taken as the angle below 180 degrees) and label as C the vertex with the smaller angle.
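
For the record, a minimal sketch of this labeling for the query catalog (the reference catalog would use angular distances instead of Euclidean ones; the function names here are hypothetical, not the actual Gnuastro code):

#include <math.h>

/* Angle P-V-Q at vertex V, always within [0,180] degrees. */
static double
angle_at(const double v[2], const double p[2], const double q[2])
{
  double ax=p[0]-v[0], ay=p[1]-v[1];
  double bx=q[0]-v[0], by=q[1]-v[1];
  double c=(ax*bx + ay*by)
           / ( sqrt(ax*ax + ay*ay) * sqrt(bx*bx + by*by) );
  return acos(c) * 180.0/M_PI;
}

/* Fill 'out' with the indices of the A, B, C, D vertices in 'pts'. */
static void
label_vertices(const double pts[4][2], int out[4])
{
  int i, j, a=0, b=1, c, d, tmp;
  double d2, max2=-1.0;

  /* A and B: the most separated pair (Euclidean here). */
  for(i=0; i<4; ++i)
    for(j=i+1; j<4; ++j)
      {
        d2 = (pts[i][0]-pts[j][0])*(pts[i][0]-pts[j][0])
           + (pts[i][1]-pts[j][1])*(pts[i][1]-pts[j][1]);
        if(d2>max2) { max2=d2; a=i; b=j; }
      }

  /* C and D: the two remaining vertices; C is the one with the
     smaller angle at its vertex (ACB vs ADB, both below 180 deg). */
  for(c=0;   c==a || c==b; ++c);
  for(d=c+1; d==a || d==b; ++d);
  if( angle_at(pts[d], pts[a], pts[b])
      < angle_at(pts[c], pts[a], pts[b]) )
    { tmp=c; c=d; d=tmp; }

  out[0]=a; out[1]=b; out[2]=c; out[3]=d;
}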

Now that A, B, C and D are the same vertices whenever we deal with the same quad, we have to compute the hashes. The hashes were being calculated as Cx = (c1-a1)/(b1-a1), where a1, b1 and c1 are the coordinates of A, B and C along the first axis. This creates a problem under rotation: the distance between the points stays the same, but the distance along each individual axis does not! So Cx would differ between the two catalogs, and the same happens for Cy, Dx and Dy.

To solve this, we first transform the celestial coordinates of the reference catalog into projection plane coordinates (TAN projection), using the midpoint of A-B as the coordinates of the native pole.
We then define two new axes (x and y, in which the hashes are calculated) such that the A-B vector lies along the 45 degree line between them. Finally, we project the C-A and D-A vectors onto these axes to get the hashes.
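
In effect this maps A to (0,0) and B to (1,1), so the hashes depend only on the relative positions of the four stars. Below is one way to realize that mapping, a minimal sketch assuming the coordinates are already in the projection plane (again with hypothetical names):

/* Rotation/translation/scale invariant hash of a labeled quad:
   A maps to (0,0) and B to (1,1), so the A-B vector lies on the
   45 degree line of the new axes; C and D are expressed in that
   frame to give the four hash components. */
static void
quad_hash(const double a[2], const double b[2],
          const double c[2], const double d[2], double hash[4])
{
  double bx=b[0]-a[0], by=b[1]-a[1];
  double s=bx*bx + by*by;                 /* |AB|^2                */
  double px, py, ux, uy;

  px=c[0]-a[0];  py=c[1]-a[1];
  ux=(px*bx + py*by)/s;                   /* component along A-B   */
  uy=(py*bx - px*by)/s;                   /* component across A-B  */
  hash[0] = ux - uy;                      /* Cx                    */
  hash[1] = ux + uy;                      /* Cy                    */

  px=d[0]-a[0];  py=d[1]-a[1];
  ux=(px*bx + py*by)/s;
  uy=(py*bx - px*by)/s;
  hash[2] = ux - uy;                      /* Dx                    */
  hash[3] = ux + uy;                      /* Dy                    */
}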

I have tested this with a variety of seeds and rotations and it works. Let me know what you think!

Below is a figure showing these steps ('match_overview.png'). I have also uploaded a fixed version of the figure from my last post, because the indices were wrong ('vertices_after_match_fixed.png').

(file #51589, file #51590)

Natáli Anzanello <ndanzanello>
Thu 17 Jun 2021 01:51:17 PM UTC, comment #8: 

Continuing on the task of finding the matches, two small fixes were made on this fork: https://codeberg.org/ndanzanello/gnuastro/src/branch/astrometry

The first fixes a segmentation fault that occurred when no match was found. The second fixes the vertices returned when a match is found: it was returning the matched vertices in the order generated prior to the geometric approach (the figure 'vertices_after_match.png' shows this).

Now, we're dealing with the problem that the vertices of the quads are not always labeled the same (the figure '4stars_vertices_distances.png' shows this in a setup with 4 stars and a 30 degree rotation).

So far, we have been able to fix this for catalogs without rotation (the figure 'new_dist_4stars.png' shows the result after the fix, and 'old_dist_4stars.png' before it). Note that the hashes are very similar after the fix and the distance between the hashes has gone below 0.01!

But now, the remaining problem is the one shown in the '4stars_vertices_distances.png' figure, i.e., when we have to deal with a rotation. I'm still thinking about this problem and would like to know if you have some suggestions :)


Natáli Anzanello <ndanzanello>
Mon 20 Jul 2020 01:41:46 PM UTC, comment #7: 

While writing the rough outline mentioned before, we found out that HEALPix is only necessary to ensure a homogeneous sampling of quads across the field. That is why we decided to ignore the HEALPix library for now, to avoid an extra dependency.

In most scenarios it is indeed not needed and we can simply build a grid ourselves. But one scenario where HEALPix will be necessary just occurred to me: if the desired field includes the celestial poles. In this case, a simple gridding of the range of RA and Dec will not properly sample the input RA and Dec.

So without HEALPix, we will only have problems if the field includes the celestial poles. For now (in the development phase) this isn't a problem and we can simply use a cartesian grid and progress with the main work. We can add HEALPix as an optional feature to ensure homogeneity (instead of a simple cartesian grid) once all the other steps are complete.

In fact this will be a good feature for Gnuastro: if someone doesn't want to do astrometry near the celestial poles, they don't need to install HEALPix as a dependency of Gnuastro :-). But if HEALPix is present, it will be used (as implemented later).

Sachin, to allow easy optional use of HEALPix later, take the following step: define two functions very similar to HEALPix's API (a rough sketch of both is included after the list):

  • One function to define a "grid" structure over the coordinate range (just find the minimum and maximum in each dimension, and store them with the number of grid boxes in each dimension).


  • One function to take the "grid" structure and the coordinates of a point and return the index of the grid element that the coordinate falls in. By "index of grid element", I mean that if we start counting the grid tiles from the minimum in both dimensions to the maximum, this is the index corresponding to the tile that the given coordinate falls into.


In this way, we can later easily call either our own simple cartesian gridding function or HEALPix (if the user has it).
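
For the record, a minimal sketch of these two functions under the cartesian-grid assumption (the structure layout and names are only suggestions, not a fixed API):

#include <stddef.h>

/* Simple cartesian "grid" over a 2D coordinate range. */
struct grid
{
  double min[2], max[2];    /* Range of the coordinates.           */
  size_t size[2];           /* Number of grid boxes per dimension. */
};

/* Find the minimum and maximum in each dimension and store them
   with the requested number of boxes in each dimension. */
static void
grid_init(struct grid *g, const double *x, const double *y,
          size_t num, size_t nx, size_t ny)
{
  size_t i;
  g->min[0]=g->max[0]=x[0];
  g->min[1]=g->max[1]=y[0];
  for(i=1; i<num; ++i)
    {
      if(x[i] < g->min[0]) g->min[0]=x[i];
      if(x[i] > g->max[0]) g->max[0]=x[i];
      if(y[i] < g->min[1]) g->min[1]=y[i];
      if(y[i] > g->max[1]) g->max[1]=y[i];
    }
  g->size[0]=nx;
  g->size[1]=ny;
}

/* Index of the tile containing (x,y), counting row-major from the
   minimum in both dimensions (assumes max>min in each dimension). */
static size_t
grid_index(const struct grid *g, double x, double y)
{
  size_t ix=(size_t)( (x - g->min[0])
                      / (g->max[0] - g->min[0]) * g->size[0] );
  size_t iy=(size_t)( (y - g->min[1])
                      / (g->max[1] - g->min[1]) * g->size[1] );
  if(ix == g->size[0]) --ix;     /* Point on the maximum edge. */
  if(iy == g->size[1]) --iy;
  return iy * g->size[0] + ix;
}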

Mohammad Akhlaghi <makhlaghi>
Group administrator
Sun 19 Jul 2020 10:20:47 PM UTC, comment #6: 

During our discussion today, we came up with a rough outline of the steps that I am attaching as a simple plain-text file here for the record.

(file #49513)

Mohammad Akhlaghi <makhlaghi>
Group administrator
Wed 17 Jun 2020 03:56:49 AM UTC, comment #5: 

Wonderful review! Thanks Sachin ;-).

About object detection, indeed, that is correct! Gnuastro is founded on the Unix philosophy; I especially like Doug McIlroy's original four principles. An astrometry program shouldn't have to worry about how to do detection, that is another program's job ;-)!

The input to our program should just be an X-Y-magnitude table and a reference RA-Dec-magnitude table, and its output should be the WCSLIB-created WCS keywords with distortions; that is all ;-).

The second point is also interesting! I'd love to learn more about it (if only I had time!). But hopefully later as my schedule opens up a little, I'll dig into your implementation and learn more ;-).

On the third point, today we have ESA's Gaia survey, which is state of the art and much more accurate than the catalogs used in that 2009 paper ;-). It has become the de facto standard for astrometry! That is where I got the reference RA-Dec catalog that you can use below.

Regarding generating a HEALPix grid, fortunately there is standard software for it: HEALPix, which also has a C library. We can safely assume this as a new optional dependency of Gnuastro: if users don't have it, this feature won't be built. So go ahead and learn/use it ;-).

I agree, verification is a little more subjective and we can worry about it when we get an initial fit; it's too early to spend time on it now ;-).

Mohammad Akhlaghi <makhlaghi>
Group administrator
Tue 16 Jun 2020 10:56:19 PM UTC, comment #4: 

I read the paper introducing astrometry.net and also went quickly through its repository on GitHub. One thing that really bugged me was their documentation: it is not complete, and what is written is more or less subtle. Anyway, the paper introduced the idea fairly well, though it was more theory-oriented than software-oriented. Here are the basic steps it employs for matching and WCS calculation:

  • Firstly, all the objects (stars or galaxies) in the files are detected. They use a method based on flux differences and peak/threshold filtering. But NoiseChisel provides an easy and efficient way to do the same and is already at our disposal, so object detection can easily be done :-).


  • Next, they use a quad (a set of 4 stars) to make unique geometric hash codes, which are quite fast and efficient for storage and for matching neighbours. The hash is also invariant to translation, rotation and scaling of the star positions, so it can be computed using only the relative positions of the four stars in any conformal coordinate system.


  • The index/reference catalogue is pre-computed for fast retrieval. For reference catalogues, they use all-sky (or near all-sky) surveys (like USNO-B1, the infrared 2MASS catalogue, and the ultraviolet catalogue from GALEX). They pre-process these catalogues by making a HEALPix grid and choosing the brightest stars in each HEALPix tile to make a large number of geometric hashes. These hashes are then stored in a kd-tree along with the star positions (for verification). The kd-tree provides fast retrieval of the reference hashes in the neighbourhood of each query hash.


  • Finally, a verification step is done, which uses a Bayesian decision model to calculate a threshold for a match to be accepted or rejected.



Now we have detected objects and made catalogues with them. But we need to make a HEALPix grid on our reference catalogue and then create a kd-tree to store the pre-defined hashes. We also need a structure to store the hashes for invariance. Then we'll need to match the hashes of our X-Y catalogues to those of the reference (RA-Dec) catalogue; a rough sketch of that matching step is below. Maybe verification can be done later on, when all of these are done. What is your suggestion?
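
For concreteness, here is a brute-force sketch of the matching step; the kd-tree replaces this linear scan for speed, and all the names are hypothetical:

#include <math.h>
#include <stddef.h>

/* Return the index of the reference hash (4 doubles per quad,
   'nref' quads in 'ref') nearest to the query hash 'q'; the
   distance to it is also returned in 'outdist'. */
static size_t
nearest_hash(const double q[4], const double *ref, size_t nref,
             double *outdist)
{
  size_t i, j, best=0;
  double d, diff, bestd=INFINITY;

  for(i=0; i<nref; ++i)
    {
      d=0.0;
      for(j=0; j<4; ++j)
        { diff = q[j] - ref[4*i + j];  d += diff*diff; }
      if(d<bestd) { bestd=d; best=i; }
    }

  *outdist = sqrt(bestd);
  return best;
}

A query quad whose nearest reference hash lies within some small threshold is then passed on to verification.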

Sachin Kumar Singh <sks_15>
Sun 17 May 2020 02:44:37 AM UTC, comment #3: 

WCSLIB's disp2x() and disx2p() may be very useful for this task.

In particular, after we have found the basic WCS parameters (CRVALs, CRPIXs, CDELTs and PCs) and want to fine-tune/minimize the distortion. GSL has some good general minimization/fitting functions we can use ;-).
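
For example, once matched pairs are in hand, the basic linear parameters themselves can come from a least-squares fit. Here is a minimal sketch with GSL's gsl_multifit_linear, assuming the reference coordinates have already been tangent-plane projected (call it once per world axis to get all six linear parameters); this is only an illustration, not the final implementation:

#include <gsl/gsl_multifit.h>

/* Fit u ~ c0 + c1*x + c2*y for one projected world axis: 'x','y'
   are the pixel coordinates of the matched sources, 'u' the
   projected reference coordinate, 'n' the number of matches. */
static void
fit_linear_wcs(const double *x, const double *y, const double *u,
               size_t n, double coef[3])
{
  size_t i;
  double chisq;
  gsl_matrix *X   = gsl_matrix_alloc(n, 3);
  gsl_matrix *cov = gsl_matrix_alloc(3, 3);
  gsl_vector *b   = gsl_vector_alloc(n);
  gsl_vector *c   = gsl_vector_alloc(3);
  gsl_multifit_linear_workspace *w = gsl_multifit_linear_alloc(n, 3);

  for(i=0; i<n; ++i)
    {
      gsl_matrix_set(X, i, 0, 1.0);      /* constant (offset) term */
      gsl_matrix_set(X, i, 1, x[i]);     /* linear term along x    */
      gsl_matrix_set(X, i, 2, y[i]);     /* linear term along y    */
      gsl_vector_set(b, i, u[i]);
    }
  gsl_multifit_linear(X, b, c, cov, &chisq, w);
  for(i=0; i<3; ++i) coef[i]=gsl_vector_get(c, i);

  gsl_multifit_linear_free(w);
  gsl_matrix_free(X);  gsl_matrix_free(cov);
  gsl_vector_free(b);  gsl_vector_free(c);
}

The recovered offsets and 2x2 matrix roughly correspond to the CRVAL/CRPIX and PC (with CDELT) parameters, and can seed the distortion minimization afterwards.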

Mohammad Akhlaghi <makhlaghi>
Group administrator
Tue 12 May 2020 09:32:12 AM UTC, comment #2: 

I forgot to mention that the data below were taken by the Iran National Observatory (INO) Lens Array (INOLA). In particular, I am very grateful to Hamed Altafi, who took the pictures and shared them with us to test/play with. The results will be applicable to any instrument.

Mohammad Akhlaghi <makhlaghi>
Group administrator
Tue 12 May 2020 02:21:44 AM UTC, comment #1: 

To help in completing this task, I just uploaded some data to play with. A link to each uploaded file is available at the bottom of this comment.

The main datasets are 8 short exposure (5 sec) images of Castor, but each exposure is offset compared to the others (this offset in individual exposures is called dithering in astronomy). Please open these images and actually see how Castor's position in the image changes in each of them.

A catalog of sources (actually "Clumps" in Segment) is generated for each image with NoiseChisel+Segment+MakeCatalog, using the script below (which is also available under that directory). These catalogs are the actual inputs to this task and have a `-x-y.fits' suffix; each contains roughly 1500 clumps. Finally, there is also a reference catalog with the RA and Dec of 1179744 sources near Castor from ESA's Gaia survey.

Once this task is complete, we should be able to have an accurate WCS (including distortions) for each image.

Script to generate a catalog of sources in each image and their X and Y positions.

# Base name of input files.
input="1881731715 3163025249 1887227377 3746244531 2923938133 4209692437 2966790193 595260509"

# Generate the catalog for each image.
for i in $input; do

    # First run NoiseChisel to separate signal from the background.
    astnoisechisel $i.fits --tilesize=50,50 --interpnumngb=21 \
                   --output=$i-nc.fits

    # Run Segment to get a label for each clump, don't bother with
    # detecting objects, they aren't relevant here.
    astsegment $i-nc.fits --onlyclumps --output=$i-seg.fits

    # Generate a catalog with the ID, X, Y, and magnitude (assuming a
    # zeropoint of 0).
    astmkcatalog $i-seg.fits --hdu=CLUMPS --ids --x --y --magnitude \
                 --output=$i-x-y.fits

    # Clean up.
    rm $i-nc.fits $i-seg.fits
done


Reference Catalog with RA and Dec of many sources:
http://akhlaghi.org/data/astrometry/gaia-dr2-near-castor.fits

List of FITS images:
http://akhlaghi.org/data/astrometry/1881731715.fits
http://akhlaghi.org/data/astrometry/1887227377.fits
http://akhlaghi.org/data/astrometry/2923938133.fits
http://akhlaghi.org/data/astrometry/2966790193.fits
http://akhlaghi.org/data/astrometry/3163025249.fits
http://akhlaghi.org/data/astrometry/3746244531.fits
http://akhlaghi.org/data/astrometry/4209692437.fits
http://akhlaghi.org/data/astrometry/595260509.fits

List of X-Y catalogs:
http://akhlaghi.org/data/astrometry/1881731715-x-y.fits
http://akhlaghi.org/data/astrometry/1887227377-x-y.fits
http://akhlaghi.org/data/astrometry/2923938133-x-y.fits
http://akhlaghi.org/data/astrometry/2966790193-x-y.fits
http://akhlaghi.org/data/astrometry/3163025249-x-y.fits
http://akhlaghi.org/data/astrometry/3746244531-x-y.fits
http://akhlaghi.org/data/astrometry/4209692437-x-y.fits
http://akhlaghi.org/data/astrometry/595260509-x-y.fits

Script to generate catalog:
http://akhlaghi.org/data/astrometry/generate-catalog.sh

Mohammad Akhlaghi <makhlaghi>
Group administrator
Tue 12 May 2020 01:15:22 AM UTC, original submission:  

Let's assume that `reference-ra-dec.fits' is a single catalog that contains the RA and Dec of many sources from a reference, for example from the Gaia Archive.

We also have multiple single-exposure images named `img1.fits', `img2.fits', `img3.fits', etc. We assume that these images partially cover the area of the reference catalog.

We then run NoiseChisel, Segment and MakeCatalog on each image to generate a catalog of the clumps (which can be used to accurately define the center of each source in the image), where the clump centers are in image coordinates: X and Y. Let's assume the catalogs are named `img1-x-y.fits', `img2-x-y.fits' and `img3-x-y.fits'.

This is the problem: we want to find the WCS of each image by matching the individual X-Y catalogs to the RA-Dec catalog. With that WCS, we will be able to align the images to a single pixel grid (task #15636) and do science with them.

The proposed interface and usage is like this:


astmatch --wcs-reference=reference-ra-dec.fits --wcscol=RA,DEC img*-x-y.fits --ccol1=X,Y


Normally Match takes two catalogs (to find the rows that match on certain coordinates). But when the `--wcs-reference' option is given, it can take any number of input catalogs. `--wcs-reference' itself takes a single catalog as its value and uses the columns specified with the `--wcscol' option.

Any number of X-Y catalogs are accepted, and Match will simultaneously match them with the RA-Dec catalog and produce a separate FITS file for each input catalog (maybe called `img1-x-y-wcs.fits'). The FITS file won't have any data; it will just be a header with the WCS written inside of it (a sketch of writing such a file is below).
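
For the record, a minimal sketch of producing such a header-only FITS file with CFITSIO and WCSLIB, assuming a `wcsprm' structure that the fitting step has already filled (hypothetical function, not the final Match code):

#include <stdlib.h>
#include <string.h>
#include <fitsio.h>
#include <wcslib/wcshdr.h>

/* Write a FITS file with an empty (NAXIS=0) primary HDU whose
   header carries only the WCS keywords of 'wcs'. */
static int
write_wcs_only(const char *filename, struct wcsprm *wcs)
{
  fitsfile *fptr;
  long naxes=0;
  char *header, card[81];
  int i, nkeyrec, status=0;

  /* Convert the wcsprm structure into 80-character header cards. */
  if( wcshdo(WCSHDO_safe, wcs, &nkeyrec, &header) ) return 1;

  /* Create the file with an empty primary HDU. */
  fits_create_file(&fptr, filename, &status);
  fits_create_img(fptr, SHORT_IMG, 0, &naxes, &status);

  /* Copy each card into the output header. */
  for(i=0; i<nkeyrec; ++i)
    {
      memcpy(card, header + i*80, 80);
      card[80]='\0';
      fits_write_record(fptr, card, &status);
    }

  free(header);
  fits_close_file(fptr, &status);
  return status;
}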

I am proposing to take multiple X-Y catalogs to allow a more accurate estimation of the (optical/spherical) distortions when the inputs are exposures taken with a single imager. In this scenario, each image is dithered/offset compared to the others, but they all share the same distortions, so the more raw exposures we have, the better we can estimate the distortion coefficients to write into all the output files.

In the matching, beyond using the raw RA and Dec, we can also use the magnitudes/brightnesses of each source and include that as a dimension to fit/minimize.

Also, since some low-level structure like a k-d tree may be necessary to optimally parse the reference catalog, we can add a feature to compute such optimized internal structures in one run and reuse them later. Having this structure in a file that can be used directly will probably greatly speed up the processing in many scenarios.

Mohammad Akhlaghi <makhlaghi>
Group administrator

 


Attached Files
file #51590:  match_overview.png added by ndanzanello (132KiB - image/png)
file #51573:  new_dist_4stars.png added by ndanzanello (29KiB - image/png)
file #51575:  old_dist_4stars.png added by ndanzanello (28KiB - image/png)
file #49513:  rough-outline.txt added by makhlaghi (3KiB - text/plain)

 

Depends on the following items: None found

Items that depend on this one: None found

 



     

    Follow 19 latest changes.

    Date        Changed by   Updated Field     Previous Value => Replaced by
    2022-03-24  makhlaghi    Open/Closed       Open => Closed
                             Discussion Lock   Locked => None
    2021-08-23  ndanzanello  Discussion Lock   None => Locked
    2021-08-23  ndanzanello  Status            In Progress => Done
    2021-08-23  ndanzanello  Percent Complete  50% => 100%
    2021-08-23  ndanzanello  Attached File     Added match_overview_complete.png, #51805
    2021-06-20  ndanzanello  Attached File     Added 4stars_vertices_distances_fixed.png, #51591
    2021-06-20  ndanzanello  Attached File     #51589 Removed
    2021-06-20  ndanzanello  Attached File     Added vertices_after_match_fixed.png, #51589
                             Attached File     Added match_overview.png, #51590
    2021-06-17  ndanzanello  Attached File     Added 4stars_vertices_distances.png, #51572
                             Attached File     Added new_dist_4stars.png, #51573
                             Attached File     Added vertices_after_match.png, #51574
                             Attached File     Added old_dist_4stars.png, #51575
    2021-06-16  makhlaghi    Category          Match => Astrometry
    2021-06-16  makhlaghi    Assigned to       sks_15 => ndanzanello
    2020-07-19  makhlaghi    Attached File     Added rough-outline.txt, #49513
                             Percent Complete  0% => 50%
    2020-05-12  makhlaghi    Carbon-Copy       Added -email is unavailable-
