giant.stellar_opnav¶
This package provides the required routines and objects to identify stars in an image and to estimate attitude based on those identified stars.
Description¶
In GIANT, Stellar OpNav refers to the process of identifying stars in an image and then extracting the attitude information from those stars. There are many different sub-steps that need to be performed for all parts of Stellar OpNav, which can lead to cluttered scripts and hard-to-maintain code when everything is thrown together. Luckily, GIANT has done most of the work for us and created a simple, single interface to perform all portions of Stellar OpNav through the StellarOpNav class.
The StellarOpNav class is generally the only interface a user will require when performing Stellar OpNav, as it provides easy access to every component you need. It also abstracts away most of the nitty-gritty details into 2 simple method calls, id_stars() and estimate_attitude(), which means you can perform Stellar OpNav without an in-depth understanding of what's going on in the background (though at least a basic understanding certainly helps). That being said, the sub-steps are also exposed throughout GIANT, so if you are doing advanced design or analysis it is easy to get at these components as well.
This package-level documentation focuses only on the use of the class and some techniques for successfully performing Stellar OpNav (in the upcoming tuning section). To see behind the scenes of what's going on, refer to the submodule documentation from this package (stellar_class, star_identification, and estimators).
Tuning for Successful Stellar OpNav¶
The process of tuning the Stellar OpNav routines is both a science and an art. In general, you should be able to find a single set of tuning parameters that applies to a number of similar images (similar exposure times, similar scenes, etc.), but this often takes a decent amount of experimentation. With this section, we hope to introduce you to all the different knobs you can turn when performing Stellar OpNav and give you some basic tips for getting a good tuning for a wide range of images.
The primary goal with tuning in Stellar OpNav is to successfully identify a good number of stars in every image. In general, at least 2 stars are required for each image to be able to solve for an updated rotation matrix for the image, but in practice it is best to get at least 4, and generally more is better. In addition, it may frequently be useful to perform other types of analysis on the images using identified stars, and in these cases more is almost always better. Therefore, we will be shooting for the tuning that provides as many correctly identified stars as possible.
Each of the parameters you will need is discussed briefly in the following table. PointOfInterestFinder attributes can easily be accessed through the point_of_interest_finder property, while StarID attributes can easily be accessed through the star_id attribute. For more detailed descriptions of these attributes see the PointOfInterestFinder or StarID documentation.
As you can see, there are 3 different processes that need to be tuned for a successful star identification: the image processing, the catalog query, and the identification routines themselves. The following are a few suggestions for attempting to find the correct tuning.
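As a rough map (our grouping for illustration, not an official GIANT table), the knobs discussed in this section can be sorted by which of the 3 processes they tune; the parameter names are the PointOfInterestFinder and StarID attributes described above:

```python
# Illustrative grouping only: which tuning knob belongs to which process.
# The names are the PointOfInterestFinder and StarID attributes discussed
# in this section; the grouping itself is our own summary.
tuning_knobs = {
    "image processing": ["threshold", "denoising", "centroid_size"],
    "catalog query": ["max_magnitude"],
    "identification": ["tolerance", "ransac_tolerance", "max_combos"],
}

# sanity check: no knob is listed under two processes
all_knobs = [knob for group in tuning_knobs.values() for knob in group]
assert len(all_knobs) == len(set(all_knobs))
```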
Getting the initial identification is generally the most difficult part; therefore, you should generally have 2 tunings for an image set.
The first tuning should be fairly conservative in order to get a good refined attitude estimate for the image. (Remember that we really only need 4 or 5 correctly identified stars to get a good attitude estimate.) This typically means:

- threshold set fairly high (around 20-40)
- denoising set to something reasonable like GaussianDenoising
- a large initial tolerance–typically greater than 10 pixels. Note that this initial tolerance should include the errors in the star projections due to both the a priori attitude uncertainty and the camera model
- a smaller but still relatively large ransac_tolerance–on the order of about 1-5 pixels. This tolerance should mostly reflect a very conservative estimate of the errors caused by the camera model, as the attitude errors should largely be removed
- a small max_magnitude–only allowing bright stars. Bright stars generally have more accurate catalog positions and are more likely to be picked up by the ImageProcessing algorithms
- max_combos set fairly large–on the order of 500-1000
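Collected together, a conservative first pass might look like the following sketch. Plain dict values stand in for the real StellarOpNavOptions object (see the Example section for actual usage), so the snippet stands alone; the specific numbers are picked from the ranges suggested above, and the max_magnitude value is purely illustrative since no range is given:

```python
# A sketch of a conservative first-pass tuning, using values from the
# ranges suggested above.  In real use these would be set on a
# StellarOpNavOptions instance; a dict is used here so the example is
# self-contained.
first_pass = {
    "threshold": 30,                   # fairly high: 20-40
    "denoising": "GaussianDenoising",  # something reasonable
    "tolerance": 15.0,                 # large initial tolerance, > 10 pixels
    "ransac_tolerance": 3.0,           # smaller but still large: 1-5 pixels
    "max_magnitude": 6.0,              # only bright stars (value illustrative)
    "max_combos": 750,                 # fairly large: 500-1000
}

# the initial tolerance should comfortably exceed the RANSAC tolerance
assert first_pass["tolerance"] > first_pass["ransac_tolerance"]
```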
After getting the initial pairing and updating the attitude for the images (note that this is done external to the calls to id_stars()), you can then attempt a larger identification with dimmer stars by:

- decreasing threshold (around 8-20)
- decreasing the tolerance to be about the same as your previous ransac_tolerance
- turning the RANSAC algorithm off by setting max_combos to 0
- increasing the max_magnitude
- potentially setting denoising to None if you're trying to extract as many stars as possible
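The loosening steps above can be sketched as a transformation of the first-pass values. As before, plain dicts stand in for the real options objects so the snippet is self-contained, and the specific numbers (including the magnitude increase) are illustrative rather than GIANT defaults:

```python
# A sketch of relaxing a conservative first-pass tuning into a second,
# dimmer-star pass, following the steps listed above.  Dicts stand in for
# the real GIANT options objects; values are illustrative.
first_pass = {
    "threshold": 30,
    "denoising": "GaussianDenoising",
    "tolerance": 15.0,
    "ransac_tolerance": 3.0,
    "max_magnitude": 6.0,
    "max_combos": 750,
}

second_pass = dict(first_pass)
second_pass["threshold"] = 12                               # decreased: 8-20
second_pass["tolerance"] = first_pass["ransac_tolerance"]   # about the previous ransac_tolerance
second_pass["max_combos"] = 0                               # RANSAC turned off
second_pass["max_magnitude"] = first_pass["max_magnitude"] + 2  # dimmer stars (increase illustrative)
second_pass["denoising"] = None                             # optionally, to extract as many stars as possible
```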
If you are having problems getting the identification to work, it can be useful to visually examine the results for a couple of images using the show_id_results() function.
Example¶
Below shows how Stellar OpNav can be used to identify stars and estimate attitude corrections. It assumes that the generate_sample_data script has been run already and that the sample_data directory is in the current working directory. For a more in-depth example using real images, see the tutorial.
>>> import pickle
>>> from pathlib import Path
>>> # use pathlib and pickle to get the data
>>> data = Path.cwd() / "sample_data" / "camera.pickle"
>>> with data.open('rb') as pfile:
...     camera = pickle.load(pfile)
>>> # import the stellar opnav class
>>> from giant.stellar_opnav.stellar_class import StellarOpNav, StellarOpNavOptions
>>> # set up options for stellar opnav
>>> options = StellarOpNavOptions()
>>> options.point_of_interest_finder_options.threshold = 10
>>> options.point_of_interest_finder_options.centroid_size = 1
>>> options.star_id_options.max_magnitude = 5
>>> options.star_id_options.tolerance = 20
>>> # form the stellaropnav object
>>> sopnav = StellarOpNav(camera, options=options)
>>> sopnav.id_stars()
>>> sopnav.sid_summary() # print a summary of the star id results
>>> for _, image in camera: print(image.rotation_inertial_to_camera) # print the attitude before
>>> sopnav.estimate_attitude()
>>> for _, image in camera: print(image.rotation_inertial_to_camera) # print the attitude after
>>> # import the visualizer to look at the results
>>> from giant.stellar_opnav.visualizer import show_id_results
>>> show_id_results(sopnav)
Modules

- stellar_class – This module provides a subclass of the OpNav class.
- star_identification – This module provides the star identification routines for GIANT through the StarID class.
- estimators – This module provides the ability to find the rotation that best aligns 1 set of unit vectors with another set of unit vectors.
- visualizer – This package provides utilities for visually inspecting star identification and attitude estimation results.
Classes

- StellarOpNavOptions(use_weights: bool = False, scene: giant.ray_tracer.scene.Scene | None = None, point_of_interest_finder_options: giant.image_processing.point_source_finder.PointOfInterestFinderOptions | None = <factory>, star_id_options: giant.stellar_opnav.star_identification.StarIDOptions = <factory>, attitude_estimator_options: giant.stellar_opnav.estimators.attitude_estimator.AttitudeEstimatorOptions | giant.stellar_opnav.estimators.esoq2.ESOQ2Options | None = None, custom_attitude_estimator_class: type[giant.stellar_opnav.estimators.attitude_estimator.AttitudeEstimator] | None = None, attitude_estimator_type: giant.stellar_opnav.estimators.AttitudeEstimatorImplementations = <AttitudeEstimatorImplementations.DAVENPORT_Q_METHOD: 1>, denoising: Optional[Callable[[numpy.ndarray[tuple[Any, ...], numpy.dtype[~_ScalarT]]], numpy.ndarray[tuple[Any, ...], numpy.dtype[~_ScalarT]]]] = <factory>)
- This class serves as the main user interface for performing Stellar Optical Navigation.
- The StarID class operates on the result of image processing algorithms to attempt to match image points of interest with catalog star records.
- Dataclass for configuring attitude estimator subclasses.
- This abstract base class (ABC) serves as a template for creating an attitude estimator that GIANT can use.
- Options for the ESOQ2 attitude estimator.
- Implements the ESOQ2 (Second Estimator of the Optimal Quaternion) solution to Wahba's problem.
- This class estimates the rotation quaternion that best aligns unit vectors from one frame with unit vectors in another frame using Davenport's Q-Method solution to Wahba's problem.
- The class is used to represent an outlier shown to the user for review via the function
Functions

- This function generates histograms of the matched star residuals for a given stellar opnav object.
- This function generates a scatter plot of x and y residuals versus star magnitudes from the matched catalog stars for a given stellar opnav object.
- This function generates a scatter plot of x and y residuals versus image temperature from the matched catalog stars for a given stellar opnav object.
- This function generates a figure for each turned-on image in