This paper, "Reconstruction d'images spatiales à haute résolution temporelle" (Reconstruction of space images at high temporal resolution), was presented as part of the International Conference on Space Optics—ICSO 1997, held in Toulouse, France.
The LASCO-C2 coronagraph aboard SOHO (the SOlar and Heliospheric Observatory) has been continuously observing
the solar corona since early 1996. During this period, both the instrument and the experimental context underwent
many changes and observational constraints. The consequences for the in-orbit calibration procedures
are illustrated with the systematic measurement of the coronagraph straylight. Disentangling the coronal signal from
the straylight is the crucial point. The separation and monitoring of the straylight component rely on the daily
sets of polarized observations of the corona and on a minimal set of assumptions about the symmetry of the F-corona
(the dust component of the solar corona). Four main changes have been detected since 1996. Specific
recommendations for the in-orbit calibration of future space coronagraphs are presented.
The analysis of the data provided by the LASCO-C2 coronagraph onboard the SOHO space observatory revealed
the fractal characteristics of many outstanding structures of the solar corona, the tenuous but extended
envelope of plasma wrapping the Sun. A multiscale analysis of recent image sequences has brought a clearer view
of the evolution and the local structure of these features, which result from a two-step projection of the
2D electron distribution over the solar polar caps. To gain insight into the volume density distribution over these
caps and its evolution in time, we used a forward-modelling approach based on the present knowledge
of the plasma distribution, the physics of light scattering and the projection geometry onto the field of
view. The analysis provides the multifractal characterization of the observed phenomena. In the forward-modelling
process, the goal is to reconstruct the time sequence of 2D electron distributions slowly evolving over
the solar polar caps. We used several methodologies: the inverse Fourier transform of a 2D+1D (surface and
time) frequency model, an evolving multiscale synthesis with Gaussian wavelets, and a hidden Markov
approach. Recently, a procedure derived from the Voss generation scheme for fBm (fractional Brownian motion)
fractals has been successfully developed. These different methods are compared and their relative advantages and drawbacks discussed.
We present a new tool, called "OASIS" (Optimized Astrophysical Simulator for Imaging Systems), whose aim
is to generate synthetic calibrated images of solar system bodies. OASIS has been developed to support the
operations and the scientific interpretation of visible images acquired by the OSIRIS visible camera aboard the
Rosetta spacecraft, but it can be used to create synthetic images taken by the visible imaging system of any
spacecraft. OASIS allows takes as input the shape model of the object, in the form of triangular facets defining
its surface, geometric parameters describing the position and orientation of the objects included in the scene and
of the observer, and instrumental parameters describing the geometric and radiometric properties of the camera.
The rendering of the object is performed in several steps which involve: (i) sorting the triangular facets in planes
perpendicular to the direction of the light source and to the direction of the line-of-sight, (ii) tracing rays from
a given facet to the light source and to the observer to check if it is illuminated and in view from the observer,
(iii) calculating the intersection between the projected coordinates of the facets and the pixels of the image,
and finally (iv) radiometrically calibrating the images. The pixels of the final image contain the expected signal
from the object in digital numbers (DN). We show in the article examples of synthetic images of the asteroid
(2867) Steins created with OASIS, both for the preparation of the flyby and for the scientific interpretation of
the acquired images later on.
This contribution describes the methods used to accurately disentangle the components observed on a very large
series of images of the solar corona. This series consists of 12 years of continuous observations provided by the
LASCO/C2 coronagraph aboard SOHO (the SOlar and Heliospheric Observatory). Continuously centred on
the Sun, which is masked, the observed images display a blend of many components. The most conspicuous
are the K-corona from the coronal plasma, the F-corona from the coronal dust and the instrumental straylight.
All of them are optically thin but in the LASCO/C2 field of view only the K-corona is polarized. The set of
observations is composed of two huge series of images: the "polarization series" (at least one observation every
day) and the "white light series" (more than 50 images every day). The goal is to determine quantitatively the
evolution of each image component during the 12 years. Assuming 1) a small and slow temporal evolution for
the F-corona and straylight, 2) the 2D regularity of the F-corona and 3) the ability to deduce the influence of the
SOHO-Sun distance, the F-corona function is determined from the polarization series and afterwards subtracted from
the white light series to obtain the K-corona white light series.
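To illustrate assumption 1) in isolation, here is a deliberately simplified sketch in which the slowly varying background (F-corona plus straylight) is approximated by a running temporal median of the white-light series; the actual method described above determines the F-corona from the polarized observations instead.

```python
import numpy as np

def separate_k_corona(white_light, window=31):
    """Toy separation: estimate the slowly varying background per
    pixel as a running temporal median and subtract it, leaving the
    faster-varying K-corona component.
    white_light: array of shape (n_images, ny, nx)."""
    n = white_light.shape[0]
    k_series = np.empty(white_light.shape, dtype=float)
    for t in range(n):
        lo, hi = max(0, t - window // 2), min(n, t + window // 2 + 1)
        background = np.median(white_light[lo:hi], axis=0)
        k_series[t] = white_light[t] - background
    return k_series

# A constant background with one transient brightening:
wl = np.ones((20, 4, 4))
wl[10] += 5.0
k = separate_k_corona(wl, window=9)
print(k[10, 0, 0], k[0, 0, 0])  # 5.0 0.0
```

The median is robust to isolated transients (CMEs, cosmic-ray showers), which is why the slow background is recovered even across the brightening at frame 10.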
Photometry of astrophysical sources, galaxies and stars, in crowded-field images, though an old problem, remains
a challenging goal as new space survey missions are launched, releasing new data with increased sensitivity,
resolution and field of view. The GALEX mission observes in two UV bands and produces deep-sky images
of millions of galaxies and stars mixed together. These UV observations are of lower resolution than the same fields
observed in visible bands, and with a very faint signal, at the level of the photon noise for a substantial fraction
of objects. Our purpose is to use the better-known optical counterparts as prior information in a Bayesian
approach to deduce the UV flux.
Photometry of extended sources has been addressed several times using various techniques: background
determination via sigma clipping, adaptive apertures, point-spread-function photometry and isophotal photometry, to list some. The Bayesian approach of using optical priors to solve the UV photometry has
already been applied by our team in a previous work. Here we describe the improvement brought by using the extended
shapes inferred by deblending the high-resolution optical images, and not only the positions of the optical
sources.
The resulting photometric accuracy has been tested with simulations of crowded UV fields added on top
of real UV images. The method converges to smaller and flatter residuals and pushes the faint-source
detection limit deeper. It thus opens the opportunity to work on second-order effects, such as improving the knowledge of
the background or of the point spread function by iterating on them.
Segmentation of contours and silhouettes is a recurrent topic in image recognition and understanding. In this
paper we describe a new method used to divide into two parts (the limb and the terminator) the apparent silhouette
of an irregular astronomical body illuminated by a unique source, the Sun. One of the main objectives of
asteroid and comet flybys is the detailed 3D reconstruction of such bodies. However, the number of images
obtained during a flyby is limited, as is the number of viewing geometries. In the 3D reconstruction we
must consider not only the camera motion but also the free rotation of the body. The local brightness variations
in the image vary with the rotation of the body and with the changing body-camera distance. The topography
at the surface of the body can vary from very smooth to highly chaotic. In the shape from silhouette 3D
reconstruction methods, limb profiles are used to retrieve the visual hull of the body. It is therefore required
to be able to separate the limb profiles from the terminator ones. In this communication, we present a new
method to perform this task based on the local measurement of the contour smoothness, which we define here
as "activity". Developed in the framework of the Rosetta mission, our method has been tested on a large
set of asteroid and comet images taken during interplanetary missions. It proves robust to changes in magnification
and illumination.
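The paper's exact definition of "activity" is not reproduced here, but a plausible sketch is the local dispersion of the turning angle along the digitized contour: a smooth limb arc gives low activity, a rugged terminator gives high activity. Everything below (function name, window size, test contours) is an illustrative assumption.

```python
import numpy as np

def contour_activity(points, window=5):
    """Local 'activity' sketch: standard deviation of the turning
    angle of a digitized contour inside a sliding window.
    points: (n, 2) array of ordered contour coordinates."""
    d = np.diff(points, axis=0)
    angles = np.unwrap(np.arctan2(d[:, 1], d[:, 0]))
    turning = np.diff(angles)          # change of tangent direction
    act = np.empty_like(turning)
    half = window // 2
    for i in range(len(turning)):
        lo, hi = max(0, i - half), min(len(turning), i + half + 1)
        act[i] = np.std(turning[lo:hi])
    return act

# A smooth circular arc (limb-like) versus a jittered one (terminator-like):
t = np.linspace(0, np.pi, 200)
rng = np.random.default_rng(0)
smooth = np.c_[np.cos(t), np.sin(t)]
rough = smooth + rng.normal(0, 0.01, smooth.shape)
print(contour_activity(smooth).mean() < contour_activity(rough).mean())  # True
```

Thresholding such an activity profile splits the silhouette into low-activity (limb) and high-activity (terminator) segments, which is the separation needed by shape-from-silhouette reconstruction.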
For 11 years the SOHO-LASCO coronagraphs have been producing a unique set of images of the solar corona in the 2-32 solar
radii range. For the first time, a complete set of calibrated coronal images in white light (polarized and unpolarized)
covering a full solar cycle is available. The telescopes are equipped with three polarizers at -60, 0 and +60 degrees, one
all-pass channel and a set of filters. Ground calibrations were complemented by in-orbit calibrations. To monitor the
evolution of the sensitivity for each bandpass and for each polarizer, the LASCO-C2 and LASCO-C3 coronagraphs
were provided with an internal in-orbit calibration system. The measurements obtained in 1996 and 2003 have been
used to determine the CCD flat field for each filter bandpass, the gain constant (ADU to photoelectron conversion) and
the transmittance map of the polarizers. The solar corona itself was also used to check the local response. Spacecraft
rotations by 45 and 90 degrees completed the tests and allowed a final, global correction of the polarized
images.
Photometry of crowded fields is an old theme of astronomical image processing. Large space surveys in the UV (ultraviolet), like the GALEX mission (135-175 nm and 170-275 nm ranges), confront us again with challenges such as very low light levels, poor resolution, variable straylight in the background, and extended, poorly known PSFs (point spread functions). However, the morphological similarity of these UV images to their counterparts in the visible bands suggests using all this high-resolution data as the starting reference for the UV analysis. We chose the Bayesian approach. However, there is no straightforward way leading from the basic idea to its practical implementation. We describe in this paper the path which starts from the original procedure (presented in a previous paper) and ends at the operational one. After a brief review of the Bayesian method, we describe the process applied to restore from the UV images the point spread function (PSF) and the background due to straylight. Finally, we display the photometric performance reached for each channel and we discuss the consequences of the imperfect knowledge of the background, and of the inaccuracies in object centring and in the PSF model. Results show a clear improvement of more than 2 mag in the magnitude limit and in the completeness of the measured objects relative to classical methods (corresponding to more than 75000 new objects per GALEX field, i.e. approximately 25% more). The simplicity of the Bayesian approach eased the analysis as well as the corrections needed to obtain a useful and reliable photometric procedure.
NASA's GALEX mission is collecting an unprecedented set of astronomical UV data in the far- and
near-UV ranges. The telescope surveys the full sky in a continuous automatic scan. Knowing the attitude
data, local images are simultaneously extracted and corrected for smearing and instrumental effects. The final UV
images show, by far, a lower resolution than their visible counterparts. This causes blends, ambiguities and misidentifications
of the astronomical sources. Our purpose is to deduce from the UV image the UV photometry
of the visible objects through a Bayesian approach, using the visible data (catalog and image) as the starting
reference for the UV analysis. For feasibility reasons, as the deep-field images are very large, a segmentation
procedure has been defined to keep the analysis in a tractable form. The present paper discusses all these
aspects and details the full method and its performance.
KEYWORDS: Stars, Signal detection, Photonic integrated circuits, Planets, Principal component analysis, Detection and tracking algorithms, Modulation, Satellites, Interference (communication), Signal to noise ratio
The search for planetary transits in stellar light curves can be improved in a non-standard way by applying an appropriate filtering of the systematic effects just after the detection step. The procedure has been tested using a set of light curves simulated in the context of the CoRoT space mission. The level of the continuum in the detection curves is significantly lowered when compared to other standard approaches, a property we use to reduce false alarms. Ambiguities may originate in unexpected effects that combine instrumental and environmental factors. In a large set of synchronous light curves, collective behaviours make it possible to identify systematic effects against which the detected events are compared. We estimate the significance of our detections and show that with our procedure the number of true detections is increased by more than 80% (22 events detected out of the 36 injected ones). In spite of its simplicity, our method scores quite well (average results) when compared to the other methods used for the CoRoT "blind test" exercise by Moutou et al.1
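One standard way to exploit the "collective behaviours" mentioned above (consistent with the PCA keyword) is to subtract, from each light curve, its projection onto the first principal components of the whole synchronous ensemble. The sketch below is illustrative, not necessarily the authors' exact pipeline.

```python
import numpy as np

def remove_systematics(flux, n_components=3):
    """Filter collective trends out of synchronous light curves by
    subtracting each curve's projection onto the leading principal
    components (common temporal modes) of the ensemble.
    flux: (n_stars, n_times) array."""
    centered = flux - flux.mean(axis=1, keepdims=True)
    # rows of vt are orthonormal temporal modes shared by the ensemble
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    modes = vt[:n_components]            # (k, n_times)
    coeffs = centered @ modes.T          # per-star mode amplitudes
    return centered - coeffs @ modes

# A shared instrumental ramp plus one star with an injected transit:
rng = np.random.default_rng(1)
n_stars, n_times = 50, 300
flux = rng.normal(0.0, 0.01, (n_stars, n_times)) + np.linspace(0, 1, n_times)
flux[0, 140:150] -= 0.05                 # transit-like dip in star 0
clean = remove_systematics(flux, n_components=2)
print(clean.shape)  # (50, 300)
```

Because the dip occurs in only one star, it barely contributes to the common modes and survives the detrending, while the shared ramp (which would raise the continuum of the detection curves) is removed.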
We present a method based on local regularity analysis to detect glitch signatures in an interferometric signal. The regularity is given by the local value of the Hölder exponent. This exponent can be derived through a Hölderian analysis, computing the modulus of the wavelet coefficients along the wavelet transform modulus maxima lines (the so-called WTMML) in suitably selected regions of the time-scale half-plane. Glitches, which appear as discontinuities in the signal, show a Hölder exponent lower than a fixed threshold defined for a continuous signal (around -1). The method has been tested on simulations derived from theoretical "HERSCHEL / SPIRE" signals, from which histograms were computed. The statistics show that the optimization of the detection parameters should take into account variables such as the sampling rate and the signal-to-noise ratio, but is almost independent of the glitch amplitude.
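A minimal numerical sketch of the idea (not the SPIRE pipeline): follow the largest wavelet modulus near a point across scales and fit the log-log slope, which estimates the local Hölder exponent. A Dirac-like glitch gives a strongly negative slope, while a smooth signal gives a positive one. The wavelet, scales and window sizes below are illustrative assumptions.

```python
import numpy as np

def holder_slope(signal, center, scales=(2, 4, 8, 16), halfwidth=3):
    """Crude local-regularity estimate in the spirit of WTMML
    analysis: track the maximum wavelet modulus near `center` across
    scales (L1-normalised derivative-of-Gaussian wavelet) and fit
    the slope of log(modulus) versus log(scale)."""
    maxima = []
    for s in scales:
        t = np.arange(-4 * s, 4 * s + 1)
        psi = -t * np.exp(-t**2 / (2.0 * s**2)) / s**2   # dGaussian/dt
        coeffs = np.convolve(signal, psi, mode="same")
        lo, hi = center - halfwidth * s, center + halfwidth * s
        maxima.append(np.abs(coeffs[lo:hi]).max())
    slope, _ = np.polyfit(np.log(scales), np.log(maxima), 1)
    return slope

x = np.linspace(0, 8 * np.pi, 2048)
sig = np.sin(x)
sig[1000] += 10.0                # injected Dirac-like glitch
print(holder_slope(sig, 1000) < 0 < holder_slope(sig, 400))  # True
```

Thresholding such a slope (for instance, flagging points with an exponent well below that of the continuous signal) is the decision step described in the abstract.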
The LASCO-C2 coronagraph on-board the SOHO solar observatory has been providing a continuous flow of coronal images for the past nine years. Synoptic maps for each Carrington rotation have been constructed from these images and offer a global view of the temporal evolution of the solar corona, particularly the occurrence of transient events such as coronal mass ejections (CMEs), an important component of space weather activity. CMEs present distinct signatures on synoptic maps, offering a novel approach to the problem of their statistical detection. We are presently testing several techniques of automatic detection based on their morphological properties. The basic procedure involves three steps: i) morphological characterization, ii) definition and application of adapted filters (optimal trade-off filters, Canny filter, ...), iii) segmentation of the filtered synoptic maps. At this stage, the CMEs are detected. The detection efficiency of the various filters is estimated using ROC curves. On-going studies include the classification of CMEs based on their physical properties, the determination of their velocities, and the question of their connection to the streamer belt.
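The ROC comparison of filters can be sketched as follows: sweep the decision threshold over the detector scores and record true- and false-positive rates, then summarise with the area under the curve. The scores and labels below are synthetic stand-ins for filter outputs on CME candidates.

```python
import numpy as np

def roc_curve(scores, labels):
    """ROC by threshold sweep: sort candidates by decreasing score
    and accumulate true/false positives (labels: 1 = real event)."""
    order = np.argsort(scores)[::-1]
    labels = np.asarray(labels)[order]
    tp = np.cumsum(labels)
    fp = np.cumsum(1 - labels)
    return fp / (len(labels) - labels.sum()), tp / labels.sum()

# Synthetic scores from a filter that separates the classes fairly well:
rng = np.random.default_rng(2)
scores = np.r_[rng.normal(1.0, 0.5, 100), rng.normal(0.0, 0.5, 100)]
labels = np.r_[np.ones(100), np.zeros(100)]
fpr, tpr = roc_curve(scores, labels)
auc = np.sum(np.diff(fpr) * (tpr[1:] + tpr[:-1]) / 2)  # trapezoid rule
print(0.8 < auc < 1.0)  # area well above chance (0.5)
```

Comparing the AUC (or the full ROC curves) of the optimal trade-off and Canny filters on the same labelled set of synoptic-map candidates is exactly the estimation step described above.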
COROT is a mission of the CNES space agency, to be launched in 2005 into a polar orbit. Its main goals are the search for stellar oscillations and the detection of exoplanets. Five star fields, chosen close to the galactic plane in two opposite directions, will be observed with high photometric stability. Four 2048x2048 CCD detectors cover two detection areas, one for asteroseismology and the other for exoplanet detection. To avoid the risk of saturation, the seismology area is placed slightly in front of the focal plane; in the exoplanet area a low-power prism disperses the images to obtain color information about each observed star. This paper presents the procedure used to derive the polychromatic PSFs for both the seismology and the exoplanet detection areas, as functions of position and star color index. The use of standard optical packages, the expected inaccuracies and the performance are discussed.
The LASCO-C2 coronagraph onboard the SOHO solar observatory has been providing for the last seven years an unprecedentedly long sequence of coronal images at high cadence (about 75 images/day). The LASCO-C2 calibrations included the determination of the geometric characteristics (attitude, distortion) as well as of the photometric and photopolarimetric responses. Such calibrations required a complementary set of approaches, including optical modelling, pre-flight measurements, and in-orbit measurements and monitoring. In this paper we discuss the specific contribution of each of them; the radiometric calibration of LASCO-C2, for example, is dominated by the strong vignetting induced by the occulters, which fully mask an extended circular area centered on the Sun image. Due to operational constraints, the vignetting function has been obtained using a complementary set of approaches: 1) ray tracing, 2) the geometric convolution of diaphragms, 3) the measurement of uniform sources in the laboratory, 4) in-orbit measurements of stars and of the F-corona. Finally, the relationship of radiometry with the geometric calibrations, the straylight calibration and the long-term stability monitoring is discussed.
KEYWORDS: Stars, Point spread functions, Charge-coupled devices, Signal attenuation, Image processing, Exoplanets, Planets, Signal to noise ratio, Photometry, Binary data
CoRoT, to be launched in 2006, is a small space telescope that will continuously measure, for 6 months at a time during a 2.5-year mission, the light flux of 12000 stars. The aim is to detect small dips in the light curves revealing planets transiting in front of their star. For this, 12000 logical Regions Of Interest (ROIs) are defined on the CCD to optimise each star's signal-to-noise ratio (S/N). Unfortunately, fewer than 256 different shapes are permitted for all the ROIs, foreshadowing a loss in global S/N. We found a method which reduces the 12000 ROIs to a small set of 250 shapes in a lossless way. The overall performance is discussed.
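A key observation behind such a reduction is that a ROI shape is independent of its position on the CCD. The toy sketch below deduplicates masks by translating each one to a canonical origin; it is illustrative only, since the flight software's actual selection criterion is not described here.

```python
import numpy as np

def reduce_roi_shapes(masks):
    """Translate each boolean ROI mask to its bounding-box origin and
    keep only the distinct canonical shapes. Stars sharing a shape
    reuse one stored template, which loses no photometric aperture.
    Returns (templates, template-index-per-star)."""
    templates, index = [], []
    for m in masks:
        ys, xs = np.nonzero(m)
        canon = tuple(sorted(zip(ys - ys.min(), xs - xs.min())))
        try:
            index.append(templates.index(canon))
        except ValueError:
            templates.append(canon)
            index.append(len(templates) - 1)
    return templates, index

# Three stars, two of which share the same shape at different positions:
a = np.zeros((8, 8), bool); a[1:3, 1:3] = True
b = np.zeros((8, 8), bool); b[4:6, 5:7] = True   # same 2x2 shape as a
c = np.zeros((8, 8), bool); c[2:5, 2:3] = True   # 3x1 shape
templates, index = reduce_roi_shapes([a, b, c])
print(len(templates), index)  # 2 [0, 0, 1]
```

If the distinct canonical shapes already number below the hardware limit, the reduction is lossless; otherwise similar shapes would additionally have to be merged, at a small S/N cost.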
COROT is a mission of the CNES space agency, to be launched in 2005 into a near-polar orbit. It is devoted to stellar seismology and to the search for exoplanetary transits. Five star fields chosen close to the galactic plane will be observed during the mission with high relative photometric accuracy. Each observation run will last 150 days, continuously monitoring more than 6000 stars. This paper presents a new method designed to perform optimal aperture photometry on board in high-density fields. We describe the way the photometric windows, or patterns, are defined and centered on the CCD around each target star, together with the expected performance. Each pattern depends on the specific 2D profile of the point spread function (PSF) but also on the pointing jitter and on the tiny deformations of the telescope. These patterns will be stored on board in order to define for each target star the optimal pattern from which the integrated flux will be measured. This method allows a significant increase of the sampling rate, to approximately one measurement per star every 8 minutes.
The LASCO-C2 and C3 coronagraphs aboard the SOHO solar observatory have been providing for the last five years an unprecedentedly long sequence of coronal images at high cadence (about 100 images/day). To build temporal sequences for movie displays as well as for science analysis purposes, we need photometry with a relative accuracy better than 0.1 percent. In this paper we address this problem, showing how image-to-image regression as well as long-term correction of drifts induced by Wiener-Lévy stochastic processes are able to meet this challenge. The use of time derivatives of synoptic maps to correct and verify the full procedure, and the comparison with more classical methods such as stellar calibration, are discussed. Difficulties due to brightening events such as coronal mass ejections, showers of cosmic rays, etc., as well as those due to telemetry gaps, are also addressed.
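A minimal sketch of the image-to-image regression idea, under a pure multiplicative-drift assumption (illustrative only; real LASCO frames additionally need masking of CMEs, cosmic rays and telemetry gaps, as noted above):

```python
import numpy as np

def relative_gains(images):
    """Fit a multiplicative factor between each image and its
    predecessor by least squares on common pixels, then chain the
    factors so every image is photometrically scaled to the first."""
    gains = [1.0]
    for prev, cur in zip(images, images[1:]):
        g = (prev * cur).sum() / (cur * cur).sum()  # LSQ slope, no offset
        gains.append(gains[-1] * g)
    return np.array(gains)

# Images of a stable corona seen through a drifting instrumental gain:
rng = np.random.default_rng(4)
scene = rng.uniform(1, 2, (64, 64))
drift = np.array([1.0, 1.02, 1.05, 1.01])
images = [d * scene for d in drift]
g = relative_gains(images)
print(np.allclose(g * drift, 1.0))  # True: the corrections undo the drift
```

Chaining pairwise factors accumulates a random-walk (Wiener-Lévy) error over long sequences, which is why the abstract pairs the regression with a long-term drift correction, for example anchored on synoptic maps or stellar calibration.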
It is shown that the rough surfaces of silver, magnesium and iron thin deposits may be accurately described by an autoregressive process. The autoregressive parameters are determined using either the Yule-Walker or the Burg technique, and the advantages of describing the statistically rough surfaces of thin deposits by linear models, instead of the traditional autocovariance function or the spectrum, are discussed.
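The Yule-Walker estimate mentioned above amounts to solving a Toeplitz linear system built from the sample autocovariance of the profile. A self-contained sketch (applied to a synthetic AR(1) profile rather than to real micrograph data):

```python
import numpy as np

def yule_walker(profile, order):
    """Yule-Walker AR(p) estimate from a 1D roughness profile:
    solve R a = r using the biased sample autocovariance."""
    x = np.asarray(profile, float) - np.mean(profile)
    n = len(x)
    r = np.array([x[:n - k] @ x[k:] for k in range(order + 1)]) / n
    R = np.array([[r[abs(i - j)] for j in range(order)]
                  for i in range(order)])        # Toeplitz matrix
    a = np.linalg.solve(R, r[1:])                # AR coefficients
    sigma2 = r[0] - a @ r[1:]                    # innovation variance
    return a, sigma2

# Recover the parameter of a synthetic AR(1) profile x[t] = 0.8 x[t-1] + e:
rng = np.random.default_rng(3)
x = np.zeros(20000)
for t in range(1, len(x)):
    x[t] = 0.8 * x[t - 1] + rng.normal()
a, s2 = yule_walker(x, 1)
print(round(a[0], 1))  # 0.8
```

The compactness of the fitted (a, sigma2) pair, compared with a full autocovariance function or spectrum, is precisely the advantage of the linear-model description discussed in the abstract.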
A classic method for studying the surface roughness of thin films, based on a microdensitometer analysis of electron micrographs of surface replicas, is revisited and some of its defects are revealed. Two new methods are proposed: the first is based on multiresolution analysis, the second implements Wiener filtering by 2D Fourier transform. The results are used to check the normality of the surface statistics by means of statistical tests.