This paper presents a suboptimal quantization and transmission scheme for multiscale block-based compressed sensing images over wireless channels. The proposed method comprises two stages, which address quantization distortion and transmission errors, respectively. First, given the total transmission bit rate, the optimal number of quantization bits is assigned to the sensed measurements in different wavelet sub-bands so that the total quantization distortion is minimized. Second, given the total transmission power, energy is allocated to the different quantization bit layers according to their error sensitivities. The method of Lagrange multipliers with Karush–Kuhn–Tucker (KKT) conditions is used to solve both optimization problems: the first is solved after relaxation, while the second admits an exact solution. The effectiveness of the scheme is illustrated through simulation results, which show up to 10 dB improvement over a scheme without rate and power optimization at medium and low signal-to-noise ratios.
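The relaxed bit-allocation step has the classical reverse water-filling form. The sketch below is a minimal illustration under the common assumption that a sub-band quantized with b_k bits contributes distortion proportional to sigma_k^2 * 2^(-2 b_k); the variances, bit budget, and rounding repair are illustrative choices, not values from the paper.

```python
import numpy as np

def allocate_bits(variances, total_bits):
    """Relaxed Lagrangian (KKT) bit allocation across wavelet sub-bands.

    Assumed distortion model: sub-band k with b_k bits contributes
    sigma_k^2 * 2^(-2 b_k); the relaxed optimum is the classical
    log-variance rule, rounded back to non-negative integers.
    """
    variances = np.asarray(variances, dtype=float)
    n = len(variances)
    geo_mean = np.exp(np.mean(np.log(variances)))
    b = total_bits / n + 0.5 * np.log2(variances / geo_mean)
    b = np.maximum(np.round(b), 0).astype(int)
    # crude repair so the rounded, clipped allocation still meets the budget
    while b.sum() > total_bits:
        b[np.argmax(b)] -= 1
    while b.sum() < total_bits:
        b[np.argmax(variances * 4.0 ** (-b))] += 1
    return b

print(allocate_bits([10.0, 4.0, 1.0, 0.25], total_bits=16))
```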
We consider the transmission of progressive image data over noisy channels when the coded packet size is fixed. Concatenated cyclic redundancy check (CRC) codes and rate-compatible punctured turbo codes are used for error detection and control. In such an application, the distortion-based optimal channel rate allocation for unequal error protection is complex. We first propose a suboptimal genetic algorithm-based method that not only largely reduces the optimization complexity but also achieves performance approaching that of a brute-force search. In addition, because turbo codes are usually used with a large packet size, since the coding gain grows with the packet size for a given code rate, a single residual bit error after channel decoding may cause a CRC failure and hence the discard of the entire packet. We therefore further propose a multiple-CRC structure for certain data packets so that more correctly decoded data can be used in source decoding. The promising performance of the proposed scheme is demonstrated through simulation.
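A minimal sketch of how a genetic algorithm can search channel code rates for fixed-size packets carrying a progressive source. The rate set, packet-failure probabilities, packet count, and the "useful bits until the first failure" fitness below are assumptions for illustration only, not the paper's distortion model or RCPT parameters.

```python
import random

# Hypothetical setup: each packet may use one of these channel code rates,
# each with an assumed packet-failure probability (illustrative numbers only).
RATES = [1/3, 1/2, 2/3, 4/5]
P_FAIL = [1e-4, 1e-3, 1e-2, 5e-2]
PACKET_BITS = 4096          # fixed coded packet size (assumption)
NUM_PACKETS = 8

def expected_useful_bits(chromosome):
    """Fitness: expected number of source bits decoded before the first
    packet failure; a progressive bitstream is useless past that point."""
    total, p_all_ok = 0.0, 1.0
    for idx in chromosome:
        p_ok = 1.0 - P_FAIL[idx]
        total += p_all_ok * p_ok * PACKET_BITS * RATES[idx]
        p_all_ok *= p_ok
    return total

def ga_search(pop_size=30, generations=100, p_mut=0.1):
    pop = [[random.randrange(len(RATES)) for _ in range(NUM_PACKETS)]
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=expected_useful_bits, reverse=True)
        parents = pop[:pop_size // 2]          # keep the fitter half
        children = []
        while len(children) < pop_size - len(parents):
            a, b = random.sample(parents, 2)
            cut = random.randrange(1, NUM_PACKETS)
            child = a[:cut] + b[cut:]          # one-point crossover
            if random.random() < p_mut:        # random mutation
                child[random.randrange(NUM_PACKETS)] = random.randrange(len(RATES))
            children.append(child)
        pop = parents + children
    return max(pop, key=expected_useful_bits)

print(ga_search())   # rate index per packet, typically stronger codes first
```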
In this paper, a joint source-channel coding scheme is proposed for progressive image transmission over channels with both random bit errors and packet loss, using rate-compatible punctured turbo code (RCPT) protection only. Two technical components that differ from existing methods are presented. First, a data frame is divided into multiple CRC blocks before being coded by a turbo code. This secures a high turbo coding gain, which grows with the data frame size; meanwhile, the beginning blocks in a frame may still be usable even if decoding of the entire frame fails. Second, instead of employing product codes, we use only RCPT, along with an interleaver, to protect images over channels with combined distortion from random errors and packet loss. With this setting, the effect of packet loss is equivalent to randomly puncturing the turbo code. As a result, the optimal allocation of channel code rates needs to account for the random errors only, which largely reduces the complexity of the optimization process. The effectiveness of the proposed scheme is demonstrated with extensive simulation results.
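The multiple-CRC framing can be sketched as follows. CRC-32 from zlib, the 256-byte block size, and byte-level framing are stand-ins chosen for illustration; the paper's CRC polynomial, block sizes, and the turbo encoding and decoding themselves are not reproduced here.

```python
import zlib

BLOCK_BYTES = 256   # assumed CRC-block size inside one turbo frame

def pack_frame(payload: bytes) -> bytes:
    """Split a frame payload into blocks and append a CRC-32 to each block
    before the whole frame is passed to the turbo encoder (not shown)."""
    out = bytearray()
    for i in range(0, len(payload), BLOCK_BYTES):
        block = payload[i:i + BLOCK_BYTES]
        out += block + zlib.crc32(block).to_bytes(4, "big")
    return bytes(out)

def usable_prefix(decoded: bytes) -> bytes:
    """After turbo decoding, keep only the leading blocks whose CRC checks;
    stop at the first failure, since the source bitstream is progressive."""
    good, i = bytearray(), 0
    while i < len(decoded):
        segment = decoded[i:i + BLOCK_BYTES + 4]
        block, tag = segment[:-4], segment[-4:]
        if len(segment) <= 4 or zlib.crc32(block).to_bytes(4, "big") != tag:
            break
        good += block
        i += len(segment)
    return bytes(good)

frame = pack_frame(b"\x01" * 1000)   # would then be turbo encoded (not shown)
print(len(usable_prefix(frame)))     # 1000 when no block is corrupted
```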
In this paper, a joint source and space-time decoding scheme is proposed for high-speed digital source transmission over fading channels. At the transmitter, a reversible variable-length code (RVLC) is concatenated with a recursive space-time trellis code (recursive STTC). At the receiver, an iterative joint VLC and space-time decoding algorithm is proposed to fully exploit both the residual redundancy introduced by the RVLC and the coding gain of the recursive space-time trellis code. Simulation results show that the proposed joint decoding system achieves better decoding performance over fading channels than a separate decoding system.
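A small illustration of why RVLCs carry exploitable redundancy: with a reversible codebook (here, every codeword is a palindrome), the bitstream can be parsed from either end, so data after a detected error can still be recovered by decoding backwards. The codebook and greedy parser below are hypothetical; the paper's iterative joint VLC and space-time decoder is not reproduced.

```python
# A small symmetric RVLC codebook: each codeword is a palindrome, so the
# reversed codeword set is also prefix-free and decodable.
RVLC = {"0": "a", "11": "b", "101": "c", "1001": "d"}

def decode(bits: str, reverse: bool = False) -> list:
    """Greedy prefix parsing; with reverse=True the bitstream is parsed from
    its tail, which is what makes RVLCs useful after a channel error."""
    if reverse:
        bits = bits[::-1]
    out, buf = [], ""
    for b in bits:
        buf += b
        if buf in RVLC:
            out.append(RVLC[buf])
            buf = ""
    return out

bits = "0" + "11" + "101" + "0"     # encodes a, b, c, a
print(decode(bits))                 # ['a', 'b', 'c', 'a']
print(decode(bits, reverse=True))   # same symbols, recovered back to front
```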
In this paper, we propose a joint source-channel coding scheme for progressive image transmission over binary symmetric channels (BSCs). The algorithm of set partitioning in hierarchical trees (SPIHT) is used for source coding. Rate-compatible punctured turbo codes (RCPT) concatenated with multiple cyclic redundancy check (CRC) codes are adopted for channel protection. For a fixed transmission rate, the source and channel code rates are jointly optimized to maximize the expected image quality at the receiver. Two technical components that differ from existing methods are presented. First, a long data packet is divided into multiple CRC blocks before being coded by turbo codes. This secures a high turbo coding gain, which grows with the interleaver size; meanwhile, the beginning blocks in a packet may still be usable even if decoding of the entire packet fails. Second, instead of using an exhaustive search, we present a genetic algorithm (GA)-based optimization method that finds appropriate channel code rates with low complexity. The effectiveness of the scheme is demonstrated through simulations.
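The rate-compatibility that RCPT relies on can be sketched directly. In the toy example below (a hypothetical uniform puncturing rule with dummy bit values, not the paper's actual patterns), all systematic bits of a rate-1/3 mother turbo code are kept and every k-th parity bit of each constituent encoder is retained, giving rate k/(k+2); because the retained positions for a larger k are a subset of those for a smaller k, extra parity can be sent later as incremental redundancy.

```python
def puncture(systematic, parity1, parity2, k):
    """Keep all systematic bits and every k-th parity bit from each
    constituent encoder of a rate-1/3 mother turbo code."""
    out = list(systematic)
    out += [p for i, p in enumerate(parity1) if i % k == 0]
    out += [p for i, p in enumerate(parity2) if i % k == 0]
    return out

n = 1024
sys_bits, par1, par2 = [0] * n, [1] * n, [1] * n   # dummy bit values
for k in (1, 2, 4, 8):
    coded = puncture(sys_bits, par1, par2, k)
    print(f"k={k}: code rate = {n / len(coded):.3f}")   # 1/3, 1/2, 2/3, 4/5
```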
KEYWORDS: Image compression, Binary data, Computer programming, Video, Data compression, Video compression, Quantization, Statistical analysis, Video coding, Digital modulation
Correlation estimation plays a critical role in resource allocation and rate control for distributed data compression. A Wyner-Ziv encoder for distributed image compression is often considered as a lossy source encoder followed by a lossless Slepian-Wolf encoder. The source encoder consists of spatial transform, quantization, and bit plane extraction. In this work, we find that Gray code, which has been extensively used in digital modulation, is able to significantly improve the correlation between the source data and its side information. Theoretically, we analyze the behavior of Gray code within the context of distributed image compression. Using this theoretical model, we are able to efficiently allocate the bit budget and determine the code rate of the Slepian-Wolf encoder. Our experimental results demonstrate that the Gray code, coupled with accurate correlation estimation and rate control, significantly improves the picture quality, by up to 4 dB, over the existing methods for distributed image compression.
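The effect of Gray mapping on bit-plane correlation can be checked numerically. In the sketch below, the source X and side information Y are synthetic (uniform 8-bit indices plus small integer noise), which is only a stand-in for the transform-domain correlation model analyzed in the paper; the crossover probability of each bit plane is then measured for natural binary and for Gray-coded indices.

```python
import numpy as np

def to_gray(x):
    """Binary-reflected Gray code of non-negative integers."""
    return x ^ (x >> 1)

def bitplane(x, k):
    return (x >> k) & 1

# Hypothetical correlation model: Y = X + small integer noise (Wyner-Ziv
# side information), with 8-bit quantization indices.
rng = np.random.default_rng(0)
X = rng.integers(0, 256, size=100_000)
Y = np.clip(X + rng.integers(-2, 3, size=X.size), 0, 255)

for k in range(7, 3, -1):   # a few most significant bit planes
    p_nat = np.mean(bitplane(X, k) != bitplane(Y, k))
    p_gray = np.mean(bitplane(to_gray(X), k) != bitplane(to_gray(Y), k))
    print(f"plane {k}: crossover natural={p_nat:.4f}  gray={p_gray:.4f}")
```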
Turbo codes are a promising technique for distributed source coding (DSC) in sensor networks because of their simple encoding implementation in sensors and high decoding performance at the receiver. Unlike the channel coding scenario, where only the distortion from the physical channel exists, two types of distortion, from the physical channel and from the binary symmetric channel (BSC) that models the correlation between sources, coexist in a distributed source coding scenario. In this paper, the conventional turbo decoding is first modified to handle BSC distortion. It is then further modified to decode mixed data with both types of distortion simultaneously. By redefining the channel reliabilities and calculating the extrinsic information with both distortions taken into account, the new decoding algorithm matches the realistic DSC scenario well and indeed improves decoding performance.
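The two channel reliabilities can be sketched as standard log-likelihood ratios: the usual Lc*y = 4*(Es/N0)*y for BPSK over AWGN, and +/- log((1-p)/p) for hard side-information bits seen through a BSC with crossover probability p. The modified extrinsic-information exchange of the paper is not reproduced here; only the input reliabilities it redefines are shown.

```python
import numpy as np

def llr_awgn(y, es_n0_db):
    """Channel LLR of unit-energy BPSK over AWGN: L = 4 * (Es/N0) * y."""
    es_n0 = 10.0 ** (es_n0_db / 10.0)
    return 4.0 * es_n0 * np.asarray(y, dtype=float)

def llr_bsc(bits, p):
    """Reliability of hard bits received through a BSC with crossover
    probability p: L = (1 - 2b) * log((1 - p) / p)."""
    bits = np.asarray(bits, dtype=float)
    return (1.0 - 2.0 * bits) * np.log((1.0 - p) / p)

print(llr_awgn([0.9, -1.1], es_n0_db=2.0))
print(llr_bsc([0, 1, 1], p=0.1))
```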
KEYWORDS: Binary data, Detection and tracking algorithms, Electrical engineering, Data compression, Forward error correction, Data communications, Multimedia, Computer programming, Distortion, Signal to noise ratio
In this paper, we discuss the maximum a posteriori probability (MAP) decoding of variable-length codes (VLCs) and propose a novel decoding scheme for Huffman VLC-coded data in the presence of noise. First, we provide simulation results of VLC MAP decoding and highlight some features that have not yet been discussed in existing work. We show that the improvement of MAP decoding over conventional VLC decoding comes mostly from the memory information in the source, and we give some observations regarding the advantage of soft VLC MAP decoding over hard VLC MAP decoding when an AWGN channel is considered. Second, recognizing that the difficulty in VLC MAP decoding is the lack of synchronization between the symbol sequence and the coded bit sequence, which makes parsing the latter into the former extremely complex, we propose a new MAP decoding algorithm that integrates information about self-synchronization strings (SSSs), an important feature of the codeword structure, into the conventional MAP decoding. The new scheme achieves a consistent performance improvement and decoding complexity reduction over conventional VLC MAP decoding.
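A self-synchronization string can be checked mechanically: a bit string is an SSS if, from every possible internal decoder state (any proper prefix of a codeword), reading the string leaves the decoder exactly at a codeword boundary. The five-codeword prefix code and the candidate strings below are hypothetical examples, not the Huffman tables used in the paper.

```python
def is_sss(code_words, s):
    """Return True if bit string s is a self-synchronization string for the
    given prefix code: from any internal decoder state, reading s ends at a
    codeword boundary (the root of the code tree)."""
    prefixes = {w[:i] for w in code_words for i in range(len(w))}  # incl. ""

    def step(state, bit):
        nxt = state + bit
        if nxt in code_words:
            return ""        # codeword completed: back at a boundary
        if nxt in prefixes:
            return nxt       # still inside some codeword
        return None          # path impossible for this code

    for start in prefixes:
        state = start
        for b in s:
            state = step(state, b)
            if state is None:
                return False
        if state != "":
            return False
    return True

CODE = {"00", "01", "10", "110", "111"}   # hypothetical Huffman-style code
for cand in ("00", "110", "0110"):
    print(cand, is_sss(CODE, cand))       # only "0110" resynchronizes
```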
Multiple description coding (MDC) is a well-known robust data compression approach designed to minimize the distortion caused by data loss in packet-based communication systems. Several MDC schemes for transmitting wavelet-compressed images have been developed. However, these MDC schemes cannot be adopted for digital mobile wireless applications where both packet loss and bit errors are present, because the individual descriptions in these schemes usually do not have adequate error resilience to combat bit errors during transmission. In this paper, we propose an algorithm to achieve robust communication over error-prone transmission channels with both packet loss and bit errors. We integrate multiple description scalar quantization (MDSQ) with the multiple wavelet-tree image coding method to provide excellent error resilience. Two descriptions are generated independently using the index assignment of MDSQ. For each description, multiple subsampling is applied to split the wavelet coefficients of the source image into multiple sub-sources. Each sub-source is then entropy coded using the SPIHT algorithm, followed by a channel coding scheme that combines a cyclic redundancy check (CRC) code and a rate-compatible punctured convolutional (RCPC) code to offer unequal error protection to the entropy-coded bits. The unequal error protection channel coding rates are designed based on the bit error sensitivity of the different bit planes to achieve maximum end-to-end quality of service. Experimental results show that the proposed scheme not only exhibits excellent error resilience but also degrades gracefully over error-prone channels under varying packet-loss and bit-error rates.
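The MDSQ index assignment can be sketched as filling a two-dimensional table along diagonals: each central quantizer index maps to a pair (i, j), description 1 carries i and description 2 carries j, and receiving both recovers the central index exactly, while receiving one still narrows the index to a few neighbors. The diagonal-fill rule and spread below are a minimal illustration, not the optimized assignment used in the paper.

```python
def mdsq_index_assignment(n_central, spread=1):
    """Fill an index-assignment table along anti-diagonals, keeping only the
    cells with |i - j| <= spread; returns central -> (i, j) and its inverse."""
    pairs = []
    side = n_central            # generous bound on the side-description range
    for d in range(2 * side):   # walk anti-diagonals i + j = d
        for i in range(d + 1):
            j = d - i
            if i < side and j < side and abs(i - j) <= spread:
                pairs.append((i, j))
    pairs = pairs[:n_central]
    encode = {c: ij for c, ij in enumerate(pairs)}
    decode = {ij: c for c, ij in encode.items()}
    return encode, decode

enc, dec = mdsq_index_assignment(8)
i, j = enc[5]              # description 1 carries i, description 2 carries j
print(i, j, dec[(i, j)])   # both received -> the exact central index (5)
```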
In this paper, we present techniques based on multiple wavelet-tree coding for robust image transmission. The algorithm of set partitioning in hierarchical trees (SPIHT) is a state-of-the-art technique for image compression. This variable-length coding (VLC) technique, however, is extremely sensitive to channel errors. To improve error resilience while retaining the high source coding efficiency of VLC, we propose to encode each wavelet tree, or group of wavelet trees, independently with the SPIHT algorithm. Instead of encoding the entire image as one bitstream, multiple bitstreams are generated, so error propagation is limited to individual bitstreams. Two methods, based on subsampling and on human visual sensitivity, are proposed to group the wavelet trees. The multiple bitstreams are further protected by rate-compatible punctured convolutional (RCPC) codes. Unequal error protection is provided both for different bitstreams and for different bit segments inside each bitstream. We also investigate the improvement of error resilience through error resilient entropy coding (EREC) combined with wavelet-tree coding when channel corruption is mild. A simple post-processing technique is also proposed to alleviate the effect of residual errors. We demonstrate through simulations that systems using these techniques achieve much better performance than systems transmitting a single bitstream in noisy environments.
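The subsampling-based grouping can be sketched as a polyphase split of the tree roots in the lowest sub-band, so that spatially adjacent trees never share a bitstream and a lost bitstream leaves a regularly interleaved pattern of missing trees. The 2x2 split and grid size below are assumptions for illustration; the SPIHT coding of each group is not shown.

```python
def split_trees_by_subsampling(ll_rows, ll_cols):
    """Assign each spatial-orientation tree (rooted at one lowest-sub-band
    coefficient) to one of four bitstreams by 2x2 polyphase subsampling, so
    that neighbouring trees always land in different bitstreams."""
    groups = [[] for _ in range(4)]
    for r in range(ll_rows):
        for c in range(ll_cols):
            groups[(r % 2) * 2 + (c % 2)].append((r, c))
    return groups

for g, roots in enumerate(split_trees_by_subsampling(4, 4)):
    print(f"bitstream {g}: tree roots {roots}")
```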
A scheme with three key components including wavelet tree coding, error resilient entropy coding (EREC), and error concealment is proposed for robust image coding and transmission over noisy channels. First, we individually encode the spatial-orientation trees in the wavelet domain using the algorithm of set partitioning in hierarchical trees (SPIHT). Error propagation is thus limited because multiple independent bit streams are generated. Meanwhile, a high source coding efficiency is also preserved because the self-similarity property in each wavelet tree remains intact. Next, we use EREC to reorganize these variable-length bit streams into fixed-length data slots before multiplexing and transmission. Therefore, the synchronization of the start of each bit stream can be automatically obtained at the receiver. Finally, to alleviate the possible catastrophic image degradation that may result from errors in the beginning of the bit streams, we propose an error concealment technique to constrain the EREC decoding as well as to postprocess the decoded wavelet coefficients. As a result of the error concealment, the EREC decoding complexity is reduced and the reconstructed image quality is significantly improved. Experimental results demonstrate an excellent error resilient performance of the proposed scheme.
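The slot-filling idea of EREC can be sketched as follows: every variable-length bitstream starts in its own fixed-length slot, and leftover bits migrate into other slots with spare room at increasing offsets, so the start position of every bitstream is known a priori at the receiver. Bits are modelled here as Python lists, and the offset schedule and block contents are illustrative; this is a sketch of the reorganization step only, not a bit-exact EREC implementation.

```python
def erec_pack(blocks):
    """Pack variable-length bit blocks into equal-length slots, EREC-style:
    stage 0 fills each block's own slot; later stages shift leftovers into
    slots with spare capacity at increasing offsets."""
    n = len(blocks)
    slot_len = sum(len(b) for b in blocks) // n
    slots = [[] for _ in range(n)]
    leftover = [list(b) for b in blocks]
    for i in range(n):                          # stage 0: own slot
        take = min(slot_len, len(leftover[i]))
        slots[i], leftover[i] = leftover[i][:take], leftover[i][take:]
    for offset in range(1, n):                  # later stages
        for i in range(n):
            j = (i + offset) % n
            room = slot_len - len(slots[j])
            if room > 0 and leftover[i]:
                slots[j] += leftover[i][:room]
                leftover[i] = leftover[i][room:]
    return slots

packed = erec_pack([[1] * 3, [0] * 9, [1] * 6, [0] * 2])
print([len(s) for s in packed])   # every slot ends up the same length: 5
```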
In this paper, a scheme for robust image coding and transmission over noisy channels is presented. First, the spatial-orientation trees in the wavelet domain are independently encoded using the algorithm of set partitioning in hierarchical trees (SPIHT). We generate multiple bitstreams in order to limit error propagation. Because the self-similarity across subbands is preserved in each wavelet tree, a high coding efficiency is also obtained. However, when these variable-length bitstreams are transmitted sequentially, error impact may still propagate across the boundaries between consecutive bitstreams and cause catastrophic decoding failures. Therefore, we apply error resilient entropy coding (EREC) to reorganize these bitstreams into fixed-length slots so that synchronization of the start of each bitstream is automatically achieved at the receiver. This is particularly useful for progressive data, where bits at the beginning of a bitstream are always more important than those at the end. Finally, to further alleviate the impact of errors at the beginning of the bitstreams, parity bits are added and exploited by an error concealment technique applied only to the lowest frequency subband. Experimental results demonstrate the excellent error resilience of the proposed scheme.
In this paper, we propose a novel combined source and channel coding scheme for image transmission over noisy channels. The main feature of the proposed scheme is a systematic decomposition of image sources so that unequal error protection can be applied according not only to bit error sensitivity but also to visual context importance. The wavelet transform is adopted to hierarchically decompose the image. The association between the wavelet coefficients and what they represent spatially in the original image is fully exploited. Such decomposition generates wavelet blocks that can be classified based on their corresponding image context. The classification produces wavelet trees in each class with similar context and statistics, and therefore enables high-performance source compression using SPIHT. The channel coding assigns unequal error protection to different classes and to different bit planes so that the image transmission scheme is robust in terms of both subjective and objective visual quality. To further improve the quality of the received image, a post-processing method is proposed to restore the degradation caused by residual channel decoding errors. Experimental results show that the proposed scheme performs well for image transmission over noisy channels; in particular, the reconstructed images consistently exhibit better visual quality.
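A toy version of the classification step is sketched below: wavelet blocks are ranked by a simple activity measure (block energy stands in for the paper's context-based classification) and the most active classes are mapped to the strongest channel protection. The class count and per-class RCPC rates are hypothetical.

```python
import numpy as np

# Hypothetical RCPC rates, strongest protection first (one rate per class).
CLASS_RATES = [1/3, 1/2, 2/3]

def classify_blocks(blocks):
    """Rank wavelet blocks by energy and split them into len(CLASS_RATES)
    classes; class 0 (highest activity) gets the strongest protection."""
    n_classes = len(CLASS_RATES)
    energy = np.array([np.sum(np.square(b)) for b in blocks])
    order = np.argsort(-energy)              # most important blocks first
    labels = np.empty(len(blocks), dtype=int)
    for rank, idx in enumerate(order):
        labels[idx] = min(rank * n_classes // len(blocks), n_classes - 1)
    return labels, [CLASS_RATES[l] for l in labels]

blocks = [np.random.default_rng(i).normal(size=(8, 8)) * (i + 1) for i in range(6)]
labels, protection = classify_blocks(blocks)
print(labels.tolist(), protection)
```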