This paper reports on a series of experiments designed to examine the subjective quality of HDTV
encoded video. A common set of video sequences varying in spatial and temporal complexity will be
encoded using different H.264 encoding profiles. Each video clip used in the test will be available in 720p,
1080i and 1080p HDTV formats. Each video clip will then be encoded in CBR mode using H.264 at bitrates
ranging from 1 Mbit/s to 15 Mbit/s, with all other encoding parameters held identical (e.g., the same key frame interval and CABAC entropy coding mode). This approach has been chosen to enable direct comparison across the bitrates and formats most relevant to HDTV broadcast services. Three subjective
tests will be performed, one test for each HDTV format. The single-stimulus subjective test method with
the ACR rating scale will be employed. A total of 15 non-expert subjects will participate in each test. A
different sample of subjects will participate in each test, making 45 subjects in total. The test results will
be available by the end of summer 2009.
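As a point of reference, the encoding step can be reproduced in outline with common open-source tools. The sketch below uses ffmpeg with the x264 encoder, which the paper does not name as its toolchain; the bitrate ladder, GOP length and file names are assumptions for illustration only.

```python
# Hypothetical sketch of the clip-preparation step: each source clip is encoded
# with x264 (via ffmpeg) in CBR mode at several bitrates, keeping the key frame
# interval and CABAC entropy coding identical across conditions. The bitrate
# ladder, GOP length and file names are illustrative assumptions, not the
# paper's actual settings.
import subprocess

BITRATES_MBPS = [1, 2, 4, 6, 8, 10, 15]   # assumed ladder within the 1-15 Mbit/s range
KEYINT = 50                                # assumed fixed key frame interval (frames)

def encode_cbr(src: str, fmt: str) -> None:
    for mbps in BITRATES_MBPS:
        rate = f"{mbps}M"
        out = f"{src}_{fmt}_{mbps}Mbps.ts"
        cmd = [
            "ffmpeg", "-y", "-i", src,
            "-an",                          # video only
            "-c:v", "libx264",
            "-profile:v", "high",
            "-b:v", rate, "-minrate", rate, "-maxrate", rate,
            "-bufsize", f"{2 * mbps}M",
            # identical GOP structure and CABAC entropy coding for every condition
            "-x264-params", f"nal-hrd=cbr:keyint={KEYINT}:min-keyint={KEYINT}:cabac=1",
            out,
        ]
        subprocess.run(cmd, check=True)

encode_cbr("parkrun_1080p.y4m", "1080p")   # hypothetical source clip name
```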
Predicting subjective video quality requires the development and validation of models that accurately estimate perceived quality. For
perceptual quality models, developers have implemented methods that utilise information from both the original and the
processed signals (full reference and reduced reference methods). For many practical applications, no reference (NR)
methods are required. It has been a major challenge for developers to produce no reference methods that attain the
necessary predictive performance for the methods to be deployed by industry. In this paper, we present a comparison between no reference methods that operate on decoded picture information alone and those that use a hybrid bit-stream / decoded-picture analysis approach. Two NR models are introduced: one using decoded picture information only; the other
using a hybrid approach. Validation data obtained from subjective quality tests are used to examine the predictive
performance of both models. The strengths and limitations of the two NR methods are discussed.
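The abstract does not disclose the internals of either model, but the following sketch illustrates the kind of pixel-domain feature a "decoded picture only" NR method might compute: a blockiness estimate comparing luminance discontinuities on 8x8 block boundaries with those inside blocks. It is a generic example, not either of the models evaluated here.

```python
# Illustrative sketch only: a simple pixel-domain feature of the kind a
# "decoded picture only" NR model might use -- a blockiness estimate that
# compares luminance discontinuities on 8x8 block boundaries with those inside
# blocks. Not one of the models evaluated in the paper.
import numpy as np

def blockiness(luma: np.ndarray, block: int = 8) -> float:
    """Ratio of mean horizontal gradient on block boundaries to that elsewhere."""
    diff = np.abs(np.diff(luma.astype(np.float64), axis=1))   # horizontal gradients
    cols = np.arange(diff.shape[1])
    on_boundary = (cols % block) == (block - 1)                # edges between 8x8 blocks
    return diff[:, on_boundary].mean() / (diff[:, ~on_boundary].mean() + 1e-9)

# Example: a frame of pure noise should give a ratio close to 1 (no block structure).
frame = np.random.randint(0, 256, size=(720, 1280))
print(f"blockiness ratio: {blockiness(frame):.3f}")
```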
This paper describes an experiment examining subjective ratings in response to variations in the reproduction quality of a
video signal. Additionally, the test was designed to examine whether pricing affected subjective judgements. Test materials
were created with either constant quality or variable quality where quality was manipulated by reference to the video
frame rate. Subjects were required to provide both quality and acceptability ratings for each test sequence. Two levels of
variable quality were created: one in which the quality varied between medium and high quality (low variability), the
other being variability between low and high quality (high variability). Subjects were assigned to one of three price
bands prior to beginning the test. The test found that, for equivalent average quality sequences, subjects preferred
constant quality to high variability. There was no difference in ratings for constant quality and low variability sequences.
The results indicate that video encoding methods may take advantage of some variation in video quality provided the
perceptual impact of the changes in quality is not marked.
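As a purely hypothetical illustration of the test-material design, the sketch below constructs constant, low-variability and high-variability frame-rate schedules that share the same average frame rate; the specific frame-rate values and segment lengths are assumed, not taken from the paper.

```python
# Hypothetical illustration of constant- and variable-quality conditions with
# the same average quality, where quality is manipulated via frame rate. The
# frame-rate values and segment lengths are assumptions for illustration.
import numpy as np

SEGMENTS = 10  # number of equal-length segments per test sequence

constant = np.full(SEGMENTS, 15.0)                  # constant medium-high rate (fps)
low_var  = np.tile([12.5, 17.5], SEGMENTS // 2)     # medium <-> high
high_var = np.tile([5.0, 25.0], SEGMENTS // 2)      # low <-> high

for name, sched in [("constant", constant), ("low variability", low_var),
                    ("high variability", high_var)]:
    print(f"{name:17s} mean={sched.mean():5.1f} fps  "
          f"range={sched.min():.0f}-{sched.max():.0f} fps")
```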
The Video Quality Experts Group (VQEG) is a group of experts from industry, academia, government and standards
organizations working in the field of video quality assessment. Over the last 10 years, VQEG has focused its efforts on
the evaluation of objective video quality metrics for digital video. Objective video metrics are mathematical models that
predict the picture quality as perceived by an average observer. VQEG has completed validation tests for full reference
objective metrics for the Standard Definition Television (SDTV) format. From this testing, two ITU Recommendations
were produced. This standardization effort is of great relevance to the video industries because objective metrics can be
used for quality control of the video at various stages of the delivery chain.
Currently, VQEG is undertaking several projects in parallel. The most mature project is concerned with objective
measurement of multimedia content. This is probably the largest coordinated video quality testing effort ever undertaken. The project will involve the collection of a very large database of subjective quality data. About 40
subjective assessment experiments and more than 160,000 opinion scores will be collected. These will be used to
validate the proposed objective metrics. This paper describes the test plan for the project, its current status, and one of
the multimedia subjective tests.
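Validation of an objective metric against subjective data is typically summarised with correlation and error statistics between predicted scores and mean opinion scores (MOS). The sketch below shows such an analysis on made-up placeholder data; it is not the VQEG evaluation procedure itself, nor data from the project.

```python
# Sketch of the kind of analysis used to validate an objective metric against
# subjective data: per-sequence predicted scores are compared with mean opinion
# scores (MOS) using correlation and error statistics. The arrays are made-up
# placeholders, not data from the VQEG project.
import numpy as np
from scipy.stats import pearsonr, spearmanr

mos       = np.array([4.2, 3.8, 3.1, 2.5, 1.9, 4.6, 2.8])   # subjective MOS per sequence
predicted = np.array([4.0, 3.9, 3.3, 2.2, 2.1, 4.4, 3.0])   # objective model output

pearson, _  = pearsonr(predicted, mos)     # linearity of the prediction
spearman, _ = spearmanr(predicted, mos)    # monotonicity (rank agreement)
rmse = float(np.sqrt(np.mean((predicted - mos) ** 2)))

print(f"Pearson r = {pearson:.3f}, Spearman rho = {spearman:.3f}, RMSE = {rmse:.3f}")
```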
Traditionally, subjective quality assessments are made in isolation from mediating factors (e.g. interest in content, price).
This approach is useful for determining the pure perceptual quality of content. Recently, there has been a growing
interest in understanding users' quality of experience. To move from perceptual quality assessment to quality of
experience assessment, factors beyond reproduction quality must be considered. From a commercial perspective, content
and price are key determinants of success. This paper investigates the relationship between price and quality. Subjects
selected content that was of interest to them. Subjects were given a budget of ten pounds at the start of the test. When
viewing content, subjects were free to select different levels of quality. The lowest quality was free (and subjects left the
test with ten pounds). The highest quality used up the full budget (and subjects left the test with no money). A range of
pricing tariffs was used in the test. During the test, subjects were allowed to prioritise either quality or price. The test found that subjects prioritised quality over price across all tariff levels. At the higher pricing tariffs, subjects became
more price sensitive. Using data from a number of subjective tests, a utility function describing the relationship between
price and quality was produced.
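The abstract does not give the form of the derived utility function. Purely for illustration, the sketch below fits a simple assumed parametric form (diminishing returns in quality minus a linear price penalty) to invented observations; neither the functional form nor the numbers come from the paper.

```python
# Purely hypothetical sketch: a simple parametric utility of quality and price
# is fitted to made-up observations. The functional form and the data are
# assumptions for illustration, not the utility function reported in the paper.
import numpy as np
from scipy.optimize import curve_fit

def utility(X, a, b):
    quality, price = X
    return a * np.sqrt(quality) - b * price   # assumed: diminishing returns in quality

# Made-up (quality level, price in pounds) -> observed mean ratings
quality = np.array([1, 2, 3, 4, 5, 1, 2, 3, 4, 5], dtype=float)
price   = np.array([0, 2, 4, 6, 10, 0, 1, 2, 3, 5], dtype=float)
rating  = np.array([2.0, 2.6, 2.7, 2.6, 2.0, 2.0, 2.9, 3.3, 3.6, 3.8])

params, _ = curve_fit(utility, (quality, price), rating, p0=(1.0, 0.1))
print("fitted (a, b):", np.round(params, 3))
```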
The perceptual quality of mobile video is affected by many factors, including codec selection, compression rate,
network performance and device characteristics. Given the options associated with generating and transmitting mobile
video, there is an industry requirement for video quality measurement tools to ensure that the best achievable quality is
delivered to customers. International standards bodies are now considering alternative multimedia perceptual quality
methods for mobile video. To evaluate the performance of objective perceptual quality metrics fairly, it is important that subjective testing provides stable and reliable data. This paper describes subjective studies examining the effect of viewing distance on the subjective quality assessment of low-resolution video.
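One way to see why viewing distance matters for low-resolution material is to express the display geometry in pixels per degree of visual angle, which falls as the viewer moves closer. The sketch below computes this for an assumed display and set of distances; the figures are illustrative, not the set-up used in the studies.

```python
# Sketch of why viewing distance matters for low-resolution video: the picture
# subtends fewer pixels per degree of visual angle as the viewer moves closer,
# making coding artefacts easier to see. Display size and distances are
# illustrative assumptions only.
import math

def pixels_per_degree(distance_m: float, screen_height_m: float, v_resolution: int) -> float:
    """Vertical pixels covered by one degree of visual angle at the given distance."""
    pixel_pitch = screen_height_m / v_resolution            # physical height of one pixel
    return 2 * distance_m * math.tan(math.radians(0.5)) / pixel_pitch

# Example: 240-line video shown full-screen on a display 0.30 m tall
for d in (0.3, 0.5, 1.0, 2.0):                               # viewing distances in metres
    ppd = pixels_per_degree(d, screen_height_m=0.30, v_resolution=240)
    print(f"{d:.1f} m -> {ppd:5.1f} pixels/degree, {d / 0.30:.1f} picture heights")
```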
Video quality can be measured using both subjective and objective assessment methods. Subjective experiments are crucially important as they constitute a benchmark for evaluating the performance of objective quality metrics. Subjective quality assessment of television pictures has received extensive attention from experts over
the past decades. On the other hand, emerging applications such as PC-based and mobile video streaming require new subjective test methodologies. Although some recent studies have compared different test methodologies and procedures, most concerned television pictures, and no study of multimedia-type video has validated the repeatability and reliability of the assessment method and experimental procedure. This paper outlines a methodology for conducting subjective evaluation of video quality for multimedia applications in a repeatable and reliable manner across different laboratories. Using video material at low resolution, low bit rate and low frame rate, the same experiment was conducted by two different laboratories: the test material was identical in both experiments, while the laboratory set-up differed (different computers and display panels were used, and viewing distance was not fixed). Results show that the quality ratings obtained in the two experiments are statistically identical. This is an important validation step for the Video Quality Experts Group, which will conduct an ambitious campaign of subjective experiments using many different test laboratories.
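A minimal sketch of how per-clip ratings from the two laboratories might be compared for statistical equivalence is given below, using correlation and a paired test on invented numbers; the paper's own statistical analysis may differ.

```python
# Illustrative sketch: per-clip MOS values from two laboratories are correlated
# and their paired differences tested. The numbers are invented; the paper's
# own analysis may use different statistics.
import numpy as np
from scipy.stats import pearsonr, ttest_rel

mos_lab_a = np.array([4.1, 3.6, 3.0, 2.4, 1.8, 4.5, 2.9, 3.8])
mos_lab_b = np.array([4.0, 3.7, 2.9, 2.5, 1.9, 4.4, 3.1, 3.7])

r, _ = pearsonr(mos_lab_a, mos_lab_b)          # agreement between laboratories
t, p = ttest_rel(mos_lab_a, mos_lab_b)         # paired test of per-clip differences

print(f"Pearson r = {r:.3f}, paired t = {t:.2f}, p = {p:.3f}")
```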