Docsity


Introduction to Broadcasting Media, Lecture notes of Journalism

Teaches the basic principles of broadcasting, mainly TV and radio

Typology: Lecture notes

2018/2019

Uploaded on 08/19/2023

nhlanganiso-arty-kunene 🇸🇿


Introduction to Broadcast Media Tutorial

Cutting to the chase: broadcast media is radio and television. Even amid the pop-culture dominance of the internet, broadcast media still commands the largest share of the advertising pie nationwide. Put audio and visual media to work for you as your company earns larger market share, stronger branding, and increased sales. If you are looking for cost-efficient lead generation, you need to be looking at radio and television advertising.

Not only are radio and television the main media for advertising today, they are continually developing new ways to reach their audiences. The SyFy cable network launched a show ("Defiance") that combines interactions in a video game with the plot of a series. Radio stations are supplementing on-air campaigns with digital media to provide on-air and on-screen promotions to listeners who stream the station through their computers. Multiple studies have shown that combining radio and television can help advertisers reach audiences not achievable with only one medium or the other.

Broadcast Media

 Broadcast television
 Cable television
 On-demand television
 TV/web integration
 Local, network, and national radio
 On-air endorsements
 Long-form programming
 Multi-language programming

The Power of Radio

Radio reaches more Americans than any other advertising medium. As an example, let's look at Los Angeles, CA. It is the #1 radio revenue market in the world and generates more than $1 billion in sales each year. In that market alone, more than 9 million people listen to radio each week. People are loyal to radio and love listening to their favorite DJ or talk-show host. The shows become part of their routines as they drive to and from work, run errands, or take kids to school.
There is probably at least one conversation in your office every day that starts with, "I heard on the radio this morning…" The reason? More adults in L.A. listen to radio in a week than will visit Google+ in a month!

Radio offers a unique way to achieve Top-of-Mind Awareness (TOMA). Because listeners to radio advertising have none of the visual cues they would get from TV or a website, your ad plays in a "theater of the mind". For example, the phrase "a soft pillow" could conjure an image of a white silk pillowcase on a down pillow for one person, while another could be thinking of the cute yellow pillow they had as a child. That openness to interpretation means the quality of your copywriting is vital to success. You have an opportunity to connect with a listener through their own experiences, ideas, and dreams.

The Power of Television

We just mentioned radio's unique power to achieve TOMA. Television advertising, another part of broadcast media, is the most powerful medium currently available for putting your brand at the forefront of your customers' minds. The combination of audio and visual messages allows for a dual delivery of your marketing message.

Television Advertising Choices

There is a huge range of choices when it comes to demographic targeting with television advertising. The most basic is network vs. cable. Attach your brand to the prestige and authority of companies such as ABC, CBS, NBC, or Fox, or take advantage of the huge variety of cable networks that enable you to selectively target viewers based on income, hobbies, ethnicity, favorite sports, gender, sexual orientation, education level, or any combination you may need.

Much has been said about the impact of TiVo/DVR devices and people skipping commercials. Multiple studies have shown that advertising on TV continues to be one of the most effective marketing methods available. Only about 50% of DVR-owning households actually skip commercials.
And many of those who skip have been shown to retain what they see in fast-forward or, most importantly, to see something that catches their attention and go back to watch the full ad.

The newest addition to television advertising success is the multi-screen viewer. Millions of Americans watch TV while also surfing the internet on a desktop, laptop, tablet, or smartphone. These potential customers can see your ad on television and surf immediately over to your website to learn more about your company or product. Conversely, a potential customer can share reactions to their favorite shows on Facebook or Twitter and see your mobile or other online ad appear. A great example of this was the recent airing of "Sharknado" on the SyFy cable network. This B-level movie on a low-tier network generated more than 300,000 live Tweets while it was airing.

How Do Radio and TV Help?

 Telephone broadcasting (1881–1932): the earliest broadcasting medium, beginning with Théâtrophone ("Theatre Phone") systems, which were telephone-based distribution systems allowing subscribers to listen to live opera and theatre performances over telephone lines, created by French inventor Clément Ader in 1881. Telephone broadcasting grew to include telephone newspaper services for news and entertainment programming, introduced in the 1890s and located primarily in large European cities. These telephone-based subscription services were the first examples of electrical/electronic broadcasting and offered a wide variety of programming.

 Radio broadcasting (experimentally from 1906, commercially from 1920): audio signals sent through the air as radio waves from a transmitter, picked up by an antenna and sent to a receiver. Radio stations can be linked in radio networks to broadcast common radio programs, either in broadcast syndication, simulcast, or subchannels.

 Television broadcasting (telecast; experimentally from 1925, commercially from the 1930s): an extension of radio to include video signals.
 Cable radio (also called "cable FM", from 1928) and cable television (from 1932): both via coaxial cable, originally serving principally as transmission media for programming produced at radio or television stations, but later expanding into a broad universe of cable-originated channels.

 Direct-broadcast satellite (DBS, from c. 1974) and satellite radio (from c. 1990): meant for direct-to-home broadcast programming (as opposed to studio network uplinks and downlinks); provides a mix of traditional radio or television broadcast programming, or both, with dedicated satellite radio programming. (See also: satellite television.)

 Webcasting of video/television (from c. 1993) and audio/radio (from c. 1994) streams: offers a mix of traditional radio and television station broadcast programming with dedicated Internet radio and Internet television.

Economic models

There are several means of providing financial support for continuous broadcasting:

 Commercial broadcasting: for-profit, usually privately owned stations, channels, networks, or services providing programming to the public, supported by the sale of air time to advertisers for radio or television advertisements during or in breaks between programs, often in combination with cable or pay-cable subscription fees.

 Public broadcasting: usually non-profit, publicly owned stations or networks supported by license fees, government funds, grants from foundations, corporate underwriting, audience memberships, contributions, or a combination of these.

 Community broadcasting: a form of mass media in which a television or radio station is owned, operated, or programmed by a community group to provide programs of local interest, known as local programming. Community stations are most commonly operated by non-profit groups or cooperatives; in some cases, however, they may be operated by a local college or university, a cable company, or a municipal government.
Broadcasters may rely on a combination of these business models. For example, in the United States, National Public Radio (NPR) and the Public Broadcasting Service (PBS, television) supplement public membership subscriptions and grants with funding from the Corporation for Public Broadcasting (CPB), which is allocated bi-annually by Congress. US public broadcasting corporate and charitable grants are generally given in consideration of underwriting spots, which differ from commercial advertisements in that they are governed by specific FCC restrictions prohibiting the advocacy of a product or a "call to action".

Recorded and live broadcasts

The first regular television broadcasts started in 1937. Broadcasts can be classified as "recorded" or "live". The former allows correcting errors, removing superfluous or undesired material, rearranging it, applying slow motion and repetitions, and other techniques to enhance the program. However, some live events like sports telecasts can include some of these techniques, such as slow-motion clips of important goals or hits, in between the live coverage.

American radio-network broadcasters habitually forbade prerecorded broadcasts in the 1930s and 1940s, requiring radio programs played for the Eastern and Central time zones to be repeated three hours later for the Pacific time zone (see: Effects of time on North American broadcasting). This restriction was dropped for special occasions, as in the case of the German dirigible airship Hindenburg disaster at Lakehurst, New Jersey, in 1937. During World War II, prerecorded broadcasts from war correspondents were allowed on U.S. radio. In addition, American radio programs were recorded for playback by Armed Forces Radio stations around the world.

A disadvantage of recording first is that the public may learn the outcome of an event from another source, which may be a "spoiler".
In addition, prerecording prevents live radio announcers from deviating from an officially approved script, as occurred with propaganda broadcasts from Germany in the 1940s and with Radio Moscow in the 1980s. Many events are advertised as being live, although they are often "recorded live" (sometimes called "live-to-tape"). This is particularly true of performances of musical artists on radio when they visit for an in-studio concert performance. Similar situations have occurred in television production ("The Cosby Show is recorded in front of a live television studio audience") and news broadcasting.

Distribution

A broadcast may be distributed through several physical means. If coming directly from the studio at a single radio or television station, it is simply sent through the studio/transmitter link to the transmitter, and thence from the antenna on the station's mast or tower out to the world. Programming may also come through a communications satellite, played either live or recorded for later transmission. Networks of stations may simulcast the same programming at the same time, originally via microwave link, now usually by satellite.

Distribution to stations or networks may also be through physical media, such as magnetic tape, compact disc (CD), DVD, and sometimes other formats. Usually these are included in another broadcast, such as when electronic news gathering (ENG) returns a story to the station for inclusion in a news programme.

The final leg of broadcast distribution is how the signal gets to the listener or viewer. It may come over the air, as with a radio or television station broadcasting to an antenna and receiver, or may come through cable television or cable radio (or "wireless cable") via the station or directly from a network. The Internet may also bring either internet radio or streaming media television to the recipient, especially with multicasting allowing the signal and bandwidth to be shared.
The term "broadcast network" is often used to distinguish networks that broadcast over-the-air television signals, which can be received using a tuner inside a television set with a television antenna, from so-called networks that are distributed only via cable television (cablecast) or satellite television using a dish antenna. The term "broadcast television" can refer to the television programs of such networks.

Social impact

The sequencing of content in a broadcast is called a schedule. As with all technological endeavors, a number of technical terms and slang have developed; a list of these terms can be found at List of broadcasting terms. Television and radio programs are distributed through radio broadcasting or cable, often both simultaneously. By coding signals and having a cable converter box with decoding equipment in homes, the latter also enables subscription-based channels, pay TV, and pay-per-view services.

In his essay, John Durham Peters wrote that communication is a tool used for dissemination. Peters stated, "Dissemination is a lens—sometimes a usefully distorting one—that helps us tackle basic issues such as interaction, presence, and space and time...on the agenda of any future communication theory in general" (Durham, 211). Dissemination focuses on a message being relayed from one main source to one large audience, without the exchange of dialogue in between. It is possible for the message to be changed or corrupted by government officials once the main source releases it. There is no way to predetermine how the larger population or audience will absorb the message: they can choose to listen, to analyze, or simply to ignore it. Dissemination in communication is widely used in the world of broadcasting. Broadcasting focuses on getting a message out, and it is up to the general public to do what they wish with it. Peters also states that broadcasting is used to address an open-ended destination (Durham, 212).
There are many forms of broadcasting, but they all aim to distribute a signal that will reach the target audience.

Analog television standards

Analog broadcast television systems come in a variety of frame rates and resolutions. Further differences exist in the frequency and modulation of the audio carrier. The monochrome combinations still existing in the 1950s were standardized by the International Telecommunication Union (ITU) as capital letters A through N. When color television was introduced, the hue and saturation information was added to the monochrome signals in a way that black-and-white televisions ignore. In this way backwards compatibility was achieved, and that concept holds for all analog television standards.

There were three standards for the way the additional color information could be encoded and transmitted. The first was the American NTSC (National Television System Committee) color television system. The European/Australian PAL (Phase Alternating Line) and the French/former Soviet Union SECAM (Séquentiel Couleur à Mémoire) standards were developed later and attempt to cure certain defects of the NTSC system. PAL's color encoding is similar to the NTSC system; SECAM, though, uses a different modulation approach than PAL or NTSC. In principle, all three color encoding systems can be combined with any scan line/frame rate combination. Therefore, in order to describe a given signal completely, it is necessary to quote the color system and the broadcast standard as a capital letter.
For example, the United States, Canada, Mexico, and South Korea use NTSC-M (many of these have transitioned, or are transitioning, to digital); Japan used NTSC-J (discontinued in 2012, when Japan transitioned to digital ISDB); the UK used PAL-I (discontinued in 2012, when the UK transitioned to digital DVB-T); France used SECAM-L (discontinued in 2011, when France transitioned to digital DVB-T); much of Western Europe and Australia use PAL-B/G (many of these have transitioned, or are transitioning, to DVB-T as the digital television standard); most of Eastern Europe uses SECAM-D/K or PAL-D/K; and so on.

However, not all of these possible combinations actually exist. NTSC is currently only used with system M, even though there were experiments with NTSC-A (405 line) in the UK and NTSC-N (625 line) in parts of South America. PAL is used with a variety of 625-line standards (B, G, D, K, I, N) but also with the North American 525-line standard, accordingly named PAL-M. Likewise, SECAM is used with a variety of 625-line standards.

For this reason, many people refer to any 625/25 type signal as "PAL" and to any 525/30 signal as "NTSC", even when referring to digital signals, for example on DVD-Video, which does not contain any analog color encoding and thus no PAL or NTSC signals at all. Even though this usage is common, it is misleading, as that is not the original meaning of the terms PAL/SECAM/NTSC.

Although a number of different broadcast television systems were in use worldwide, the same principles of operation apply. In many countries, over-the-air broadcast television of analog audio and analog video signals has been discontinued to allow the re-use of the television broadcast radio spectrum for other services such as datacasting and subchannels.

Displaying an image

A cathode-ray tube (CRT) television displays an image by scanning a beam of electrons across the screen in a pattern of horizontal lines known as a raster.
At the end of each line the beam returns to the start of the next line; at the end of the last line, it returns to the top of the screen. As it passes each point, the intensity of the beam is varied, varying the luminance of that point. A color television system is identical except that an additional signal known as chrominance controls the color of the spot. Raster scanning is shown in a slightly simplified form below.

When analog television was developed, no affordable technology for storing video signals existed; the luminance signal had to be generated and transmitted at the same time at which it is displayed on the CRT. It was therefore essential to keep the raster scanning in the camera (or other device producing the signal) in exact synchronization with the scanning in the television. The physics of the CRT require that a finite time interval be allowed for the spot to move back to the start of the next line (horizontal retrace) or the start of the screen (vertical retrace), and the timing of the luminance signal must allow for this.

The human eye has a characteristic called the phi phenomenon: quickly displaying successive scan images creates the illusion of smooth motion. Flickering of the image can be partially solved using a long-persistence phosphor coating on the CRT, so that successive images fade slowly. However, slow phosphor has the negative side-effect of causing image smearing and blurring when there is a large amount of rapid on-screen motion.

The maximum frame rate depends on the bandwidth of the electronics and the transmission system, and on the number of horizontal scan lines in the image. A frame rate of 25 or 30 hertz is a satisfactory compromise, and the process of interlacing two video fields per frame is used to build the image. This process doubles the apparent number of video images per second and further reduces flicker and other defects in transmission.
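The line counts, field rates, and interlacing arithmetic described above can be tied together in a short sketch. The figures are the well-known nominal values (NTSC's exact 59.94 Hz field rate is rounded to 60 here), and the table is illustrative rather than exhaustive:

```python
# Nominal parameters for some analog standards mentioned above:
# (total scan lines, fields per second).
STANDARDS = {
    "NTSC-M":    (525, 60),  # North America, South Korea
    "PAL-B/G":   (625, 50),  # much of Western Europe, Australia
    "PAL-M":     (525, 60),  # PAL color on the 525-line standard
    "SECAM-L":   (625, 50),  # France (analog, discontinued 2011)
    "SECAM-D/K": (625, 50),  # much of Eastern Europe
}

def frame_rate(name: str) -> float:
    """Two interlaced fields build one frame, so frames/s = fields/s / 2."""
    _, fields = STANDARDS[name]
    return fields / 2

def field_lines(name: str):
    """Interlacing: odd-numbered lines go in one field, even-numbered in the next."""
    lines, _ = STANDARDS[name]
    odd = len(range(1, lines + 1, 2))
    even = len(range(2, lines + 1, 2))
    return odd, even

print(frame_rate("PAL-B/G"))   # 25.0 -> the "625/25" shorthand
print(frame_rate("NTSC-M"))    # 30.0 -> the "525/30" shorthand
print(field_lines("PAL-B/G"))  # (313, 312)
```

This also makes the "625/25 means PAL, 525/30 means NTSC" shorthand concrete: the line/rate pair is a property of the monochrome standard, independent of the color system layered on top.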
Other types of display screens

Plasma screens and LCD screens have been used in analog television sets. These types of display screens use lower voltages than older CRT displays. Many dual-system television receivers, equipped to receive both analog and digital transmissions, have analog tuner receiving capability and must use a television antenna.

Receiving signals

The television system for each country specifies a number of television channels within the UHF or VHF frequency ranges. A channel actually consists of two signals: the picture information is transmitted using amplitude modulation on one frequency, and the sound is transmitted with frequency modulation at a fixed offset (typically 4.5 to 6 MHz) from the picture signal.

The channel frequencies chosen represent a compromise between allowing enough bandwidth for video (and hence satisfactory picture resolution) and allowing enough channels to be packed into the available frequency band. In practice a technique called vestigial sideband is used to reduce the channel spacing, which would be nearly twice the video bandwidth if pure AM were used.

Signal reception is invariably done via a superheterodyne receiver: the first stage is a tuner which selects a television channel and frequency-shifts it to a fixed intermediate frequency (IF). The IF stages then amplify the signal from the microvolt range to fractions of a volt.

Extracting the sound

At this point the IF signal consists of a video carrier signal at one frequency and the sound carrier at a fixed offset. A demodulator recovers the video signal. Also at the output of the same demodulator is a new frequency-modulated sound carrier at the offset frequency. In some sets made before 1948, this was filtered out, and the sound IF of about 22 MHz was sent to an FM demodulator to recover the basic sound signal.
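The fixed picture-sound offset and the superheterodyne IF shift described above are simple arithmetic, sketched below. The 475.25 MHz channel is an invented example; the 38.9 MHz vision IF and 5.5 MHz spacing are the usual System B/G values, not figures taken from this text:

```python
# A superheterodyne tuner shifts whatever channel is selected to a fixed IF.
VISION_IF = 38.9    # MHz, System B/G vision IF (illustrative)
SOUND_OFFSET = 5.5  # MHz picture-to-sound spacing (4.5 MHz for System M)

def local_oscillator(vision_carrier: float) -> float:
    """High-side injection: the LO runs above the carrier by the IF."""
    return vision_carrier + VISION_IF

def mixed_down(rf: float, lo: float) -> float:
    """Difference frequency at the mixer output."""
    return abs(lo - rf)

vision_rf = 475.25                   # hypothetical UHF vision carrier, MHz
sound_rf = vision_rf + SOUND_OFFSET  # its FM sound carrier
lo = local_oscillator(vision_rf)

print(round(mixed_down(vision_rf, lo), 2))  # 38.9 -> fixed vision IF
print(round(mixed_down(sound_rf, lo), 2))   # 33.4 -> sound IF
```

Note that whatever channel is tuned, both carriers land on the same pair of IF frequencies, still exactly one sound offset apart; this is what makes the later intercarrier-sound trick possible.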
In newer sets, this new carrier at the offset frequency was allowed to remain as intercarrier sound, and it was sent to an FM demodulator to recover the basic sound signal. One particular advantage of intercarrier sound is that when the front-panel fine-tuning knob is adjusted, the sound carrier frequency does not change with the tuning but stays at the above-mentioned offset frequency; consequently, it is easier to tune the picture without losing the sound. The FM sound carrier is then demodulated, amplified, and used to drive a loudspeaker. Until the advent of the NICAM and MTS systems, television sound transmissions were invariably monophonic.

Structure of a video signal

The video carrier is demodulated to give a composite video signal containing luminance, chrominance, and synchronization signals; this is identical to the video signal format used by analog video devices such as VCRs or CCTV cameras. Receivers demodulated the chrominance along various pairs of color-difference axes, including:

c) (R-Y) / (B-Y), used in the first color receiver on the market (Westinghouse, not RCA);
d) (R-Y) / (G-Y), as used in the RCA Victor CTC-4 chassis;
e) (R-Y) / (B-Y) / (G-Y);
f) (X) / (Z), as used in many receivers of the late 1950s and throughout the 1960s.

In the end, further matrixing of the above color-difference signals c through f yielded the three color-difference signals (R-Y), (B-Y), and (G-Y).

The R, G, B signals in the receiver needed for the display device (CRT, plasma display, or LCD display) are electronically derived by matrixing as follows: R is the additive combination of (R-Y) with Y, G is the additive combination of (G-Y) with Y, and B is the additive combination of (B-Y) with Y. All of this is accomplished electronically. It can be seen that in the combining process, the low-resolution portions of the Y signals cancel out, leaving R, G, and B signals able to render a low-resolution image in full color.
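The matrixing step just described is plain addition of each color-difference signal back to luminance, which a few lines make concrete (the pixel values are invented for illustration, normalized to a 0–1 range):

```python
def matrix_rgb(y: float, r_y: float, g_y: float, b_y: float):
    """Recover R, G, B by adding each color-difference signal back to
    luminance Y, as the receiver does electronically."""
    return (r_y + y, g_y + y, b_y + y)

# Hypothetical pixel: Y = 0.5 with three color-difference values.
r, g, b = matrix_rgb(0.5, 0.2, -0.1, 0.3)
print(r, g, b)  # 0.7 0.4 0.8
```

Because Y appears identically in all three sums, any component of the color-difference signals that is equal and opposite to Y cancels in the combination, which is the cancellation the text refers to.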
However, the higher-resolution portions of the Y signals do not cancel out, and so are equally present in R, G, and B, producing the higher-definition (higher-resolution) image detail in monochrome, although it appears to the human eye as a full-color, full-resolution picture.

In the NTSC and PAL color systems, U and V are transmitted by using quadrature amplitude modulation of a subcarrier. This kind of modulation applies two independent signals to one subcarrier, with the idea that both signals will be recovered independently at the receiving end. Before transmission, the subcarrier itself is removed from the active (visible) portion of the video and moved, in the form of a burst, to the horizontal blanking portion, which is not directly visible on screen. (More about the burst below.)

For NTSC, the subcarrier is a 3.58 MHz sine wave; for the PAL system it is a 4.43 MHz sine wave. After the above-mentioned quadrature amplitude modulation of the subcarrier, subcarrier sidebands are produced, and the subcarrier itself is filtered out of the visible portion of the video, since it is the subcarrier sidebands that carry all of the U and V information while the subcarrier itself carries none. The resulting subcarrier sidebands are also known as "chroma" or "chrominance".

Physically, this chrominance signal is a 3.58 MHz (NTSC) or 4.43 MHz (PAL) sine wave which, in response to changing U and V values, changes phase as compared to the subcarrier and also changes amplitude. As it turns out, the chroma amplitude (when considered together with the Y signal) represents the approximate saturation of a color, and the chroma phase against the subcarrier as reference approximately represents the hue of the color. For particular test colors found in the test color bar pattern, exact amplitudes and phases are sometimes defined, for test and troubleshooting purposes only.
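A sketch of the quadrature modulation just described: a (U, V) pair produces one sine wave whose amplitude tracks saturation and whose phase against the subcarrier tracks hue. The 0.3/0.4 test values and the assignment of U to the sine axis are illustrative, not taken from the text:

```python
import math

F_SC = 3.58e6  # NTSC subcarrier frequency, Hz (4.43 MHz for PAL)

def chroma(u: float, v: float, t: float) -> float:
    """Quadrature amplitude modulation: U on the sine axis, V on the cosine
    axis of the (suppressed) subcarrier. Axis assignment is illustrative."""
    w = 2 * math.pi * F_SC
    return u * math.sin(w * t) + v * math.cos(w * t)

def saturation_and_hue(u: float, v: float):
    """Chroma amplitude ~ saturation; phase against the subcarrier ~ hue."""
    return math.hypot(u, v), math.degrees(math.atan2(u, v))

amp, hue = saturation_and_hue(0.3, 0.4)
print(round(amp, 3), round(hue, 1))  # 0.5 36.9
```

Changing U and V thus changes only the amplitude and phase of a single sine wave, which is why the signal superficially resembles phase modulation without actually being phase modulation.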
Although, in response to changing U and V values, the chroma sine wave changes phase with respect to the subcarrier, it is not correct to say that the subcarrier is simply "phase modulated". That is because a single sine-wave U test signal with QAM produces only one pair of sidebands, whereas real phase modulation under the same test conditions would produce multiple sets of sidebands occupying more frequency spectrum.

In NTSC, the chrominance sine wave has the same average frequency as the subcarrier frequency. But a spectrum analyzer shows that, for transmitted chrominance, the frequency component at the subcarrier frequency is actually zero energy, verifying that the subcarrier was indeed removed before transmission. These sideband frequencies are within the luminance signal band, which is why they are called "subcarrier" sidebands instead of simply "carrier" sidebands. Their exact frequencies were chosen such that (for NTSC) they are midway between two harmonics of the frame repetition rate, thus ensuring that the majority of the power of the luminance signal does not overlap with the power of the chrominance signal.

In the British PAL (D) system, the actual chrominance center frequency, with equal lower and upper sidebands, is 4.43361875 MHz, a direct multiple of the scan rate frequency. This frequency was chosen to minimize the chrominance beat interference pattern that would be visible in areas of high color saturation in the transmitted picture.

At certain times, the chrominance signal represents only the U signal; 70 nanoseconds (NTSC) later, it represents only the V signal. (This is the nature of the quadrature amplitude modulation process that created the chrominance signal.) About 70 nanoseconds later still it represents -U, and another 70 nanoseconds later, -V.
So to extract U, a synchronous demodulator is utilized, which uses the subcarrier to briefly gate (sample) the chroma every 280 nanoseconds, so that the output is only a train of discrete pulses, each having an amplitude that is the same as the original U signal at the corresponding time. In effect, these pulses are discrete-time analog samples of the U signal. The pulses are then low-pass filtered so that the original analog continuous-time U signal is recovered. For V, a 90-degree-shifted subcarrier briefly gates the chroma signal every 280 nanoseconds, and the rest of the process is identical to that used for the U signal.

Gating at any time other than those mentioned above will yield an additive mixture of any two of U, V, -U, or -V. One of these "off-axis" (that is, off the U and V axes) gating methods is called I/Q demodulation. Another, much more popular, "off-axis" scheme was the X/Z demodulation system; further matrixing recovered the original U and V signals. This was actually the most popular demodulator scheme throughout the 1960s.

The above process uses the subcarrier. But as previously mentioned, the subcarrier was deleted before transmission, and only the chroma is transmitted; therefore, the receiver must reconstitute it. For this purpose, a short burst of the subcarrier, known as the color burst, is transmitted during the back porch (retrace blanking period) of each scan line. A subcarrier oscillator in the receiver locks onto this signal (see phase-locked loop) to achieve a phase reference, producing the reconstituted subcarrier. (A second use of the burst, in more expensive or newer receiver models, is as a reference for an AGC system to compensate for chroma gain imperfections in reception.)

NTSC uses this process unmodified. Unfortunately, this often results in poor color reproduction due to phase errors in the received signal, caused sometimes by multipath, but mostly by poor implementation at the studio end.
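The gating described above is equivalent to multiplying the chroma by the reconstituted subcarrier (and by a 90-degree-shifted copy) and then low-pass filtering; the sketch below uses that multiply-and-average form, with averaging over whole subcarrier cycles standing in for the low-pass filter (sample counts are arbitrary):

```python
import math

F_SC = 3.58e6            # NTSC subcarrier frequency, Hz
W = 2 * math.pi * F_SC

def chroma(u, v, t):
    # Same illustrative QAM convention as before: U on sine, V on cosine.
    return u * math.sin(W * t) + v * math.cos(W * t)

def demodulate(u_true, v_true, cycles=100, steps_per_cycle=64):
    """Multiply chroma by the reconstituted subcarrier and by its 90-degree
    shifted copy, then average over whole cycles (a crude low-pass filter)."""
    n = cycles * steps_per_cycle
    dt = 1 / (F_SC * steps_per_cycle)
    su = sv = 0.0
    for i in range(n):
        t = i * dt
        c = chroma(u_true, v_true, t)
        su += c * math.sin(W * t)  # projects out the U component
        sv += c * math.cos(W * t)  # projects out the V component
    # The average of sin^2 (or cos^2) over whole cycles is 1/2, hence the 2x.
    return 2 * su / n, 2 * sv / n

u, v = demodulate(0.3, -0.2)
print(round(u, 3), round(v, 3))  # 0.3 -0.2
```

The cross terms (sin x cos) average to zero over whole cycles, which is why the two signals sharing one subcarrier come apart cleanly, exactly as the gating argument in the text claims.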
With the advent of solid-state receivers, cable TV, and digital studio equipment for conversion to an over-the-air analog signal, these NTSC problems have been largely fixed, leaving operator error at the studio end as the sole color rendition weakness of the NTSC system. In any case, the PAL D (delay) system mostly corrects these kinds of errors by reversing the phase of the signal on each successive line and averaging the results over pairs of lines. This process is achieved by the use of a 1H (where H = horizontal scan frequency) duration delay line. (A typical circuit used with this device converts the low-frequency color signal to ultrasound and back again.) Phase-shift errors between successive lines are therefore cancelled out, and the wanted signal amplitude is increased when the two in-phase (coincident) signals are recombined.

NTSC is more spectrum-efficient than PAL, giving more picture detail for a given bandwidth. This is because sophisticated comb filters in receivers are more effective with NTSC's 4-field color phase cadence compared to PAL's 8-field cadence. In the end, however, the larger channel width of most PAL systems in Europe still gives those systems the edge in transmitting more picture detail.

In the SECAM television system, U and V are transmitted on alternate lines, using simple frequency modulation of two different color subcarriers. In some analog color CRT displays, starting in 1956, the brightness control signal (luminance) is fed to the cathode connections of the electron guns, and the color-difference signals (chrominance signals) are fed to the control grid connections.

The adjustment took the form of horizontal hold and vertical hold controls, usually on the front panel along with other common controls. These adjusted the free-run frequencies of the corresponding timebase oscillators.
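The PAL delay-line correction described earlier can be checked numerically by treating each line's chroma as a phasor: the phase switch on alternate lines means a constant transmission-path phase error cancels when a line and its delayed, switch-corrected neighbor are averaged. The 30-degree hue and 10-degree path error are invented test values:

```python
import cmath, math

def received(hue_deg: float, phase_error_deg: float, line: int) -> complex:
    """One line's chroma phasor. PAL reverses the V (imaginary) component on
    alternate lines; the path adds the same phase error to every line."""
    c = cmath.rect(1.0, math.radians(hue_deg))
    if line % 2:                 # PAL switch: conjugate on alternate lines
        c = c.conjugate()
    return c * cmath.rect(1.0, math.radians(phase_error_deg))

def pal_average(hue_deg: float, phase_error_deg: float) -> float:
    """Undo the switch on the delayed line, then average the pair (the 1H
    delay line makes both lines available at once). Returns hue in degrees."""
    a = received(hue_deg, phase_error_deg, line=0)
    b = received(hue_deg, phase_error_deg, line=1).conjugate()
    return math.degrees(cmath.phase((a + b) / 2))

# Uncorrected (NTSC-style) reception shows the hue shifted by the full error:
print(round(math.degrees(cmath.phase(received(30, 10, 0))), 1))  # 40.0
# PAL's line-pair average restores the transmitted hue:
print(round(pal_average(30, 10), 1))                             # 30.0
```

The error does not vanish for free: the averaged amplitude is reduced by cos(error), so a large phase error trades a hue shift for a slight desaturation, which the eye tolerates far better.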
By the early 1980s the efficacy of the synchronization circuits, plus the inherent stability of the sets' oscillators, had been improved to the point where these controls were no longer necessary.

Components of a television system

A typical analog monochrome television receiver is based around the block diagram shown below.

Sync separator

(Figure: PAL video signal frames, left to right: a frame with scan lines, overlapping together, with horizontal sync pulses showing as doubled straight horizontal lines; a vertical blanking interval with vertical sync, showing as a brightness increase in the bottom part of the signal near the leftmost part of the interval; an entire frame; another VBI with VSYNC; and the beginning of a third frame.)

Image synchronization is achieved by transmitting negative-going pulses; in a composite video signal of 1 volt amplitude, these are approximately 0.3 V below the "black level". The horizontal sync signal is a single short pulse which indicates the start of every line. Two timing intervals are defined: the front porch, between the end of displayed video and the start of the sync pulse, and the back porch, after the sync pulse and before displayed video. These and the sync pulse itself are called the horizontal blanking (or retrace) interval and represent the time that the electron beam in the CRT is returning to the start of the next display line.

The vertical sync signal is a series of much longer pulses, indicating the start of a new field. The sync pulses occupy the whole line interval of a number of lines at the beginning and end of a scan; no picture information is transmitted during vertical retrace. The pulse sequence is designed to allow horizontal sync to continue during vertical retrace; it also indicates whether each field represents even or odd lines in interlaced systems (depending on whether it begins at the start of a horizontal line or midway through).
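The voltage levels quoted above (a 1 V composite signal with sync tips about 0.3 V below black) can be laid out for one scan line. Segment durations are omitted, and the 0.3 V black level simply follows from the text's figures:

```python
# Nominal levels in a 1 V composite video signal (illustrative):
# sync tips sit approximately 0.3 V below the black level.
WHITE, BLACK, SYNC = 1.0, 0.3, 0.0

def line_segments(video_level: float):
    """One scan line as (segment, level) pairs, in transmission order.
    Only the order and voltage levels are shown, not the durations."""
    return [
        ("front porch",  BLACK),        # blanking before the sync pulse
        ("sync pulse",   SYNC),         # negative-going, 0.3 V below black
        ("back porch",   BLACK),        # blanking after sync (carries the burst)
        ("active video", video_level),  # picture content, black..white
    ]

for name, level in line_segments(WHITE):
    print(f"{name:12s} {level:.1f} V")
```

Because the sync tips are the most negative excursion in the whole signal, a simple level comparator (the sync separator described next) can pick them out reliably regardless of picture content.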
In the television receiver, a sync separator circuit detects the sync voltage levels and sorts the pulses into horizontal and vertical sync. Loss of horizontal synchronization usually resulted in an unwatchable picture; loss of vertical synchronization would produce an image rolling up or down the screen. By counting sync pulses, a video line selector can pick out a particular line from a TV signal; this is used for teletext, on-screen displays, and station identification logos, as well as in industrial applications where cameras are used as sensors.

Timebase circuits

In an analog receiver with a CRT display, sync pulses are fed to horizontal and vertical timebase circuits (commonly called "sweep circuits" in the United States), each consisting of an oscillator and an amplifier. These generate modified sawtooth and parabola current waveforms to scan the electron beam in a linear way. The waveform shapes are necessary to make up for the distance variations from the electron beam source to the screen surface. The oscillators are designed to free-run at frequencies very close to the field and line rates, but the sync pulses cause them to reset at the beginning of each scan line or field, resulting in the necessary synchronization of the beam sweep with the originating signal. The output waveforms from the timebase amplifiers are fed to the horizontal and vertical deflection coils wrapped around the CRT tube. These coils produce magnetic fields proportional to the changing current, and these deflect the electron beam across the screen.

In the 1950s, the power for these circuits was derived directly from the mains supply. A simple circuit consisted of a series voltage-dropper resistance and a rectifier valve (tube) or semiconductor diode. This avoided the cost of a large high-voltage mains supply (50 or 60 Hz) transformer. This type of circuit was used for thermionic valve (vacuum tube) technology. It was inefficient and produced a lot of heat, which led to premature failures in the circuitry.
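The level-sorting step of the sync separator described above is essentially a voltage comparison. The toy sketch below (a deliberately simplified model, not receiver firmware) assumes the 1 V composite convention already given: blanking/black at 0.3 V, sync tips near 0 V, so anything below a slicing threshold of about 0.15 V is sync. A real separator must additionally sort the detected pulses into horizontal and vertical by their duration.

```python
def separate_sync(samples, threshold=0.15):
    """Toy sync separator: flag samples below the slicing threshold.

    Assumes a 1 V composite signal with blanking at 0.3 V and sync
    tips at 0 V.
    """
    return [s < threshold for s in samples]

# Active video (0.3-1.0 V), then a sync pulse near 0 V, then video again.
composite = [0.7, 0.45, 0.3, 0.02, 0.03, 0.02, 0.3, 0.8]
sync_mask = separate_sync(composite)
# -> [False, False, False, True, True, True, False, False]
```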
In the 1960s, semiconductor technology was introduced into timebase circuits. During the late 1960s in the UK, synchronous (with the scan line rate) power generation was introduced into solid state receiver designs. These had very complex circuits in which faults were difficult to trace, but made very efficient use of power. In the early 1970s, thyristor-based switching circuits operating at AC mains (50 or 60 Hz) and line timebase (15,625 Hz) frequencies were introduced. In the UK, use of the simple (50 Hz) types of power circuits was discontinued. The design changes were driven by electricity supply contamination problems arising from EMI, and by supply loading issues due to energy being taken from only the positive half cycle of the mains supply waveform.

CRT flyback power supply design and operation principles

Most of the receiver's circuitry (at least in transistor- or IC-based designs) operates from a comparatively low-voltage DC power supply. However, the anode connection for a cathode-ray tube requires a very high voltage (typically 10–30 kV) for correct operation. This voltage is not directly produced by the main power supply circuitry; instead, the receiver makes use of the circuitry used for horizontal scanning. Direct current (DC) is switched through the line output transformer, and alternating current (AC) is induced into the scan coils. At the end of each horizontal scan line, the magnetic field which has built up in both the transformer and the scan coils by the current is a source of latent electromagnetic energy. This stored collapsing magnetic field energy can be captured. The reverse-flow, short-duration (about 10% of the line scan time) current from both the line output transformer and the horizontal scan coil is discharged again into the primary winding of the flyback transformer by the use of a rectifier which blocks this negative reverse emf. A small-value capacitor is connected across the scan switching device.
This tunes the circuit inductances to resonate at a much higher frequency. This slows down (lengthens) the flyback time from the extremely rapid decay rate that would result if they were electrically isolated during this short period. One of the secondary windings on the flyback transformer then feeds this brief high voltage pulse to a Cockcroft–Walton voltage multiplier. This produces the required final anode voltage.

The actual authorized frequency bands are defined by the ITU and by local regulating agencies such as the Federal Communications Commission (FCC) in the USA.

Broadcast engineering

Broadcast engineering is the field of electrical engineering, and now to some extent computer engineering and information technology, which deals with radio and television broadcasting. Audio engineering and RF engineering are also essential parts of broadcast engineering, being their own subsets of electrical engineering. Broadcast engineering involves both the studio end and the transmitter end (the entire airchain), as well as remote broadcasts. Every station has a broadcast engineer, though one may now serve an entire station group in a city, or be a contract engineer who essentially freelances his or her services to several stations (often in small media markets) as needed.

Duties

Modern duties of a broadcast engineer include maintaining broadcast automation systems for the studio and automatic transmission systems for the transmitter plant. There are also important duties regarding radio towers, which must be maintained with proper lighting and painting. Occasionally a station's engineer must deal with complaints of RF interference, particularly after a station has made changes to its transmission facilities.

Broadcast quality

Broadcast quality is a term stemming from quad videotape to denote the quality achieved by professional video cameras and time base correctors (TBC) used for broadcast television, usually in standard definition.
As the standards for commercial television broadcasts have changed from analog television using analog video to digital television using digital video, the term has generally fallen out of use. Manufacturers have used it to describe both professional and prosumer or "semi-professional" devices. For a camera, the minimum requirements were typically the inclusion of three CCDs, relatively low-compression analog or digital recording capability with little or no chroma subsampling, and the ability to be genlocked. The advantages of three CCDs include better color definition in shadows, better overall low-light sensitivity, and reduced noise when compared to single-CCD systems. With continuing improvements in image sensors, resolution, recording media, and codecs, by 2006 the term no longer carried much weight in the marketplace. The term is also used in its literal sense in broadcasting jargon in judging the fitness of audio or video for broadcast.

Broadcast television systems

Broadcast television systems are encoding or formatting standards for the transmission and reception of terrestrial television signals. There were three main analog television systems in use around the world until the late 2010s: NTSC, PAL, and SECAM. Now, in digital television (DTV), there are four main systems in use around the world: ATSC, DVB, ISDB and DTMB.

Frames

Ignoring color, all television systems work in essentially the same manner. The monochrome image seen by a camera (later, the luminance component of a color image) is divided into horizontal scan lines, some number of which make up a single image or frame. A monochrome image is theoretically continuous, and thus unlimited in horizontal resolution, but to make television practical, a limit had to be placed on the bandwidth of the television signal, which puts an ultimate limit on the horizontal resolution possible. When color was introduced, this bandwidth limit became fixed.
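The link between channel bandwidth and horizontal resolution can be made concrete: one cycle of video signal can carry at most one light/dark pair, so the number of resolvable picture elements per line is twice the bandwidth times the active line time. The figures below are illustrative NTSC-style numbers, not values drawn from this text:

```python
def horizontal_resolution(bandwidth_hz, active_line_s):
    """Resolvable alternating picture elements per line: one video
    cycle carries one light/dark pair, i.e. two elements per cycle."""
    return 2 * bandwidth_hz * active_line_s

# Illustrative figures: 4.2 MHz video bandwidth, ~52.7 us active line.
elements = horizontal_resolution(4.2e6, 52.7e-6)
print(round(elements))   # -> 443 picture elements per line
```

Doubling the bandwidth would double this figure, which is exactly why the bandwidth cap fixed the horizontal resolution of each analog system.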
All analog television systems are interlaced: alternate rows of the frame are transmitted in sequence, followed by the remaining rows in their sequence. Each half of the frame is called a video field, and the rate at which fields are transmitted is one of the fundamental parameters of a video system. It is related to the utility frequency at which the electricity distribution system operates, to avoid flicker resulting from the beat between the television screen deflection system and nearby mains-generated magnetic fields. All digital, or "fixed pixel," displays have progressive scanning and must deinterlace an interlaced source. Use of inexpensive deinterlacing hardware is a typical difference between lower- and higher-priced flat panel displays (plasma display, LCD, etc.).

All films and other filmed material shot at 24 frames per second must be transferred to video frame rates using a telecine in order to prevent severe motion jitter effects. Typically, for 25 frame/s formats (European among other countries with 50 Hz mains supply), the content is sped up slightly ("PAL speedup"), while a technique known as "3:2 pulldown" is used for 30 frame/s formats (North America among other countries with 60 Hz mains supply) to match the film frame rate to the video frame rate without speeding up the playback.

Viewing technology

Analog television signal standards are designed to be displayed on a cathode ray tube (CRT), and so the physics of these devices necessarily controls the format of the video signal. The image on a CRT is painted by a moving beam of electrons which hits a phosphor coating on the front of the tube. This electron beam is steered by a magnetic field generated by powerful electromagnets close to the source of the electron beam. In order to reorient this magnetic steering mechanism, a certain amount of time is required due to the inductance of the magnets; the greater the change, the greater the time it takes for the electron beam to settle in the new spot.
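The "3:2 pulldown" telecine cadence mentioned earlier can be sketched in a few lines. This is a simplified model (it repeats whole frames as fields; a real telecine also interleaves odd and even fields), but it shows the arithmetic: alternating 3 and 2 fields per film frame turns 24 frame/s into exactly 60 fields/s.

```python
def three_two_pulldown(film_frames):
    """Map 24 frame/s film frames to 60 Hz video fields by repeating
    frames in a 3,2,3,2,... field pattern (the classic cadence)."""
    fields = []
    for i, frame in enumerate(film_frames):
        repeat = 3 if i % 2 == 0 else 2
        fields.extend([frame] * repeat)
    return fields

# Four film frames become ten fields: 24 fps * (10/4) = 60 fields/s.
print(three_two_pulldown(["A", "B", "C", "D"]))
# -> ['A', 'A', 'A', 'B', 'B', 'C', 'C', 'C', 'D', 'D']
```

No speed change is needed, which is why 3:2 pulldown (unlike PAL speedup) leaves pitch and running time untouched.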
For this reason, it is necessary to shut off the electron beam (corresponding to a video signal of zero luminance) during the time it takes to reorient the beam from the end of one line to the beginning of the next (horizontal retrace) and from the bottom of the screen to the top (vertical retrace or vertical blanking interval). The horizontal retrace is accounted for in the time allotted to each scan line, but the vertical retrace is accounted for as phantom lines which are never displayed but which are included in the number of lines per frame defined for each video system. Since the electron beam must be turned off in any case, the result is gaps in the television signal, which can be used to transmit other information, such as test signals or color identification signals. The temporal gaps translate into a comb-like frequency spectrum for the signal, where the teeth are spaced at line frequency and concentrate most of the energy; the space between the teeth can be used to insert a color subcarrier.

Hidden signaling

Broadcasters later developed mechanisms to transmit digital information on the phantom lines, used mostly for teletext and closed captioning:

 PALplus uses a hidden signaling scheme to indicate if it exists, and if so what operational mode it is in.
 NTSC has been modified by the Advanced Television Systems Committee to support an anti-ghosting signal that is inserted on a non-visible scan line.
 Teletext uses hidden signaling to transmit information pages.
 NTSC Closed Captioning signaling uses signaling that is nearly identical to teletext signaling.
 Widescreen: All 625-line systems incorporate pulses on line 23 that flag to the display that a 16:9 widescreen image is being broadcast, though this option was not used on later analog transmissions.

Overscan

Television images are unique in that they must incorporate regions of the picture with reasonable-quality content that, by today's standards, will never be seen by some viewers.
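The phantom-line bookkeeping described above is easy to tabulate. The totals are defined by the two common line standards; the active counts below are the usual nominal figures (exact allocation of the blanking lines varies between systems):

```python
# Total lines per frame vs. lines actually carrying picture.
SYSTEMS = {
    "625/50": {"total": 625, "active": 576},
    "525/60": {"total": 525, "active": 480},
}

for name, s in SYSTEMS.items():
    phantom = s["total"] - s["active"]
    print(f"{name}: {phantom} blanking ('phantom') lines per frame")
```

Those few dozen never-displayed lines per frame are exactly the capacity that teletext, closed captioning, and test signals exploit.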
Like the British system A, it was VHF only and remained black & white until its shutdown in 1984 in France and 1985 in Monaco. It was tested with SECAM in the early stages, but later the decision was made to adopt color in 625 lines. Thus France adopted system L on UHF only and abandoned system E. In many parts of the world, analog television broadcasting has been shut down completely or is in the process of shutdown; see Digital television transition for a timeline of the analog shutdown.

List of analog television systems

Pre–World War II systems

A number of experimental and broadcast pre-WW2 systems were tested. The first ones were mechanically based and of very low resolution, sometimes with no sound. Later TV systems were electronic.

 The UK 405-line system was the first to have an allocated ITU System Letter Designation.

ITU standards

At an international conference in Stockholm in 1961, the International Telecommunication Union designated standards for broadcast television systems. Each standard is designated by a letter (A–M); in combination with a color system (NTSC, PAL, SECAM), this completely specifies all of the monaural analog television systems in the world (for example, PAL-B, NTSC-M, etc.). The following table gives the principal characteristics of each standard. Defunct TV systems are shown in grey text; previous ones never designated by the ITU are not shown. Except for lines and frame rates, other units are megahertz (MHz).

Digital television systems

The situation with worldwide digital television is much simpler by comparison. Most digital television systems are based on the MPEG transport stream standard, and use the H.262/MPEG-2 Part 2 video codec. They differ significantly in the details of how the transport stream is converted into a broadcast signal, in the video format prior to encoding (or alternatively, after decoding), and in the audio format.
This has not prevented the creation of an international standard that includes both major systems, even though they are incompatible in almost every respect. The two principal digital broadcasting systems are the ATSC standards, developed by the Advanced Television Systems Committee and adopted as a standard in most of North America, and DVB-T, the Digital Video Broadcasting – Terrestrial system, used in most of the rest of the world. DVB-T was designed for format compatibility with existing direct broadcast satellite services in Europe (which use the DVB-S standard, which also sees some use by direct-to-home satellite dish providers in North America), and there is also a DVB-C version for cable television. While the ATSC standard also includes support for satellite and cable television systems, operators of those systems have chosen other technologies (principally DVB-S or proprietary systems for satellite, and 256QAM replacing VSB for cable). Japan uses a third system, closely related to DVB-T, called ISDB-T, which is compatible with Brazil's SBTVD. The People's Republic of China has developed a fourth system, named DMB-T/H.

Digital television

Digital television (DTV) is the transmission of television signals, including the sound channel, using digital encoding, in contrast to the earlier television technology, analog television, in which the video and audio are carried by analog signals. It is an innovative advance that represents the first significant evolution in television technology since color television in the 1950s. Digital TV makes more economical use of scarce radio spectrum space; it can transmit multiple channels in the same bandwidth occupied by a single channel of analog television, and provides many new features that analog television cannot. A switchover from analog to digital broadcasting began around 2006 in some countries, and many industrial countries have now completed the changeover, while other countries are in various stages of adaptation.
Different digital television broadcasting standards have been adopted in different parts of the world; below are the more widely used standards:

 Digital Video Broadcasting (DVB) uses coded orthogonal frequency-division multiplexing (OFDM) modulation and supports hierarchical transmission. This standard has been adopted in Europe, Singapore, Australia and New Zealand.
 Advanced Television System Committee (ATSC) uses eight-level vestigial sideband (8VSB) for terrestrial broadcasting. This standard has been adopted by six countries: United States, Canada, Mexico, South Korea, Dominican Republic and Honduras.
 Integrated Services Digital Broadcasting (ISDB) is a system designed to provide good reception to fixed receivers and also portable or mobile receivers. It utilizes OFDM and two-dimensional interleaving. It supports hierarchical transmission of up to three layers and uses MPEG-2 video and Advanced Audio Coding. This standard has been adopted in Japan and the Philippines. ISDB-T International is an adaptation of this standard using H.264/MPEG-4 AVC that has been adopted in most of South America and is also being embraced by Portuguese-speaking African countries.
 Digital Terrestrial Multimedia Broadcasting (DTMB) adopts time-domain synchronous (TDS) OFDM technology with a pseudo-random signal frame to serve as the guard interval (GI) of the OFDM block and the training symbol. The DTMB standard has been adopted in the People's Republic of China, including Hong Kong and Macau.
 Digital Multimedia Broadcasting (DMB) is a digital radio transmission technology developed in South Korea as part of the national IT project for sending multimedia such as TV, radio and datacasting to mobile devices such as mobile phones, laptops and GPS navigation systems.

Formats and bandwidth

Digital television supports many different picture formats defined by the broadcast television systems, which are a combination of size and aspect ratio (width to height ratio).
With digital terrestrial television (DTT) broadcasting, the range of formats can be broadly divided into two categories: high-definition television (HDTV) for the transmission of high-definition video, and standard-definition television (SDTV). These terms by themselves are not very precise, and many subtle intermediate cases exist.

Among the HDTV formats that can be transmitted over DTV are 1280 × 720 pixels in progressive scan mode (abbreviated 720p) and 1920 × 1080 pixels in interlaced video mode (1080i). Each of these uses a 16:9 aspect ratio. HDTV cannot be transmitted over analog television channels because of channel capacity issues.

SDTV, by comparison, may use one of several different formats taking the form of various aspect ratios depending on the technology used in the country of broadcast. In terms of rectangular pixels, NTSC countries can deliver a 640 × 480 resolution in 4:3 and 854 × 480 in 16:9, while PAL can give 768 × 576 in 4:3 and 1024 × 576 in 16:9. However, broadcasters may choose to reduce these resolutions to reduce bit rate (e.g., many DVB-T channels in the United Kingdom use a horizontal resolution of 544 or 704 pixels per line).

Each commercial broadcasting terrestrial television DTV channel in North America is permitted to be broadcast at a bit rate up to 19 megabits per second. However, the broadcaster does not need to use this entire bandwidth for just one broadcast channel. Instead the broadcast can use the channel to include PSIP and can also subdivide across several video subchannels (a.k.a. feeds) of varying quality and compression rates, including non-video datacasting services that allow one-way high-bit-rate streaming of data to computers, like National Datacast. Conversely, minimizing artifacts in still backgrounds that may be closely examined in a scene (since time allows).
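The subchannel arithmetic above amounts to budgeting encodes against the roughly 19.39 Mbit/s ATSC channel payload. The multiplex below is hypothetical (the feed bit rates are invented for illustration; only the channel total comes from the text):

```python
def subchannel_budget(total_mbps, subchannels_mbps):
    """Return the bit-rate slack left after allocating subchannels
    within one terrestrial DTV channel."""
    return total_mbps - sum(subchannels_mbps)

# Hypothetical multiplex: one HD feed, two SD feeds, one data service.
slack = subchannel_budget(19.39, [12.0, 3.0, 3.0, 1.0])
print(f"{slack:.2f} Mbit/s left for PSIP and overhead")  # 0.39
```

A negative result would mean the multiplex is oversubscribed and some feed must be re-encoded at a lower rate.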
Broadcast, cable, satellite, and Internet DTV operators control the picture quality of television signal encodes using sophisticated, neuroscience-based algorithms, such as the structural similarity (SSIM) video quality measurement tool, which earned each of its inventors a Primetime Emmy because of its global use. Another tool, called Visual Information Fidelity (VIF), is a top-performing algorithm at the core of the Netflix VMAF video quality monitoring system; Netflix accounts for about 35% of all U.S. bandwidth consumption.

Effects of poor reception

Changes in signal reception from factors such as degrading antenna connections or changing weather conditions may gradually reduce the quality of analog TV. The nature of digital TV results in a perfectly decodable picture initially, until the receiving equipment starts picking up interference that overpowers the desired signal, or the signal becomes too weak to decode. Some equipment will show a garbled picture with significant damage, while other devices may go directly from perfectly decodable video to no video at all, or lock up. This phenomenon is known as the digital cliff effect. For remote locations, distant channels that, as analog signals, were previously usable in a snowy and degraded state may, as digital signals, be perfectly decodable or may become completely unavailable. The use of higher frequencies will add to these problems, especially in cases where a clear line-of-sight from the receiving antenna to the transmitter is not available.

Effect on old analog technology

Television sets with only analog tuners cannot decode digital transmissions. When analog broadcasting over the air ceases, users of sets with analog-only tuners may use other sources of programming (e.g. cable, recorded media) or may purchase set-top converter boxes to tune in the digital signals. In the United States, a government-sponsored coupon was available to offset the cost of an external converter box.
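To make the SSIM idea above concrete, here is a simplified, single-window version of the index computed over two small pixel lists. The published algorithm applies the same formula over local windows and averages the results; the constants follow the commonly used choices (k1 = 0.01, k2 = 0.03 relative to the data range). This sketch is for illustration only, not a production quality monitor:

```python
def ssim_global(x, y, data_range=255.0):
    """Single-window SSIM between two equal-length pixel lists."""
    n = len(x)
    c1 = (0.01 * data_range) ** 2      # stabilizes the luminance term
    c2 = (0.03 * data_range) ** 2      # stabilizes the contrast term
    mx = sum(x) / n
    my = sum(y) / n
    vx = sum((a - mx) ** 2 for a in x) / n
    vy = sum((b - my) ** 2 for b in y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y)) / n
    return ((2 * mx * my + c1) * (2 * cov + c2)) / (
        (mx * mx + my * my + c1) * (vx + vy + c2)
    )

score = ssim_global([10, 200, 30, 90], [10, 200, 30, 90])
# identical signals score exactly 1.0; any distortion lowers the score
```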
Analog switch-off (of full-power stations) took place on December 11, 2006 in the Netherlands; June 12, 2009 in the United States for full-power stations (and later, on September 1, 2016, for Class A stations); July 24, 2011 in Japan; August 31, 2011 in Canada; February 13, 2012 in the Arab states; May 1, 2012 in Germany; October 24, 2012 in the United Kingdom and Ireland; October 31, 2012 in selected Indian cities; and December 10, 2013 in Australia. Completion of analog switch-off is scheduled for December 31, 2017 in the whole of India, December 2018 in Costa Rica, and around 2020 for the Philippines.

Disappearance of TV-audio receivers

Prior to the conversion to digital TV, analog television broadcast audio for TV channels on a separate FM carrier signal from the video signal. This FM audio signal could be heard using standard radios equipped with the appropriate tuning circuits. However, after the transition of many countries to digital TV, no portable radio manufacturer has yet developed an alternative method for portable radios to play just the audio signal of digital TV channels. (DTV radio is not the same thing.)

Environmental issues

The adoption of a broadcast standard incompatible with existing analog receivers has created the problem of large numbers of analog receivers being discarded during the digital television transition. One superintendent of public works was quoted in 2009 as saying, "Some of the studies I've read in the trade magazines say up to a quarter of American households could be throwing a TV out in the next two years following the regulation change". In 2009, an estimated 99 million analog TV receivers were sitting unused in homes in the US alone and, while some obsolete receivers are being retrofitted with converters, many more are simply dumped in landfills where they represent a source of toxic metals such as lead as well as lesser amounts of materials such as barium, cadmium and chromium.
According to one campaign group, a CRT computer monitor or TV contains an average of 8 pounds (3.6 kg) of lead. According to another source, the lead in the glass of a CRT varies from 1.08 lb to 11.28 lb, depending on screen size and type, but the lead is in the form of "stable and immobile" lead oxide mixed into the glass. It is claimed that the lead can have long-term negative effects on the environment if dumped as landfill. However, the glass envelope can be recycled at suitably equipped facilities. Other portions of the receiver may be subject to disposal as hazardous material.

Local restrictions on disposal of these materials vary widely; in some cases second-hand stores have refused to accept working color television receivers for resale due to the increasing costs of disposing of unsold TVs. Those thrift stores which are still accepting donated TVs have reported significant increases in good-condition working used television receivers abandoned by viewers who often expect them not to work after the digital transition. In Michigan in 2009, one recycler estimated that as many as one household in four would dispose of or recycle a TV set in the following year. The digital television transition, the migration to high-definition television receivers, and the replacement of CRTs with flatscreens are all factors in the increasing number of discarded analog CRT-based television receivers.

Narrowcasting

Narrowcasting has traditionally been understood as the dissemination of information (usually via Internet, radio, newspaper, or television) to a narrow audience, not to the broader public at large. Also called niche marketing or target marketing, narrowcasting involves aiming media messages at specific segments of the public defined by values, preferences, demographic attributes, and/or subscription. Narrowcasting is based on the postmodern idea that mass audiences do not exist.
While the first uses of the term appeared within the context of subscription radio programs in the late 1940s, the term first entered the common lexicon due to computer scientist and public broadcasting advocate J. C. R. Licklider, who in a 1967 report envisioned "a multiplicity of television networks aimed at serving the needs of smaller, specialized audiences. 'Here,' stated Licklider, 'I should like to coin the term "narrowcasting," using it to emphasize the rejection or dissolution of the constraints imposed by commitment to a monolithic mass-appeal, broadcast approach.'" The term "narrowcasting" can also apply to the spread of information to an audience (private or public) which is by nature geographically limited—a group such as office employees, military troops, or conference attendees—and requires a localized dissemination of information from a shared source.

Reality television

Reality television is a genre of television programming that documents supposedly unscripted real-life situations, and often features an otherwise

corresponds to high level IF. The AM modulator produces two symmetrical sidebands in the modulated signal. Thus, the IF bandwidth is two times the video bandwidth (i.e. if the VF bandwidth is 4.2 MHz, the IF bandwidth is 8.4 MHz). However, the modulator is followed by a special filter known as a vestigial sideband (VSB) filter. This filter is used to suppress a portion of one sideband, thus reducing the bandwidth. (Since both sidebands contain identical information, this suppression doesn't cause a loss of information.) Although the suppression causes phase delay problems, the VSB stage also includes correction circuits to equalise the phase.

Output stages

The modulated signal is applied to a mixer (also known as a frequency converter). The other input to the mixer, usually produced in a crystal oven oscillator, is known as the subcarrier. The two outputs of the mixer are the sum and the difference of the two signals.
The unwanted signal (usually the sum) is filtered out, and the remaining signal is the radio frequency (RF) signal. The signal is then applied to the amplifier stages. The number of series amplifiers depends on the required output power. The final stage is usually an amplifier consisting of many parallel power transistors, though in older transmitters tetrodes or klystrons were also utilized. In modern solid-state VHF and UHF transmitters, LDMOS power transistors are the device of choice for the output stage, with the latest products employing 50 V LDMOS devices for higher efficiency and power density. Even higher energy efficiency is possible using envelope tracking, which in the broadcast industry is often referred to as 'drain modulation'.

Combining aural and visual signals

There are two methods:

 Split sound system: There are actually two parallel transmitters, one for the aural and one for the visual signal, and the two signals are combined at the output via a high-power combiner. In addition to the combiner, this system requires separate mixers and amplifiers for the aural and visual signals. This is the system used in most high-power applications.
 Intercarrier system: There are two input stages, one for AF and one for VF, but the two signals are combined in low-power IF circuits (i.e., after the modulators). The mixer and the amplifiers are common to both signals, and the system needs no high-power combiner. So both the price of the transmitter and the power consumption are considerably lower than those of a split sound system of the same power level. But the two signals passing through common amplifiers produce some intermodulation products, so the intercarrier system is not suitable for high-power applications, and even at lower-power transmitters a notch filter to reject the cross-modulation products must be used at the output.

The output power

The output power of the transmitter is defined as the power during the sync pulse. (The real output power varies depending on the content.)
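The mixer's sum-and-difference behavior described above is simple frequency arithmetic. The sketch below uses hypothetical example frequencies (a 38.9 MHz IF and a 250 MHz local oscillator, chosen for illustration, not mandated by any standard):

```python
def mixer_products(f_if_mhz, f_lo_mhz):
    """An ideal mixer outputs the sum and difference of its two input
    frequencies; a following filter keeps only the wanted product."""
    return f_lo_mhz + f_if_mhz, abs(f_lo_mhz - f_if_mhz)

# Hypothetical up-conversion: 38.9 MHz IF mixed with a 250 MHz oscillator.
f_sum, f_diff = mixer_products(38.9, 250.0)
# keep the difference (211.1 MHz) as the RF signal; filter out the sum
```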
But the output power of the transmitting equipment and the output power of the antenna are two different quantities. The output power of the antenna is known as the ERP (effective radiated power), which is actually the transmitter power times the antenna gain.

Transposer

In broadcasting, a transposer or translator is a device in or beyond the service area of a radio or television station transmitter that rebroadcasts signals to receivers which cannot properly receive the signals of the transmitter because of a physical obstruction (like a hill). A translator receives the signals of the transmitter and rebroadcasts them to the area of poor reception. Sometimes the translator is also called a relay transmitter, rebroadcast transmitter or transposer. Since translators are used to cover a small shadowed area, their output powers are usually lower than those of the radio or television station transmitters feeding them.
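The ERP relation stated above (transmitter power times antenna gain) is often worked with the gain expressed in decibels. The figures below are illustrative, not drawn from any particular station:

```python
def erp_watts(tx_power_w, antenna_gain_db):
    """Effective radiated power: transmitter output scaled by the
    antenna's power gain, with the gain given in dB."""
    return tx_power_w * 10 ** (antenna_gain_db / 10)

# A 10 kW transmitter into an antenna with 10 dB of gain radiates an
# effective 100 kW toward the service area.
print(erp_watts(10_000, 10))  # -> 100000.0
```

This is why a modest transposer with a high-gain antenna can usefully fill a shadowed valley without approaching the feeding station's transmitter power.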