A brief look at the history of editing: how it evolved from linear to non-linear, and what that evolution implies.
In 1948, three years after the war ended, the first commercial broadcasts of television began in the United States – the medium caught on and exploded during the 50s.
To “record” television, the networks turned to a device called a kinescope – essentially a film camera pointed at a video monitor.
THE ARRIVAL OF TAPE : 1950s
In 1951, engineers working for Bing Crosby’s production company – yes, that Bing Crosby – were the first to record video images onto magnetic tape.
In 1956, after 5 years of hard work by brilliant engineers overcoming a myriad of hurdles, Ampex released the first commercially available video tape recorder, using 2-inch Quadruplex videotape.
THE FIRST VIDEO TAPE RECORDER : 1956
By 1959, videotape was almost fully accepted by the television industry.
CUTTING TAPE : LINEAR EDITING
Editing meant physically cutting the tape with a specialized splicer, which had to cut exactly during a vertical retrace signal without disturbing the odd/even-field ordering.
And of course, you had to do all this without actually seeing what frame you were on, because quadruplex tape was incapable of holding a still frame.
LINEAR EDITING : 1970s
Shows were assembled by copying shots from one video deck to another, in linear fashion – this was linear editing.
But linear editing did nothing to advance the craft creatively. Editing became an almost strictly technical profession. In practice, linear editing made revisions difficult – changes were very hard to undo.
And because a show was assembled in a linear fashion, any change to the beginning of a show meant everything after it had to be reassembled, so there was no such thing as a rough cut.
NON-LINEAR EDITING :
Non-linear editing was almost a rejection of the strict timecode rules of linear editing, a return to the freedom of cutting actual film. Non-linear editing was nondestructive.
There was no generation loss. It was a much more natural way of editing.
First attempts began in 1971 : but the systems were too expensive.
Through the 80s it was really a matter of waiting for computational power and storage capacity to catch up.
In 1988, EMC2 introduced the first all-digital offline non-linear editor, with data stored on optical discs.
1989 : Avid/1 – a Macintosh-based non-linear editor.
As the last part of the 20th century began, computer scientists, programmers, and mathematicians would pioneer the revolution that joined film and television as more or less interchangeable visual mediums.
THE EVOLUTION OF MODERN NON-LINEAR EDITING: PART 2 – THE DIGITAL REVOLUTION
An analog recording looks like the original wave – all the details intact; it is a continuous copy.
A digital recording breaks the wave into chunks called samples, measures the amplitude of the wave at each sample, and stores these measurements as a stream of binary code.
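The sample-and-measure idea above can be sketched in a few lines of Python. This is only an illustration (the `digitize` helper is hypothetical, not part of any real codec): it samples a signal at a fixed rate and quantizes each amplitude to an 8-bit binary string.

```python
import math

def digitize(signal, sample_rate, duration, bit_depth=8):
    """Sample a continuous signal and quantize each sample to binary."""
    levels = 2 ** bit_depth
    samples = []
    n = int(sample_rate * duration)
    for i in range(n):
        t = i / sample_rate
        amp = signal(t)                              # amplitude in [-1.0, 1.0]
        q = round((amp + 1) / 2 * (levels - 1))      # map to 0..levels-1
        samples.append(format(q, f"0{bit_depth}b"))  # store as a binary string
    return samples

# a 1 Hz sine wave sampled 8 times over one second
wave = lambda t: math.sin(2 * math.pi * t)
print(digitize(wave, sample_rate=8, duration=1.0))
```

Real systems use far higher sample rates and bit depths, but the principle – discrete samples, each stored as binary – is exactly this.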
Digital’s key advantage is resistance to noise – introduce noise into an analog signal and you degrade the signal itself.
A digital signal, by contrast, can be read back exactly: no noise, no quality loss.
Digital signals can also be synced up and read by computers, which analog signals cannot. And very importantly for video, patterns can be found in the sequence of 1s and 0s in digital signals, so digital can be compressed – and that is key to making video as ubiquitous as it is today.
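One simple illustration of exploiting patterns in a bit stream is run-length encoding, which collapses runs of repeated bits into counts. Real video codecs are vastly more sophisticated, but this sketch shows the basic principle that repetition equals compressibility:

```python
def run_length_encode(bits):
    """Collapse runs of repeated bits into [bit, count] pairs."""
    encoded = []
    for b in bits:
        if encoded and encoded[-1][0] == b:
            encoded[-1][1] += 1     # extend the current run
        else:
            encoded.append([b, 1])  # start a new run
    return encoded

raw = "0000000011111111110000"
print(run_length_encode(raw))  # [['0', 8], ['1', 10], ['0', 4]]
```

Twenty-two bits become three runs; the more patterned the data, the bigger the savings.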
THE FIRST DIGITAL TAPES : 1986
Digital video recording developed through the late 1970s and into the 80s. The first commercially available digital videotape arrived in 1986 with the Sony D1 video tape recorder.
The Sony D1 would be challenged by Ampex with D2 in 1988, and Panasonic with D3 in 1991.
CHROMA SUBSAMPLING : important because it makes the data smaller – compression.
Video is made of the primary colors Red, Green, and Blue – but storing signals in RGB leads to a lot of redundancy. So the RGB signal is converted to what’s called a YCbCr colorspace: Y stands for Luma, or brightness; Cb is the difference in the blue channel; and Cr is the difference in the red channel.
Now, by separating color from brightness, we can start to compress the color information by reducing the resolution of the Cb and Cr channels.
THE COMPRESSION REVOLUTION
The next challenge was to get the video data even smaller.
1995 saw the introduction of DVD optical discs.
Also in 1995, the DV format was introduced.
DV cameras had IEEE 1394 (FireWire) connections, which meant people could get a perfect digital copy of their video onto their computer without needing specialized hardware to encode the file.
1995 : Armed with relatively inexpensive cameras, filmmakers drove a boom in digital video production.
PRESSURE FROM BELOW
As computers continued getting more powerful and storage cheaper and cheaper, software-based non-linear editors like Adobe Premiere and Media 100 kept nipping at the heels of Avid, forcing the company to constantly lower the price of its system.
Macromedia hired the lead developer of Adobe Premiere, Randy Ubillos, to create a program called “KeyGrip” based on Apple’s QuickTime codecs.
The software was sold in 1998 – the buyer was Steve Jobs, and his company Apple would release it the following year as Final Cut Pro.
HIGH-DEFINITION VIDEO AND FILM SCANNING
The divide between television/video production and film production began to close with the adoption of high definition video production.
1996 : High-definition video had been in development since the 70s, and experiments in HD broadcasting were being conducted in Japan by the late 80s. The first public HD broadcast in the United States occurred on July 23, 1996.
By the mid to late 90s, Hollywood studios were beginning to use DI, or digital intermediates, to create special effects.
So it wasn’t long before Hollywood started asking: can we just skip the whole 35mm film step altogether?
2002 : The first major motion picture shot entirely on digital was Star Wars Episode II: Attack of the Clones, on a pre-production Sony HDW-F900.
It became conceivable to capture straight onto a digital format, edit online – meaning working with the original full-quality files rather than low-quality working files – and even project digital files, all without celluloid film.
The final pieces were incredibly efficient compression techniques like MPEG-4 and H.265, and a powerful data distribution network in broadband internet.
Each step, each advancement, added more and more tools for us filmmakers to realize our dreams.