Inbound Marketing, Brian Halligan & Dharmesh Shah

Great book on getting found using Google, Social Media and Blogs. Another must-read for all entrepreneurs.


Broadcast schedules continue to dominate viewing – Deloitte article

People will continue to consume most audio and video according to a linear broadcast schedule, despite the
availability of on-demand alternatives. Deloitte predicts that through 2010 over 90% of all television
watched and over 80% of all audio content consumed will be from traditional broadcast. It concludes that
advertisers should not necessarily accept the common perception that television audiences are in long-term
decline.
“The sovereignty of the schedule runs counter to many commentators’ expectations,” observes Deloitte in
its Media Predictions 2010. Although many industry executives and their families and friends may already
largely bypass broadcast schedules, these still dominate the mass market.
Average television viewing is between 20 and 30 hours a week in most major markets, compared to 1.5 to 2
hours for all forms of nonlinear viewing, including DVD, DVR, or VOD.
The supremacy of the schedule is put down to “ease of use and inertia”. For some, the ability to choose
what and when to watch or hear is a necessity, but for those who are less constrained by time, choosing
programmes one by one is “tedious and superfluous”.
It is proposed that the availability of on-demand can actually increase the demand for scheduled
programming, as catch-up services encourage people to watch or listen to the next scheduled broadcast.
The emphasis on on-demand and the supposed imminent demise of linear broadcasting are put down to
misinterpretation of market data. This includes self-reported surveys in which respondents tend to reflect
an idealised view rather than their actual viewing and listening habits. This may overstate the use of new
media and new devices, while overlooking hours spent with traditional media, such as listening to the radio
while commuting. Industry metrics for online viewing are also not directly comparable with those for
“It may be that in the long run, the majority of all audio and video consumed will be nonlinear,” concedes
Deloitte, but rather than resenting the “tyranny of the schedule,” hundreds of millions of individuals will
continue to spend at least 40% of their waking hours watching television or listening to the radio in the
traditional manner.
The lesson from this is that consumers do not necessarily embrace the possibilities of new technology and
the behaviours of early adopters do not always become mainstream. However, consumers will value and
pay for the availability of choice and even appear content to purchase devices and subscribe to services
that they hardly ever use.
These and other predictions from Deloitte are consistent with the cautions that informitv has often issued
to media and technology companies that are convinced consumers will want to watch whatever they want,
whenever they want, wherever they want. The irony is that the mass market is actually driven by people
being told what they want, and when and where they can get it, generally by the mass media.
Technologists often tend to overstate the pace of change. This is particularly notable among male, middle-class,
middle-aged executives who by their own admission do not watch much television and appeal to
observations of their teenage children, who may be equally unrepresentative of the population as a whole.
The key to understanding changing consumer behaviour is sociology as much as technology. The reasons
that people watch television or listen to the radio are largely social, to do with a need for connection with a
wider community. While new communications technologies, such as the internet, can fulfil this in new
ways, traditional broadcast media remain remarkably powerful means to inform, educate and entertain, and to
contribute to the narrative and structure of our lives, and there seems little immediate sign of this changing.
Copyright © 2010 informitv
All rights reserved.
ISSN 1759-8796

Glossary of TV 2.0 terms

1.     Asset

Generally the “asset” being managed is collected and stored in a digital format. There is usually a target version, referred to as the “essence”, which is generally the highest-resolution and highest-fidelity representation. The asset is described by its “metadata”.

2.     Digital Asset Management (DAM)

Consists of tasks and decisions surrounding ingesting, annotating, cataloguing, storage and retrieval of digital assets, such as digital photographs, animations, videos and music. Digital asset management systems are computer software and/or hardware systems that aid in the process of digital asset management.

The term “Digital Asset Management” (DAM) also refers to the protocol for downloading, renaming, backing up, rating, grouping, archiving, optimizing, maintaining, thinning, and exporting files.

3.     EDL (Edit Decision List)

A list containing the decisions describing where to edit material, referenced to that material's timecode. EDLs can be produced during an off-line session and passed to the on-line suite to control the conforming of the final edit.

4.     Enterprise Content Management (ECM)

The more recent concept of Enterprise Content Management (ECM) often describes solutions which address similar features but in a wider range of industries or applications. Enterprise-level solutions often involve scalable, reliable, configurable products that can handle vast numbers of assets (files) as well as large numbers of simultaneous users, workflows, or use cases (multiple applications simultaneously operating against the system).

Enterprise systems may, but do not necessarily, include customized products or features added on to the base system or custom developed to match an organization’s workflow. Enterprise class systems are also applicable to small to medium businesses (SMBs), or departments or work groups within an organization. In many cases these systems enter a company in one department and eventually expand to others or the entire enterprise as its utility becomes proven, understood and valued.

5.     Essence

Essence refers to the actual target media file, usually video and audio, but it could also be still pictures, graphics, etc.

6.     Folksonomy

Folksonomy, also known as collaborative tagging, social classification, social indexing and social tagging, is the practice and method of collaboratively creating and managing tags to annotate and categorize content. In contrast to traditional subject indexing, metadata is generated not only by experts but also by creators and consumers of the content. Usually, freely chosen keywords are used instead of a controlled vocabulary. Thus, a folksonomy is a user-generated taxonomy.

Folksonomies became popular on the Web around 2004 with social software applications such as social bookmarking or annotating photographs. Websites that support tagging and the principle of folksonomy are referred to in the context of Web 2.0 because participation is very easy and tagging data is used in new ways to find information. The term folksonomy is also used to denote only the set of tags that are created in social tagging.
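The mechanics can be sketched in a few lines of Python: freely chosen tags from several users (all names and tags here are hypothetical) are aggregated into a weighted, user-generated vocabulary, with no expert-defined controlled vocabulary involved.

```python
from collections import Counter

# Hypothetical tagging data: each user freely chooses keywords for the
# same video clip -- no controlled vocabulary is enforced.
user_tags = {
    "alice": ["news", "election", "tv"],
    "bob": ["politics", "election", "debate"],
    "carol": ["election", "tv", "politics"],
}

def build_folksonomy(tags_by_user):
    """Aggregate everyone's free-form tags into a weighted tag set."""
    counts = Counter()
    for tags in tags_by_user.values():
        counts.update(tags)
    return counts

folksonomy = build_folksonomy(user_tags)
# The most agreed-upon tag emerges from collective behaviour, not an expert index.
print(folksonomy.most_common(1))  # [('election', 3)]
```

The tag weights that emerge this way are what tag clouds and social-bookmarking sites expose to help people find content.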

7.     HDTV – High Definition Television

A television format with a screen aspect ratio of 16:9 and approximately twice the resolution, in both horizontal and vertical dimensions, of existing standard definition television (SDTV). There is no agreement on world HDTV studio standards. In Europe, 1250/50, with its simple relationship to 625/50, is favoured, while in the USA the ATSC describes different picture sizes and frame rates, not all of which are HD. The most talked about of these is 1080i (1080 active lines, interlaced), with some interest in 720p (720 active lines, progressive).

8.     Media Asset Management (MAM)

The term “Media Asset Management” is sometimes used as a sub-category of “Digital Asset Management”, mainly for audio or video content.

9.     Metadata

Metadata (Meta data, or sometimes meta-information) is “data about data”, of any sort in any media. An item of metadata may describe an individual datum, or content item, or a collection of data including multiple content items and hierarchical levels, for example a database schema.

Metadata is the description of the asset, and the description depth can vary depending on the needs of the system, designer, or user. Metadata can describe, but is not limited to: asset content (what is in the package?); the means of encoding/decoding (e.g. JPEG, tar, MPEG-2); provenance (history to the point of capture); ownership; rights of access; and many others. There exist predefined standards and templates for metadata, such as Dublin Core and PBCore. In systems that contain large asset essences, such as MPEG-2 for video and JPEG 2000 for images, there are usually related “proxy” copies of the essence. Both the essence and the proxy copies are described by metadata.
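A minimal sketch in Python of what such a record might look like, with field names loosely modelled on Dublin Core; the identifiers and values are hypothetical, and a real DAM system would enforce a formal schema rather than free-form dictionaries.

```python
# A sketch of metadata for one video asset, field names loosely modelled
# on Dublin Core -- purely illustrative, not a real schema.
asset_metadata = {
    "identifier": "clip-0042",          # unique asset ID (hypothetical)
    "title": "Election Night Coverage",
    "creator": "Newsroom Team",
    "format": "video/mpeg",             # means of encoding/decoding
    "provenance": "ingested from tape 2010-03-01",
    "rights": "internal use only",      # rights of access
}

proxy_metadata = {
    "identifier": "clip-0042-proxy",
    "relation": "clip-0042",            # the proxy points back to the essence
    "format": "video/mp4",              # lower-resolution browse copy
}

def describes(meta, asset_id):
    """True if a metadata record describes (or is a proxy for) the given asset."""
    return meta.get("identifier") == asset_id or meta.get("relation") == asset_id

print(describes(proxy_metadata, "clip-0042"))  # True
```

Note how the proxy carries its own metadata record linked back to the essence, as the entry describes.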

10.           Non-linear editing

Non-linear distinguishes the editing operation from the ‘linear’ methods used with tape. Non-linear refers to not having to edit material in the sequence of the final programme, and it does not involve copying to make edits. It allows any part of the edit to be accessed and modified without having to re-edit or re-copy the material that is already edited and follows that point. The term has been widely used in association with off-line editing systems storing compressed pictures, but on-line non-linear systems are increasingly on offer. Non-compressed systems, which involve no compromise in picture quality, are becoming more widely used.

11.           Off-line editing

The preliminary or rough edit, performed either on a low-cost editing system or remotely via the web, usually using proxies. This allows editors to make decisions and gain necessary approvals before making the more expensive and demanding on-line edit. Since the actual edit session in a professional video facility has been very expensive, it has been traditional to make all editing decisions in advance. This is called off-line editing or off-lining.

12.           On-line editing

The final edit using the original master material to produce the finished high(er) quality piece. An on-line edit suite usually has a full range of high-end video devices which would normally be too expensive to use during an off-line edit session.

13.           Open platform

A system designed around a powerful general purpose computer. The software then determines the functions being performed, e.g. video effects, compositing etc., without the need for many different and dedicated hardware ‘Black Boxes’.

14.           Proxy

A proxy copy is a lower-resolution representation of the essence that can be used as a reference in order to reduce the overall bandwidth requirements of the DAM system infrastructure. It can be generated and retained at the time of ingestion of the asset, simultaneously with or subsequent to the essence, or it can be generated on the fly using transcoders.

15.           Post Production

The process of completing a production, including editing, audio sweetening, colour correction, etc., to a finished master videotape.

16.           Search-oriented architecture (SOA)

This term refers to the use of a search engine as the main integration component in an information system.

In a search-oriented architecture the data tier may be replaced or placed behind another tier which contains a search engine and search engine index which is queried in place of the database management system. Queries from the business tier are made in the search engine query language instead of SQL. The search engine itself crawls the relational database management system in addition to other traditional data sources such as web pages or traditional file systems and consolidates the results when queried.
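A toy Python sketch of this pattern: the “search engine” crawls a record store into an inverted index, and queries from the business tier go to that index instead of to a SQL database. All data and function names are hypothetical.

```python
# Search-oriented architecture in miniature: crawl the data source into an
# inverted index, then query the index in place of the database.
records = {
    1: "election night broadcast coverage",
    2: "cookery show pilot episode",
    3: "election debate highlights",
}

def crawl(source):
    """Build an inverted index mapping each term to the record IDs containing it."""
    index = {}
    for rec_id, text in source.items():
        for term in text.split():
            index.setdefault(term, set()).add(rec_id)
    return index

def search(index, *terms):
    """AND-query in the 'search engine query language': IDs matching every term."""
    sets = [index.get(t, set()) for t in terms]
    return set.intersection(*sets) if sets else set()

index = crawl(records)
print(sorted(search(index, "election")))            # [1, 3]
print(sorted(search(index, "election", "debate")))  # [3]
```

A production system would also consolidate results from other crawled sources (web pages, file systems), as the entry notes.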

17.           Service-oriented architecture (SOA)

Service oriented architecture is a methodology for systems development and integration where functionality is grouped around business processes and packaged as interoperable services. SOA also describes IT infrastructure which allows different applications to exchange data with one another as they participate in business processes. The aim is a loose coupling of services with operating systems, programming languages and other technologies which underlie applications.

SOA separates functions into distinct units, or services, which are made accessible over a network in order that they can be combined and reused in the production of business applications. These services communicate with each other by passing data from one service to another, or by coordinating an activity between two or more services. SOA concepts are often seen as built upon and evolving from older concepts of distributed computing and modular programming.

18.           Simple Object Access Protocol (SOAP )

SOAP is a protocol for exchanging XML-based messages over a computer network using HTTP.  SOAP forms the foundation layer of the Web services stack, providing a basic messaging framework that more abstract layers can build on.  It is an XML based protocol that consists of three parts: an envelope that defines a framework for describing what is in a message and how to process it, a set of encoding rules for expressing instances of application-defined datatypes, and a convention for representing remote procedure calls and responses.
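The envelope-plus-body framing described above can be sketched with Python's standard-library XML module. The GetSchedule call and its payload are hypothetical, not part of any real service; only the envelope namespace is the standard SOAP 1.1 one.

```python
import xml.etree.ElementTree as ET

# Build a minimal SOAP-style message: an Envelope framing a Body that
# carries an application-defined remote procedure call (hypothetical).
SOAP_NS = "http://schemas.xmlsoap.org/soap/envelope/"

envelope = ET.Element(f"{{{SOAP_NS}}}Envelope")
body = ET.SubElement(envelope, f"{{{SOAP_NS}}}Body")
call = ET.SubElement(body, "GetSchedule")           # remote procedure call
ET.SubElement(call, "channel").text = "BBC One"     # application-defined datum

message = ET.tostring(envelope, encoding="unicode")
print(message)
```

In practice the serialized envelope would be POSTed over HTTP to the service endpoint, which is what makes SOAP the messaging foundation of the Web services stack.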

19.           Software as a Service (SaaS)

Software that is hosted (managed and maintained externally) and usually offered as a service or by subscription via the web.

20.           SQL

Structured Query Language is a standard interactive and programming language for querying and modifying data and managing databases. Although SQL is both an ANSI and an ISO standard, many database products support SQL with proprietary extensions to the standard language. The core of SQL is formed by a command language that allows the retrieval, insertion, updating, and deletion of data, and performing management and administrative functions. SQL also includes a Call Level Interface (SQL/CLI) for accessing and managing data and databases remotely.
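The four core operations the entry lists (retrieval, insertion, updating, deletion) can be demonstrated against an in-memory SQLite database from Python; SQLite implements a large subset of standard SQL, and the table and data here are hypothetical.

```python
import sqlite3

# In-memory database: nothing is written to disk.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE assets (id INTEGER PRIMARY KEY, title TEXT)")

conn.execute("INSERT INTO assets (title) VALUES (?)", ("Pilot",))        # insertion
conn.execute("UPDATE assets SET title = ? WHERE id = 1", ("Pilot v2",))  # updating
rows = conn.execute("SELECT title FROM assets").fetchall()               # retrieval
print(rows)  # [('Pilot v2',)]
conn.execute("DELETE FROM assets WHERE id = 1")                          # deletion
```

The `CREATE TABLE` statement illustrates the management/administrative side of the language; the proprietary extensions the entry mentions vary per product and are not shown here.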

21.           Taxonomy

Taxonomies, or taxonomic schemes, are the practice and science of classification; the things they classify are frequently arranged in a hierarchical structure. Typically these are related by type-subtype relationships, also called parent-child relationships. In such a relationship the subtype has by definition the same constraints as the type, plus one or more additional constraints. For example, a car is a type of vehicle: any car is also a vehicle, but not every vehicle is a car. Therefore, a thing needs to satisfy more constraints to be a car than to be a vehicle.
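The car/vehicle example maps naturally onto a type-subtype relationship in code; a minimal Python sketch (the class names follow the glossary's example, the wheel count is an illustrative extra constraint):

```python
# Type-subtype (parent-child) relationship: every Car is a Vehicle, but
# not every Vehicle is a Car, and the subtype adds constraints of its own.
class Vehicle:
    def __init__(self, name):
        self.name = name

class Car(Vehicle):          # subtype: inherits all Vehicle constraints
    def __init__(self, name):
        super().__init__(name)
        self.wheels = 4      # additional constraint specific to cars

bicycle = Vehicle("bicycle")
saloon = Car("saloon")

print(isinstance(saloon, Vehicle))  # True: any car is also a vehicle
print(isinstance(bicycle, Car))     # False: not every vehicle is a car
```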

22.           The Long Tail

The phrase “The Long Tail” was coined by Chris Anderson in an October 2004 Wired magazine article to describe the niche strategy of businesses, such as Netflix, that sell a large number of unique items in relatively small quantities.

The low distribution and inventory costs of these businesses allow them to realize significant profit from selling small volumes of hard-to-find items to many customers, instead of only selling large volumes of a reduced number of popular items. The group of persons that buy the hard-to-find or “non-hit” items is the customer demographic called the Long Tail.

Given a large enough availability of choice, a large population of customers, and negligible stocking and distribution costs, the selection and buying pattern of the population results in a power law distribution curve, or Pareto distribution. This suggests that a market with a high freedom of choice will create a certain degree of inequality by favouring the upper 20% of the items (“hits” or “head”) against the other 80% (“non-hits” or “long tail”).
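A toy simulation of this shape in Python, assuming per-title sales fall off as 1/rank (a simple power law); the catalogue size is arbitrary, but the split between head and tail comes out close to the 80/20 pattern the entry describes.

```python
# Long-tail sketch: sales per title decay as 1/rank (a power law).
N = 1000                                    # catalogue size (hypothetical)
sales = [1000.0 / rank for rank in range(1, N + 1)]

head = sum(sales[: N // 5])                 # top 20% of titles ("hits")
tail = sum(sales[N // 5 :])                 # remaining 80% (the long tail)
total = head + tail

print(f"head share: {head / total:.2f}")
print(f"tail share: {tail / total:.2f}")
```

The point of the long-tail strategy is that, when stocking costs are negligible, that minority tail share is still worth serving because it is spread over vastly more titles.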

23.           Timecode

Timecode is a sequence of numeric codes generated at regular intervals by a timing system. Time codes are used extensively for synchronization and for logging material in recorded media.
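As a sketch, a running frame count can be rendered as a non-drop-frame timecode (hours:minutes:seconds:frames) at a fixed frame rate; 25 fps is assumed here, as used in European broadcast.

```python
# Convert a running frame count to a non-drop-frame timecode string.
def to_timecode(frame_count, fps=25):
    frames = frame_count % fps
    seconds = (frame_count // fps) % 60
    minutes = (frame_count // (fps * 60)) % 60
    hours = frame_count // (fps * 3600)
    return f"{hours:02d}:{minutes:02d}:{seconds:02d}:{frames:02d}"

print(to_timecode(0))      # 00:00:00:00
print(to_timecode(25))     # 00:00:01:00 -- one second at 25 fps
print(to_timecode(90125))  # 01:00:05:00
```

It is exactly such codes, recorded at regular intervals, that EDLs reference when specifying edit points.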

24.           Video Server

A disk-based machine capable of delivering multiple streams of real-time video.

25.           Web 1.0

For the most part, websites between 1994 and 2004 were a strictly one-way publishing medium, similar to the Gopher protocol that came before them. Personal web pages were common in Web 1.0, consisting mainly of static pages hosted on free hosting services such as Geocities and AOL.

26.           Web 2.0

This term describes the changing trend in the use of World Wide Web technology and web design that aims to enhance creativity, information sharing, and, most notably, collaboration among users. These concepts have led to the development and evolution of web-based communities and hosted services, such as social-networking sites, video sharing sites, wikis, blogs, and folksonomies. The term became notable after the first O’Reilly Media Web 2.0 conference in 2004. According to Tim O’Reilly:

“Web 2.0 is the business revolution in the computer industry caused by the move to the Internet as a platform, and an attempt to understand the rules for success on that new platform.”

Web 2.0 is about building applications and services around the unique features of the Internet, as opposed to building applications and expecting the Internet to suit as a platform (effectively “fighting the Internet”). Web 2.0 elevates software above the level of a single device, leveraging the power of the “Long Tail”, with data as a driving force. The architecture is one of participation, in which user-contributed website content creates network effects. Web 2.0 technologies tend to foster innovation in the assembly of systems and sites composed by pulling together features from distributed, independent developers. (This could be seen as a kind of “open source” or possibly “Agile” development process, consistent with an end to the traditional software adoption cycle, typified by the so-called “perpetual beta”.)

27.           Web 3.0

Web 3.0, a phrase coined by John Markoff of the New York Times in 2006, refers to a supposed third generation of Internet-based services that collectively comprise what might be called ‘the intelligent Web’—such as those using semantic web, micro-formats, natural language search, data-mining, machine learning, recommendation agents, and artificial intelligence technologies—which emphasize machine-facilitated understanding of information in order to provide a more productive and intuitive user experience.

Sometimes defined as the third decade of the Web (2010–2020) during which several major complementary technology trends will reach new levels of maturity simultaneously including:

  • transformation of the Web from a network of separately siloed applications and content repositories into a more seamless and interoperable whole;
  • ubiquitous connectivity, broadband adoption, mobile Internet access and mobile devices;
  • network computing, software-as-a-service business models, Web services interoperability, distributed computing, grid computing and cloud computing;
  • open technologies, open APIs and protocols, open data formats, open-source software platforms and open data;
  • open identity, open reputation, roaming portable identity and personal data;
  • the intelligent web, Semantic Web technologies such as semantic application platforms, and statement-based data-stores;
  • distributed databases, the “World Wide Database” (enabled by Semantic Web technologies); and
  • intelligent applications, natural language processing, machine learning, machine reasoning, autonomous agents.

Thanks to Wikipedia for much of the source data.