SYNCHRONIZED MULTIMEDIA PRESENTATION
BACKGROUND
A. Technical Field
[0001] The present invention relates to a system and method for displaying synchronized multimedia presentations on a terminal(s) attached to a network, and more particularly relates to a platform that synchronizes secondary multimedia files to a primary multimedia file within a presentation.
B. Background of the Invention
[0002] Recent advances in technology, as well as physical expansions of network infrastructures, have increased the available bandwidth of a large number of existing networks, including the Internet. This increase in bandwidth has greatly expanded a typical network's capacity to stream large amounts of data to client terminals on a network. Due to this increase in network capacity, service and content providers' ability to simultaneously stream multiple files to a client has vastly increased in recent years. These advancements have also increased the quality and quantity of files that a provider may deliver to a particular client. For example, in the recent past, if a user wanted to view a video clip, the video file needed to be delivered to and saved locally on a corresponding client terminal before the clip could actually be viewed. Currently, however, a provider may deliver content to a client in real time, allowing a user to view a file as it is being streamed (i.e., Webcasts, server-side cached real-time multimedia files).
[0003] Due to the increase in network capacity, many bandwidth considerations that had often constrained software developers in producing new software applications are no longer relevant. As a result, software products requiring higher bandwidth are being produced and made available to customers. These new software
products provide customers with greater control and quality in a variety of network and multimedia applications. Additionally, advances in data compression and file protocols have played a significant role in increasing network utilization. All of these advancements have allowed service providers to effectively stream large multimedia files to a client terminal. Examples of these multimedia files include RealPlayer™ audio/video files, mpeg, avi, and asf files.
[0004] The quality and functionality of HTML-based applications have drastically increased. Many businesses and educational institutions have implemented HTML applications to service customers, employees and students. Examples of these types of applications include commercials, instructions, shopping, and Internet educational courses. Typically, a user will interface with an HTML-based application that will perform some function (i.e., display a product and corresponding price, give a detailed description of the installation of a certain product, display a student's course schedule).
[0005] The quality and functionality of audio/video applications have drastically increased as well, due in large part to the increases in network bandwidth. These video applications typically require large amounts of bandwidth in order to function properly on a corresponding client. Content providers, as well as businesses, provide these video applications to generate income or advertise a certain product. Examples of these high bandwidth video applications include Webcasts and streaming video. Typically, a user will only need to initiate a multimedia video/audio stream to display the file on a client terminal.
[0006] Generally, multimedia files, including both HTML-based and audio/video files, are viewed independently within a single presentation. For example, when a user is viewing a presentation containing streaming video, the display window generally contains only the streaming video file. Also, an individual may give a presentation to a group of people and have a multimedia file (e.g., a Powerpoint™ slide show) streamed from a remote server to a conference room or other location where the presentation is being given. These multimedia files may be streamed to other types of presentations, including Internet classes, sales presentations or on-demand movie/television programs.
[0007] Many businesses are continually trying to improve the quality of their websites relative to those of their competitors. This need for quality is most relevant with businesses
within the online shopping space. An important aspect of these web businesses is the level of interaction a website affords individuals. Additionally, businesses are always trying to improve the visual quality of their websites.
[0008] The use of multiple multimedia files that are concurrently displayed allows a user a higher level of interaction and allows a content provider an increase in the quality and quantity of content presentations available for delivery. Additionally, a content provider may monitor the use of, and receive feedback on, multimedia files when they are viewed concurrently. As a result, the provider is able to offer a quality viewing experience to an individual and at the same time monitor a viewer's reactions and preferences to specific characteristics within the presentation.
[0009] Typically, an individual giving a presentation must control a streaming multimedia file. For example, a person using a slide show must click through each slide or manually play and stop a streaming video. This requirement oftentimes disrupts the flow of a presentation and may distract both the presenter and the audience. Also, the individual may make an error in controlling these multimedia files, resulting in an extended break during which the multimedia presentation must be corrected.
[0010] Although a streamed multimedia file enhances a presentation, the level of interactivity between users is limited and the level of interactivity between files is non-existent. For example, an individual may scroll through a slide presentation or stop a streaming video; however, an individual may not effectively display a multi-file multimedia presentation in which multiple files are concurrently displayed and controlled. This limitation within existing multimedia platforms is caused by a lack of the file synchronization or cooperation needed to display multiple files concurrently.
[0011] Accordingly, it is desirable to provide a platform that displays multiple synchronized multimedia files. Additionally, it is desirable to provide a platform that allows an individual to interact with a multi-file multimedia presentation that is synchronized to a single primary multimedia file.
SUMMARY OF THE INVENTION
[0012] The present invention provides a system and method for synchronizing and displaying multimedia files in a presentation. Specifically, the present invention creates
a platform on which rich content presentations are shown by synchronizing at least one secondary multimedia file to a primary multimedia file.
[0013] The system comprises a first server coupled to a network, a second server coupled to the network, a database coupled to the second server, and at least one client coupled to the network. These networked devices allow multiple files to be transmitted, synchronized and displayed through the network. As a result, a presentation may be stored remotely, transmitted to a client, synchronized locally at the client and displayed on an attached display device accordingly.
[0014] The first server streams a primary multimedia file across the network to the client. This first server may be a video server that streams a video file to the client. This video file is buffered at the client and shown by the display device that is coupled to the client. Specifically, the first multimedia file may be shown in a particular window within the display device. For example, a streaming video may be shown within an ActiveX controlled window within a web browser. Other multimedia files within the presentation are synchronized to this primary multimedia file to provide a rich multi-file presentation.
[0015] The second server transmits a secondary multimedia file across the network to the client. The secondary multimedia file may be an HTML file(s), a graphic file, or a text file. The secondary multimedia file may be pre-fetched from the second server before the presentation begins. In such an instance, the secondary multimedia file is stored locally on the client in a storage device such as a cache or hard disk drive. This pre-fetching increases the available bandwidth at the client for the primary multimedia file. However, pre-fetching the secondary multimedia file is not required; rather, both the primary and secondary multimedia files may be streamed concurrently during the presentation.
[0016] The second server also transmits a synchronization file across the network to the client. The synchronization file contains a plurality of indices that synchronize the secondary multimedia file(s) to the primary multimedia file. For example, the synchronization file may contain a first index that triggers a particular HTML file to be shown when a frame within the primary file is reached. Specifically, as a video is shown in a first window, an HTML picture may be shown in a second window after the 200th frame of the video is shown. This synchronization allows a viewer to see multiple windows showing
synchronized multimedia content during the presentation. This synchronization file may be pre-fetched from the second server before the presentation begins. Additionally, the synchronization file may be processed at the second server allowing the second server to control the synchronization of the presentation. It is important to note that the first server and the second server may be located on the same physical device on the network.
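By way of a non-limiting illustration, the following sketch shows one possible shape for such a synchronization file, expressed as TypeScript types; the field names and the JSON-like layout are assumptions for illustration only and are not prescribed by this description.

    // A minimal sketch of a synchronization file, assuming a JSON-like structure.
    interface SyncIndex {
      frame?: number;        // frame within the primary file that triggers this entry
      timeSec?: number;      // alternatively, a time position within the primary file
      secondaryUrl: string;  // secondary multimedia file to display when triggered
      targetWindow: string;  // window within the browser in which the file is shown
      durationSec?: number;  // optional display duration; otherwise shown until the next index
    }

    interface SyncFile {
      primaryUrl: string;    // primary multimedia file streamed from the first server
      indices: SyncIndex[];  // ordered plurality of synchronization indices
    }

    // Hypothetical example: show an HTML picture in the second window at the 200th frame.
    const exampleSync: SyncFile = {
      primaryUrl: "http://video-server.example/presentation.asf",
      indices: [
        { frame: 200, secondaryUrl: "http://web-server.example/picture.html", targetWindow: "second" },
      ],
    };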
[0017] The client receives these multimedia files from the servers and displays the synchronized multimedia presentation to a viewer. As mentioned above, the presentation may be synchronized at the client according to the synchronization file. In this instance, the client has a certain level of intelligence required to perform these synchronization functions. A processor in the client may access the synchronization file and determine the indices that synchronize the secondary multimedia file to the primary multimedia file. The processor then monitors the primary multimedia file to determine when a particular index is reached (e.g., a particular frame). In response, a secondary multimedia file is displayed in a window on the display device. This process continues until the primary multimedia file is complete and/or all the secondary multimedia files have been shown.
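By way of a non-limiting illustration, the following client-side sketch monitors a primary video file and triggers secondary files at time-based indices, reusing the SyncFile shape sketched above; the showSecondary callback and the use of an HTML video element are assumptions for illustration.

    // Minimal sketch of the client-side synchronization processing described above.
    function attachSynchronizer(
      video: HTMLVideoElement,                           // window showing the primary file
      syncFile: SyncFile,                                // pre-fetched synchronization file
      showSecondary: (url: string, win: string) => void  // displays a secondary file in a window
    ): void {
      let next = 0; // position of the next index to trigger
      video.addEventListener("timeupdate", () => {
        // Trigger every index whose position has been reached by the primary file.
        while (
          next < syncFile.indices.length &&
          (syncFile.indices[next].timeSec ?? Infinity) <= video.currentTime
        ) {
          const idx = syncFile.indices[next];
          showSecondary(idx.secondaryUrl, idx.targetWindow);
          next++;
        }
      });
    }

This process would continue, as described above, until the primary multimedia file completes or all secondary multimedia files have been shown.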
[0018] The display device may comprise a single computer monitor with a web browser having multiple windows. For example, a web browser may have a first window in which the primary multimedia file (e.g., video) is shown. The web browser may have a second window in which the secondary multimedia file(s) is shown. The web browser may have other windows that provide further areas for displaying secondary multimedia files or offer other functions that may be integrated within a presentation. For example, a window within the web browser may be used to provide real-time chatting for a viewer. The viewer may be able to ask questions regarding the presentation or discuss the presentation with other viewers at other locations.
[0019] The present invention also offers a security platform that may be integrated within the presentation. Specifically, a viewer may only have limited or no access to the presentation until fulfilling a security function. For example, a viewer may be required to supply a password in order to access particular presentations. Additionally, a viewer may not be given rights to the presentation file that would allow the user to modify the settings of the presentation. Specifically, a viewer may need to supply a password in
order to adjust the presentation in any manner. Also, the presentation may be secured by allowing a viewer to access the presentation only from a particular location. For example, a viewer may be sent an email containing a hyperlink referring to a particular presentation. Viewing rights within the email may allow the viewer to see the presentation (e.g., only from a particular network address).
[0020] The present invention may also provide viewer interaction features within the presentation platform. Specifically, a presentation may have various interactive features that allow a viewer to select items/topics of interest or respond to questions. Responses to these interactive features may operate as indices within the synchronization file that trigger a particular secondary multimedia file. For example, a viewer may be asked a question regarding a particular item shown in the primary multimedia file. In response to the answer, a specific secondary multimedia file may be shown in a separate window that gives further detail of a particular item or topic. This feature allows a viewer to tailor the presentation by providing requests for more information on particular topics.
[0021] The present invention may also monitor these interactive functions in order to provide valuable information to a service provider or other individual/entity responsible for the presentation. This information is transmitted from the client to a database coupled to the network. This transmission may occur intermittently during the presentation or may occur after the presentation has concluded. Thereafter, the information may be organized and analyzed. For example, a presentation may ask a viewer a question regarding a color of a particular product. The response is then transmitted to a database containing responses from a large number of viewers. A manufacturer may analyze this information in order to determine whether a product should be manufactured in a particular color.
[0022] The features and advantages described in this summary and the following detailed description are not all-inclusive, and particularly, many additional features and advantages will be apparent to one of ordinary skill in the art in view of the drawings, specification, and claims hereof.
BRIEF DESCRIPTION OF THE DRAWINGS
[0023] Fig. 1 is an illustration of a graphical user interface used to display a synchronized multimedia presentation according to an embodiment of the present invention.
[0024] Fig. 2 is an illustration of a system used to deliver the synchronized multimedia presentation to a display device according to an embodiment of the present invention.
[0025] Fig. 3 is a block diagram of a device that may be used to display the synchronized multimedia presentation according to an embodiment of the present invention.
[0026] Fig. 4 is an illustration representing the hierarchical structure of a synchronized multimedia presentation according to an embodiment of the present invention.
[0027] Fig. 5 is an illustration of a storage and control device used for a synchronized multimedia presentation according to an embodiment of the present invention.
[0028] Fig. 6 is a flow diagram of a synchronized multimedia presentation according to an embodiment of the present invention.
[0029] Fig. 7 is a flow diagram of file synchronization within a multimedia presentation according to an embodiment of the present invention.
[0030] Fig. 8 is a flow diagram of a user response storage system implemented within a synchronized multimedia presentation according to an embodiment of the present invention.
[0031] Fig. 9 is a flow diagram of a user interaction monitor implemented within a synchronized multimedia presentation according to an embodiment of the present invention.
[0032] Fig. 10 is a general graphical user interface on which a synchronized multimedia presentation may be initiated and viewed.
[0033] Fig. 11 is an illustration of a synchronized multimedia presentation system having multiple display devices.
DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENTS
[0034] A system and method for displaying, modifying, and creating a presentation containing synchronized, concurrently-displayed files is described. In the following description, for purposes of explanation, numerous specific details are set forth in order to provide a thorough understanding of the invention. It will be apparent, however, to one skilled in the art that the invention can be practiced without these specific details. In other instances, structures and devices are shown in block diagram form in order to avoid obscuring the invention.
[0035] Reference in the specification to "one embodiment" or "an embodiment" means that a particular feature, structure, or characteristic described in connection with the embodiment is included in at least one embodiment of the invention. The appearances of the phrase "in one embodiment" in various places in the specification are not necessarily all referring to the same embodiment.
[0036] Some portions of the detailed description that follows are presented in terms of algorithms and symbolic representations of operations on data bits within a computer memory. These algorithmic descriptions and representations are the means used by those skilled in the data processing arts to most effectively convey the substance of their work to others skilled in the art. An algorithm is here, and generally, conceived to be a self-consistent sequence of steps leading to a desired result. The steps are those requiring physical manipulations of physical quantities. Usually, though not necessarily, these quantities take the form of electrical or magnetic signals capable of being stored, transferred, combined, compared, and otherwise manipulated. It has proven convenient at times, principally for reasons of common usage, to refer to these signals as bits, values, elements, symbols, characters, terms, numbers, or the like.
[0037] It should be borne in mind, however, that all of these and similar terms are to be associated with the appropriate physical quantities and are merely convenient labels applied to these quantities. Unless specifically stated otherwise as apparent from the following discussion, it is appreciated that throughout the description, discussions utilizing terms such as "processing" or "computing" or "retrieving" or "indexing" or "displaying" or the like, refer to the action and processes of a computer system, or similar electronic
computing device, that manipulates and transforms data represented as physical (electronic) quantities within the computer system's registers and memories into other data similarly represented as physical quantities within the computer system memories or registers or other such information storage, transmission or display devices.
[0038] The present invention also relates to an apparatus for performing the operations herein. This apparatus may be specially constructed for the required purposes, or it may comprise a general-purpose computer selectively activated or reconfigured by a computer readable storage medium, such as, but not limited to, any type of disk including floppy disks, optical disks, CD-ROMs, and magneto-optical disks, read-only memories (ROMs), random access memories (RAMs), EPROMs, EEPROMs, magnetic or optical cards, or any other type of media suitable for storing electronic instructions, each coupled to a computer system bus.
[0039] The algorithms and displays presented herein are not inherently related to any particular computer or other apparatus. Various general-purpose systems may be used with programs in accordance with the teachings herein, or it may prove convenient to construct more specialized apparatus to perform the required method steps. The required structure for a variety of these systems will appear from the description below. In addition, the present invention is not described with reference to any particular programming language. It will be appreciated that a variety of programming languages may be used to implement the teachings of the invention as described herein.
[0040] Moreover, the present invention is claimed below as operating or working with a plurality of servers attached to a network. As such, software implementing the present invention may be stored locally on a client terminal or in conjunction with one or a plurality of servers on a network. Additionally, the software may be stored within other storage devices (i.e., database, SAN) connected to the network.
A. Overview of a User Interface for a Synchronized Multimedia Presentation
[0041] The present invention is directed to a system and method for displaying presentations containing synchronized multimedia files. Generally, a user can view the presentation on a display device attached to a network. For example, the present invention provides a viewing system allowing a user to view synchronized presentations on a web
browser 100 after initiating or accessing an appropriate file through a network. Figure 1 shows an embodiment of the present invention displayed within a web browser.
[0042] The web browser 100 is one example of a display device on which the present invention may be shown. Examples of web browsers include Microsoft Internet Explorer and Netscape Navigator. The browser 100 may be partitioned into different windows on which various different types of multimedia files may be shown. For example, within the web browser 100, a first display window 110 displays a primary multimedia file. Generally, this primary multimedia file is a video file (i.e., mpeg, avi, real, asf, mp3, etc.) although it may be any type of multimedia file including those conforming to VoIP and H.323 standards. The size, shape and color quality of the window may all be pre-determined and/or adjusted by the viewer.
[0043] A second display window 120 within the browser 100 displays a secondary multimedia file. The secondary multimedia file may be an HTML based file or any type of file including a video file, a Microsoft Powerpoint™ file, an image file, or a word processing file. The secondary multimedia file may be shown concurrently with the primary multimedia file during a presentation. The secondary multimedia files may also be automatically converted to a standard file type by the platform. For example, a Powerpoint™ slide may be converted to an XML slide to facilitate easier control and viewing of the slide. During the actual presentation, the secondary multimedia file is controlled by a synchronization file.
[0044] The synchronization file synchronizes the display of the secondary multimedia file to the primary file. According to one embodiment, the synchronization file is retrieved from a remote location before the presentation begins. The synchronization file contains at least one index that relates the display of the secondary multimedia file to the primary multimedia file. Specifically, the secondary multimedia file is displayed by triggering the pre-determined indices within the synchronization file as the primary file progresses. An index may relate to a frame or moment of time within the primary multimedia file or any other indicator through which the secondary multimedia file may be controlled. Once the index is realized and the secondary multimedia file is displayed, the secondary multimedia file remains displayed for an interval of time. This interval of time
may be a default time value, relate to the triggering of a subsequent secondary file, or a predetermined amount of time.
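By way of a non-limiting illustration, the following sketch resolves the display interval for a triggered secondary file according to the three options just described, reusing the SyncIndex shape sketched earlier; the default of 30 seconds is an arbitrary assumption.

    const DEFAULT_DISPLAY_SEC = 30; // assumed default time value

    // Minimal sketch: how long a triggered secondary multimedia file remains displayed.
    function displayIntervalSec(current: SyncIndex, following?: SyncIndex): number {
      if (current.durationSec !== undefined) {
        return current.durationSec;                   // pre-determined amount of time
      }
      if (following?.timeSec !== undefined && current.timeSec !== undefined) {
        return following.timeSec - current.timeSec;   // until a subsequent secondary file is triggered
      }
      return DEFAULT_DISPLAY_SEC;                     // default time value
    }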
[0045] Although there are only two display windows displaying multimedia files in this example, it is important to note that according to the present invention there can be numerous windows concurrently displaying numerous multimedia files.
[0046] A text message window 190 displays text to a viewer during the presentation. Like the secondary multimedia files, the text messages may be displayed by triggering a predetermined index within the synchronization file. Once the index is realized and the text message is displayed, the text message remains displayed for an interval of time. This interval of time may be a default time value, relate to the triggering of a subsequent secondary file, or a pre-determined amount of time. Additionally, the text message window 190 may facilitate real-time network chatting that allows a viewer to communicate with another individual through the network. In this instance, this type of chatting function would operate according to the H.323 standard or other standards that allow multiple applications to operate on a network concurrently.
[0047] A presentation indices window 130 displays a summary of indices contained within the presentation. This summary allows the viewer not only to quickly scan the presentation but also to review or jump to specific secondary multimedia files indexed to the primary multimedia file. Additionally, the presentation indices window 130 may be hidden from the user or decreased in size to allow the expansion of other windows (e.g., second display window 120) within the browser 100.
[0048] A presentation control window 135 contains a plurality of controls that allow a viewer to manipulate a presentation. For example, a viewer may play, pause, rewind, fast forward or select continual play from a variety of controls within this window 135. Additionally, a status bar (not shown) may be implemented to show a currently displayed frame or time position relative to the entire video file. A user may also control other viewing options within the presentation by accessing controls such as window size, font, font size, font color, and window arrangement. Also, the presentation control window 135 may contain a counter that counts the number of frames that have been displayed as well as displaying the total number of frames within a certain video file.
[0049] The web browser may also have a thumbnail view option for choosing a video, slide or full thumbnail view. A video thumbnail is a snapshot of the frame at the index point. This snapshot is automatically extracted at the time of authoring. A slide thumbnail is a smaller view of the slide document (e.g., an HTML or MS PowerPoint slide). If a thumbnail cannot be extracted, a generic image/blank is shown. The full thumbnail view may show both the video thumbnail and the corresponding slide thumbnail together for an index point. A thumbnail item may also show the time stamp of the index, the image and the annotation.
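By way of a non-limiting illustration, the following sketch extracts a video thumbnail at an index point during authoring by seeking a video element and drawing the frame onto a canvas; the 160 by 120 pixel size and the PNG encoding are assumptions for illustration.

    // Minimal sketch of capturing the snapshot of the frame at an index point.
    function captureThumbnail(video: HTMLVideoElement, atSec: number): Promise<string> {
      return new Promise((resolve) => {
        video.addEventListener(
          "seeked",
          () => {
            const canvas = document.createElement("canvas");
            canvas.width = 160;
            canvas.height = 120;
            canvas.getContext("2d")!.drawImage(video, 0, 0, canvas.width, canvas.height);
            resolve(canvas.toDataURL("image/png")); // data URL usable as the video thumbnail
          },
          { once: true }
        );
        video.currentTime = atSec; // seek to the index point
      });
    }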
[0050] Having just described an overview of a display device for showing synchronized presentations, the systems and methods used to display the synchronized presentations are described below. First, the system and its components will be described. Second, the software architecture and its various modules will be described. Third, methods of operation of the present invention will be described. Finally, an exemplary display device capable of showing synchronized presentations is described in detail.
B. Synchronized Multimedia Presentation System
[0051] Figure 2 shows a block diagram of a networked system used to provide synchronized multimedia presentations according to an embodiment of the present invention. This system includes a network 210 to which a client 205, a web server 220 and a video server 215 are coupled. A database 225 is coupled to the web server 220 to allow stored data to be transmitted onto the network 210. As previously described, a synchronized multimedia presentation contains a primary multimedia file, for example a video file, stored on the video server 215 attached to the network 210. The synchronized multimedia presentation also contains a secondary multimedia file, for example an HTML-based file, although it may be any type of file including another video file. The secondary multimedia file is indexed to the primary multimedia file by a synchronization file. Both the secondary multimedia file and the synchronization file are stored on the database 225 and are transmitted onto the network 210 via the web server 220. Additionally, the database 225 may be used to store other types of data relating to the presentation; for example, feedback
information regarding a viewer's response to the presentation may be transmitted from the client 205 to the web server 220 and stored in the database 225.
[0052] According to a first embodiment of the present invention, components of a synchronized multimedia presentation are streamed from the video server 215 and the web server 220 to the client 205. In this example, a video file is stored on the video server 215 and streamed to the client 205 during a presentation. Secondary multimedia files are stored in the database 225 and transmitted to the client from the web server 220. Generally, there will be only one secondary multimedia file, typically an HTML-based file. However, multiple secondary multimedia files may be used so that numerous files are synchronized and displayed within the presentation. Additionally, the secondary multimedia files do not need to be HTML-based files but can be any type of file, including video, text, or image files.
[0053] An individual may view the synchronized presentation by logging onto the network 210 and initiating the presentation. For example, the individual may use a uniform resource locator to address the presentation across the network 210 and display it within a web browser. This initiation may include entering a password, opening an email message or clicking on a hyperlink.
[0054] The presentation may be transmitted to the client 205 using various types of methods. For example, the secondary multimedia files may be pre-fetched from the web server 220 and buffered in the client terminal 205 before the presentation begins. Also, the synchronization file, containing indexing information, stored within the database 225 may be pre-fetched and buffered in the client terminal 205. Once these files are buffered, a video file may be streamed to the client terminal 205 from the video server 215. Software, stored either locally on the client 205 or remotely on a server, monitors the video file as it is displayed and synchronizes the secondary multimedia file(s) according to the indexing information within the synchronization file.
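By way of a non-limiting illustration, the following sketch pre-fetches the synchronization file and the secondary multimedia files it references so that only the primary video must be streamed during the presentation, reusing the SyncFile shape sketched earlier; the URL parameter and the reliance on the browser cache as the client's local buffer are assumptions for illustration.

    // Minimal sketch of pre-fetching and buffering presentation components at the client.
    async function prefetchPresentation(syncUrl: string): Promise<SyncFile> {
      const syncFile: SyncFile = await (await fetch(syncUrl)).json();
      // Fetch each secondary multimedia file ahead of time so it is buffered locally.
      await Promise.all(syncFile.indices.map((idx) => fetch(idx.secondaryUrl)));
      return syncFile;
    }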
[0055] The client 205 provides a screen(s) on which the presentation may be shown as well as a processing device and storage device that allow the components of the presentation to be synchronized. It is important to note that the web server 220 and the video server 215 may be located on the same physical device. For example, a single server may operate both as the web server 220 and the video server 215.
[0056] Figure 3 shows an example of a client 205 on which a synchronized multimedia presentation may be shown. The client terminal comprises a control unit 300 coupled, via a bus, to a display 305, a keyboard 310, a cursor control 315, a network controller 320, and an I/O device 325.
[0057] The control unit 300 is typically a personal computer or computing box attached to a network. However, it may also be a personal digital assistant or any other device able to receive, process and display data. In one embodiment, the control unit 300 has an operating system (i.e., Windows, UNIX, etc.) upon which multiple applications operate. The control unit 300 comprises a processor 350, main memory 335, and a data storage device 345, all connected to a bus 330.
[0058] A processor 350 processes data signals and may comprise various computing architectures including a complex instruction set computer (CISC) architecture, a reduced instruction set computer (RISC) architecture, or an architecture implementing a combination of instruction sets. Although only a single processor is shown, multiple processors may be attached.
[0059] Main memory 335 may store instructions and/or data that may be executed by processor 350. The instructions and/or data may comprise code for performing any and/or all of the techniques described herein. Main memory 335 may be a dynamic random access memory (DRAM) device, a static random access memory (SRAM) device, or some other memory device known in the art. The memory 335 preferably includes a web browser 340 of a conventional type that provides access to the Internet and processes HTML, XML or other markup languages to generate images on the display device 305. For example, the web browser 340 could be Netscape Navigator or Microsoft Internet Explorer.
[0060] Data storage device 345 stores data and instructions for processor 350 and may comprise one or more devices including a hard disk drive, a floppy disk drive, a CD-ROM device, a DVD-ROM device, a DVD-RAM device, a DVD-RW device, a flash memory device, or some other mass storage device known in the art.
[0061] System bus 330 represents a shared bus for communicating information and data throughout control unit 300. System bus 330 may represent one or more buses including an industry standard architecture (ISA) bus, a peripheral component interconnect
(PCI) bus, a universal serial bus (USB), or some other bus known in the art to provide similar functionality.
[0062] Additional components coupled to control unit 300 through system bus
330 include display device 305, keyboard 310, cursor control device 315, network controller 320 and audio device 325. Display device 305 represents any device equipped to display electronic images and data as described herein. Display device 305 may be a cathode ray tube (CRT), liquid crystal display (LCD), or any other similarly equipped display device, screen, or monitor. In one embodiment, display device 305 is equipped with a touch screen in which a touch-sensitive, transparent panel covers the screen of display device 305.
[0063] Keyboard 310 represents an alphanumeric input device coupled to control unit 300 to communicate information and command selections to processor 350. Cursor control 315 represents a user input device equipped to communicate positional data as well as command selections to processor 350. Cursor control 315 may include a mouse, a trackball, a stylus, a pen, a touch screen, cursor direction keys, or other mechanisms to cause movement of a cursor. Network controller 320 links control unit 300 to a network that may include multiple processing systems. The network of processing systems may comprise a local area network (LAN), a wide area network (WAN) (e.g., the Internet), and/or any other interconnected data path across which multiple devices may communicate.
[0064] One or more I/O devices 325 are coupled to the system bus 330. For example, the I/O device 325 may be an audio device equipped to receive audio input and transmit audio output. Audio input may be received through various devices including a microphone within audio device 325 and network controller 320. Similarly, audio output may originate from various devices including processor 350 and network controller 320. In one embodiment, audio device 325 is a general purpose audio add-in/expansion card designed for use within a general purpose computer system. Optionally, audio device 325 may contain one or more analog-to-digital or digital-to-analog converters, and/or one or more digital signal processors to facilitate audio processing.
[0065] It should be apparent to one skilled in the art that control unit 300 may include more or fewer components than those shown in Figure 3 without departing from the spirit and scope of the present invention. For example, control unit 300 may include additional memory, such as, for example, a first or second level cache, or one or more application specific integrated circuits (ASICs). Similarly, additional components may be coupled to control unit 300 including, for example, image scanning devices, digital still or video cameras, or other devices that may or may not be equipped to capture and/or download electronic data to control unit 300.
[0066] Figure 4 is a block diagram representing a general hierarchical model of the multiple layers within an embodiment of the present invention. A display composite 400 is generated through the combination of a video layer 410, an HTML layer 420, and a control/synchronization layer 430. The video layer 410 comprises a video file (i.e., primary multimedia file) that is streamed from the video server 215 and displayed in a first display window 110. Specifically, this video is streamed from a video server during the presentation and buffered at a client. The video stream is then processed and displayed in a window on the web browser 100.
[0067] An HTML layer 420 comprises an HTML file(s) that is pre-fetched from the database 225 via the web server 220. The HTML file is stored within the client before the streaming of the video file begins. The HTML file(s) is synchronized to the streaming video and displayed in the second display window 120 within the web browser 100. Note that the HTML files need not be pre-fetched but can be synchronized at the server side and streamed from the server according to the synchronization.
[0068] A control layer 430 controls and synchronizes both the video file and the HTML file. In one example, this control data may be stored within a single synchronization file that is pre-fetched from the database 225 via the web server 220. The control layer 430 manages the display options of the window in which the video is shown and the window in which the HTML file is shown. These display options include the size and brightness of the window, the volume of an audio file, etc. Additionally, the control layer 430 synchronizes the display of both the video and HTML file. As described above, the display of the HTML file(s) may be triggered by a particular index (e.g., time index or frame index) in the video file. If multiple HTML files are used, then each file may have an index to the video file that will trigger its display.
[0069] The composite layer 400 controls the incorporation of the video presentation and HTML presentation within a single browser. The composite layer 400 utilizes a variety of applications to implement this incorporation, including ActiveX and the
underlying operating system (Microsoft Windows™ or Unix™). Thus, the multi-layered presentation provides a user an independently controlled presentation while still allowing a presentation designer the ability to incorporate user interactivity within the presentation.
C. Client-Side Control of Synchronized Presentation
[0070] Figure 5 shows a more detailed drawing of the main memory 335 of the client 205. The main memory generally comprises an operating system 500 whereon a number of applications operate. A bus 330 couples the operating system 500 to the applications module 505. In a preferred embodiment, a video storage module 510 is coupled to the bus 330. The video storage module 510 stores/buffers a primary video file received from the video server 215. The primary video file may be stored locally or streamed in real time from the video server 215 and buffered in the video storage module 510. Examples of a video storage module include a portion of a hard disk drive or a RAM module.
[0071] An HTML storage module 520 is coupled to the bus 330. The HTML storage module 520 stores secondary multimedia files received from the web server 220. These secondary multimedia files may be either converted to an HTML format by an author prior to storage or automatically converted by the HTML storage module 520 to an HTML-based file. For example, a text file may be automatically converted to an HTML file using one of numerous conversion tools known within the art. These secondary multimedia files may be stored locally within the HTML storage module 520, pre-fetched from the web server 220 and buffered in the module 520, or streamed in real time and buffered in the module 520. Examples of an HTML storage module 520 include a portion of a hard disk drive or a RAM module.
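By way of a non-limiting illustration, the following sketch shows one simple automatic conversion of a text file into an HTML-based file before storage; it is not intended to represent any particular conversion tool.

    // Minimal sketch: wrap plain text paragraphs in HTML for the HTML storage module.
    function textToHtml(text: string): string {
      const escapeHtml = (s: string): string =>
        s.replace(/&/g, "&amp;").replace(/</g, "&lt;").replace(/>/g, "&gt;");
      const paragraphs = text
        .split(/\n\s*\n/)
        .map((p) => `<p>${escapeHtml(p)}</p>`);
      return `<html><body>${paragraphs.join("\n")}</body></html>`;
    }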
[0072] A synchronization control module 525 is coupled to the bus 330. The synchronization control module 525 stores a synchronization file containing indexing information and file addressing information created during the authoring of the presentation. Generally, this information is buffered in the synchronization control module 525 before the presentation begins. However, the synchronization control module 525 may receive indexing data during the presentation. The processor 350 accesses this indexing information from the synchronization control module 525 during the presentation in order to
properly synchronize the HTML file(s) to the video file. Examples of a synchronization control module 525 include a portion of a hard disk drive or a RAM module.
[0073] A graphical user interface control module 530 is coupled to a bus 330.
The graphical user interface control module 530 stores graphical display options created by the author during the creation of a presentation. Examples of these options include the size of each of the display windows, the volume of the audio, and the duration a secondary multimedia file is displayed. Once these display options are stored within the module 530, the processor 350 may access this data in order to provide the correct graphics on the display 305. Examples of a graphical control module 530 include a portion of a hard disk drive or a RAM module.
D. Methods for Displaying a Synchronized Multimedia Presentation
[0074] Figure 6 is a flowchart showing a first method for displaying synchronized multimedia files contained within a presentation. In order to display a presentation, an individual must initiate 605 the presentation from the client 205 connected to the network 210. The presentation may be initiated within a browser using a uniform resource locator. After the presentation has been initiated, certain files may be pre-fetched from a server and buffered at the client 205. For example, secondary multimedia files, such as HTML files stored in the database 225, may be pre-fetched 610 and buffered at the client 205 before the presentation begins. Additionally, synchronization data may be pre-fetched 615 from the database 225 and buffered at the client 205 before the presentation begins.
However, it is important to note that the presentation may be displayed without pre-fetching any secondary multimedia files.
[0075] A primary video file is streamed 620 from the video server 215 through the network 210 to the client 205. Synchronization data is used by the processor 350 to synchronize and display 625 the secondary multimedia file(s) in relation to the primary video file. The synchronization data is processed either locally at the client or remotely on a server. In one example, the client terminal 205 processes the synchronization data and displays both files accordingly. In a second example, the synchronization data is processed remotely on a server, either the video server 215 or the web server 220. In this example, both servers
would communicate with each other in order to trigger the transmission and display of secondary multimedia files from the web server 220.
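By way of a non-limiting illustration, the following server-side sketch pushes secondary file references to the client as a reported playback position passes each index; the use of Server-Sent Events, the timer standing in for the position report from the video server, and the single shared index position are assumptions for illustration only.

    import * as http from "http";

    const indices = [{ timeSec: 12, secondaryUrl: "/slides/slide2.html" }]; // from the synchronization file
    let next = 0; // shared for simplicity; a real server would track this per viewer

    http.createServer((req, res) => {
      if (req.url === "/sync-events") {
        res.writeHead(200, { "Content-Type": "text/event-stream", "Cache-Control": "no-cache" });
        // A timer stands in for the playback position reported by the video server or player.
        const timer = setInterval(() => {
          const positionSec = process.uptime(); // placeholder for the reported position
          while (next < indices.length && indices[next].timeSec <= positionSec) {
            res.write(`data: ${JSON.stringify(indices[next])}\n\n`); // triggers display at the client
            next++;
          }
        }, 1000);
        req.on("close", () => clearInterval(timer));
      } else {
        res.writeHead(404);
        res.end();
      }
    }).listen(8080);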
[0076] Figure 7 is a flowchart showing a method for synchronizing the display of a secondary multimedia file to a primary multimedia file. Initially, a primary multimedia file is shown 705 within a display. A processor, within a display device or coupled to the display device, monitors 710 the video file in order to properly identify synchronization indices relating to the secondary multimedia file(s). For example, the progression of frames within a video file may be monitored in order to identify a frame index relating to a secondary multimedia file. Similarly, the time progression of a video file as it is being displayed may be monitored to identify a timing index relating to a secondary multimedia file.
[0077] Once an index is identified, a secondary multimedia file is displayed
720 within another window. For example, a particular HTML file is displayed in a second window when an index at the 200th frame of the video file is reached. This HTML file may continue to be displayed until the next index is reached or for a particular period of time. However, during this process, the video file continues playing 730. This process continues until the entire video is played.
E. Methods for Providing User Interactivity with the Presentation
[0078] Figure 8 shows a flowchart describing a user response storage system that may be implemented within a synchronized multimedia presentation. During the presentation, a user may be prompted by a question or other inquiry 805. For example, a user may be asked what he/she thinks of a certain part of the presentation. This question may be embedded within a secondary multimedia file. The system may wait 810 for a response from the user before continuing on with the presentation or may simply continue the presentation. This response may be a user activating a graphical icon in the presentation or an audio response that is recorded by the display device.
[0079] After the user responds, the system transmits the response to a storage system either locally or remotely on an attached network. For example, the response may be transmitted 815 across the network 210 to the web server 220 whereupon it is stored 820 in the database 225. In another embodiment, the response information may be stored in a
different location on the network. In any event, this response information may then be accessed 825 from the database 225 and analyzed 830.
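By way of a non-limiting illustration, the following sketch transmits a viewer's response to the web server for storage in the attached database; the /api/responses endpoint and the payload fields are assumptions for illustration.

    // Minimal sketch of transmitting one user response for later storage and analysis.
    async function submitResponse(
      presentationId: string,
      questionId: string,
      answer: string
    ): Promise<void> {
      await fetch("http://web-server.example/api/responses", {
        method: "POST",
        headers: { "Content-Type": "application/json" },
        body: JSON.stringify({ presentationId, questionId, answer, sentAt: Date.now() }),
      });
    }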
[0080] Figure 9 illustrates a method for monitoring interactive functions that may be embedded 905 within the primary video file. Interactive functions include pausing, stopping, or rewinding the video file. Also, more complex interactive functions may be embedded in the primary video that allow a user to identify or activate points of interest within the primary video. For example, a user may be able to use a mouse to click on an item shown within the primary video to identify the item as something of interest. This click may then initiate a secondary multimedia file in a different window that describes the item of interest in more detail. This interactive primary video display also allows a viewer to interact with the video by clicking on certain frames, by editing a video or multimedia file, or in other ways. This interaction is monitored 910 by a client (e.g., 205) on which the presentation is shown and is stored locally. Next, the client transmits 915 the interaction data through an attached network to a web server. This data may be transmitted during the presentation, or stored locally and transmitted at the end of the presentation.
[0081] The interaction data is then stored 920 in a storage device that is coupled to the network. For example, this interaction data may be transmitted to the web server 220 and stored in the attached database 225. Thereafter, if a provider or a business wants to analyze the stored data, the interaction data may be retrieved 925 from the database and subsequently analyzed 930.
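By way of a non-limiting illustration, the following sketch handles a click on an item shown within the primary video, triggers a secondary file with further detail, and buffers the interaction data locally until it is transmitted at the end of the presentation; the lookupItem callback and the /api/interactions endpoint are assumptions for illustration.

    // Minimal sketch of monitoring one interactive function embedded in the primary video.
    const pendingInteractions: Array<{ item: string; timeSec: number; clickedAt: number }> = [];

    function onVideoClick(
      video: HTMLVideoElement,
      event: MouseEvent,
      lookupItem: (x: number, y: number, t: number) => string | null, // maps a click to an item of interest
      showSecondary: (url: string, win: string) => void
    ): void {
      const item = lookupItem(event.offsetX, event.offsetY, video.currentTime);
      if (!item) return;
      showSecondary(`http://web-server.example/detail/${item}.html`, "second"); // further detail window
      pendingInteractions.push({ item, timeSec: video.currentTime, clickedAt: Date.now() });
    }

    // Transmit the buffered interaction data at the end of the presentation.
    async function flushInteractions(): Promise<void> {
      if (pendingInteractions.length === 0) return;
      await fetch("http://web-server.example/api/interactions", {
        method: "POST",
        headers: { "Content-Type": "application/json" },
        body: JSON.stringify(pendingInteractions),
      });
      pendingInteractions.length = 0;
    }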
F. Embodiment of Other User Interfaces for Initiating a Synchronized Multimedia Presentation
[0082] Figure 10 illustrates a browser window 1000 for accessing a synchronized multimedia presentation. An addressing window 1010 may be used to access the presentation by a specific uniform resource locator via a web page. To view the web page, the user uses an Internet browser 1000 such as Microsoft Internet Explorer™ or Netscape Navigator™. According to this example, the user is given a variety of choices, one of which is to see a list of presentations contained within multiple directories within the presentation view window 1040. Access to this list may be password protected. Accordingly, a presentation security window 1030 within the browser 1000 may require a
password from a user in order to gain access to the presentation directories or open a particular presentation.
[0083] If the user knows which file to view, then the user may select a "view a presentation" icon (not shown) within the presentation view window 1040. Once the icon is selected, the user may input the name of the presentation or select the presentation from a default home directory. Once a specific presentation is selected, the presentation will be displayed within the browser or a different browser.
[0084] A user may modify settings of the presentation using a presentation control window 1050 within the browser 1000. For example, controls such as various window sizes within the presentation, ActiveX specific controls, brightness, video rate, etc. may be adjusted within this window 1050 prior to displaying the presentation. This functionality allows a presenter to tune the presentation before it is given. The presentation controls may be adjusted locally where the presentation is given or adjusted remotely at a terminal coupled to the network.
[0085] A user may also initiate a presentation by opening an email and clicking on a hyperlink. The hyperlink will take the browser directly to a URL address where the presentation will automatically begin. This emailing operation allows an author to invite individuals to view a presentation by merely sending out an email containing a hyperlink to a corresponding URL address. Security features may be embedded into the presentation, or into the platform on which the presentation operates, that allow the presentation to be initiated only by a single IP address or multiple IP addresses corresponding to invitation emails.
G. Multi-Display Synchronized Multimedia Presentation
[0086] Figure 11 illustrates a system on which a single presentation may be transmitted to multiple display devices coupled to a network. As previously discussed, the web server 220 and video server 215 transmit multiple multimedia files that are synchronized and incorporated within a presentation. According to one embodiment, the presentation may be shown on multiple display devices. For example, a primary multimedia file, a secondary multimedia file and a synchronization file may be transmitted to a client 1100 coupled to the network 210. The client 1100 comprises a storage device and processing unit that allows the different multimedia files to be synchronized and
incorporated into a single presentation. Once these multimedia files have been appropriately processed, files are transmitted to an array of display devices on which the presentation is shown.
[0087] A first display device 1105 is coupled to the client 1100 and shows a primary multimedia file. As previously discussed, this primary multimedia file may be a video file that is streamed onto the client 1100 from the video server 215. The first display device 1105 may be coupled to the client 1100 via a serial or parallel connection, as would be the case if it were a computer monitor. The first display device 1105 may be coupled to the client 1100 via a network connection such as an Ethernet or IP network connection. However, in this instance, the first display device 1105 would likely require a certain level of intelligence in order to properly function on the network.
[0088] A second display device 1110 is coupled to the client 1100 and shows a secondary multimedia file. The secondary multimedia file may be another video file, an HTML file, a text file or any other type of file that will allow it to be shown concurrently with the primary multimedia file. As was the case with the first display device 1105, the second display device 1110 may be networked to the client 1100 or coupled either in series or parallel to the client 1100. A third display device 1115 is coupled to the client 1100 and may show another secondary multimedia file or provide another window that may be implemented within the presentation. For example, this third display device 1115 may show an HTML file(s) or be used as a text chat room that enables a viewer to communicate with someone in real time as the presentation is progressing.
[0089] It is important to note that some or all of these windows may be combined into a single web browser. Additionally, the present invention facilitates streaming the presentation to multiple viewers that are watching it concurrently. For example, the present invention provides a platform on which an instructor may teach multiple students located in different locations. Specifically, a streaming video file of the instructor is transmitted to each client at which a student is located. Other teaching material may be transmitted to these clients and synchronized to the streaming video. Thus, one skilled in the art will recognize a large number of different types of applications that may operate on this synchronized multimedia presentation platform.
[0090] While the present invention has been described with reference to certain embodiments, those skilled in the art will recognize that various modifications may be provided. Variations upon and modifications to the preferred embodiments are provided for by the present invention, which is limited only by the following claims.