Congratulations, you have found where the tech team have their say.
Please feel free to comment and leave a note for us. The more feedback we get, the more it excites our technical team to write content.
Our future plan is to release this code to the open source community.
Kwiktec Cloud Playout System
KwikTec has built a custom Cloud GlobeCast playout system that specifically caters for smaller commercial and community broadcasters. It is perfect for the customer who does not have the budget for an expensive final control software system, or for a dedicated fibre or microwave last-mile link to connect to GlobeCast's satellite uplink facility. The playout system consists of three main components: content storage, scheduling and playout.
Connecting via the internet, the customer is able to log in to a dedicated server instance, upload and manage existing content, and schedule when content will be played out. The playout component connects to the customer's dedicated satellite channel and plays the scheduled content out seamlessly, at the best possible quality for the medium.
The playout system provides an end-to-end management platform for customers to upload, schedule and playout content.
All interactions with the playout system happen via the internet, allowing the customer to manage their channel from virtually anywhere in the world.
Content is uploaded to the playout system using a network protocol known as the Secure File Transfer Protocol (SFTP). This allows the content to be copied over the internet securely and adds an extra layer of protection against unwanted visitors gaining access to the site. SFTP is a very common protocol, and SFTP file managers are available for all popular computing platforms (WinSCP, Cyberduck, FileZilla, the FireFTP addon for Firefox, etc.). While files can be uploaded from anywhere via the internet, there are some inherent time limitations for uploading large files. A fast ADSL line has a maximum upload speed of 512 kbps, which limits you to just over 200 MB of data uploaded per hour. A typical video file used with the playout system can easily be over 1 GB for 1 hour of footage.
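To put those numbers in perspective, here is a quick back-of-the-envelope calculation (a sketch in Python; real-world throughput will vary with protocol overhead and line conditions):

```python
def upload_time_hours(file_size_mb, uplink_kbps):
    """Rough upload time for a file, ignoring protocol overhead."""
    size_kilobits = file_size_mb * 1024 * 8  # MB -> kilobits
    seconds = size_kilobits / uplink_kbps
    return seconds / 3600

# A 1 GB (1024 MB) file over a 512 kbps ADSL uplink:
print(round(upload_time_hours(1024, 512), 1))  # roughly 4.6 hours
```

So a single hour of footage can take the better part of a working morning to upload over ADSL, which is why the high-speed LAN and shipped-drive options below exist.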
KwikTec offers customers the option of accessing a LAN connection at the uplink facility to connect to the content storage at high speed (up to 100 Mbps). Alternatively, you can ship the media on some form of external storage, such as an external HDD, and someone from KwikTec will upload the files for you.
The playout system watches for files added to the directory and makes them available in the scheduler for scheduled playback.
The scheduler is a web based application that allows you to create a schedule of files and playlists that will be played back at certain times of day. Each customer gets their own interface that will allow them to manage their own channels.
Once the customer has logged in with their unique login credentials, they will see the following interface.
This web application shows what is currently playing and the schedule list, and allows you to add, change and delete items in the schedule.
In the image above, the label shows where the current time is displayed. This is the playout system time (SAST, which is GMT+2), and it is linked to the central clocks at KwikTec that are used to synchronise the other TV stations and satellite broadcasts.
The label shows the now-playing indicator. This shows which file is currently being played and how long it has been playing for. It is also a progress indicator: the grey bar grows as the file plays, showing the approximate position within the file that is currently playing.
The label shows the schedule summary, which gives an indication of when shows are scheduled within a particular 24-hour period (00:00:00 to 23:50:00). Navigation is done by clicking the appropriate << Prev or Next >> button to move to the previous or next day. If you navigate to the current day (i.e. today), the date text changes to red and the current time is indicated with a red line.
For all other days the date text remains black and the red time indicator line is not shown.
The label is where the detailed schedule for that particular day is shown. This is a table listing each schedule item's type (file or playlist), file name, start time, end time and duration, and it is also where you access the schedule control functions.
Editing the Schedule
The schedule is a simple, linear, time-based system whereby specific content is selected to play at a particular time, on a particular day. There are two types of content that the scheduler can work with: individual items and playlists. A playlist is a collection of individual items that makes repeating a particular sequence of items simpler.
Adding an Item
To add an item, from the main home page, click the Add Item button:
From here, you can select the type, which for an individual item is file. Select the file you want to add and the start time (accurate to one second), then click Add.
If you are successful, you will see a pop-up that says "success – database updated". There are some limitations, though: you can't schedule items in the past, or schedule an item that overlaps with an existing item.
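Those two limitations are simple interval checks. A minimal sketch of the logic (illustrative only, not the playout system's actual implementation):

```python
from datetime import datetime, timedelta

def can_schedule(schedule, start, duration, now):
    """Mimics the scheduler's two rules: no items in the past,
    and no overlap with existing items.
    `schedule` is a list of (start, duration) tuples."""
    if start < now:
        return False  # can't schedule in the past
    end = start + duration
    for other_start, other_duration in schedule:
        other_end = other_start + other_duration
        if start < other_end and other_start < end:
            return False  # overlaps an existing item
    return True

now = datetime(2014, 6, 1, 12, 0, 0)
schedule = [(datetime(2014, 6, 1, 13, 0, 0), timedelta(hours=1))]
# Fits after the existing 13:00-14:00 item:
print(can_schedule(schedule, datetime(2014, 6, 1, 14, 0, 0), timedelta(minutes=30), now))  # True
# Overlaps the 13:00-14:00 item:
print(can_schedule(schedule, datetime(2014, 6, 1, 13, 30, 0), timedelta(hours=1), now))    # False
```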
Working with Playlists
To add or modify playlists, click the Playlists button:
This will present you with a list of the current playlists. To go back to the main screen and the schedule click the Schedule button.
To add a new playlist, click the Add Playlist button:
A dialog will appear. Type in the name of the new playlist, then click Add. The newly created playlist will appear in the playlist list with a duration of 00:00:00. To edit a particular playlist, click the Edit button next to the playlist you wish to edit. This will bring up the playlist editor.
To add an item to the playlist, select the item from the dropdown list and click the add button. This will append the item to the end of the list. Use the arrows to move items up and down in the list, and the delete button to remove an item that is no longer required. Once you have finished editing the playlist, click the Ok button. This will take you back to the playlist list, where you will see that the playlist duration has been updated. To get back to the schedule, click the Schedule button.
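The updated playlist duration is simply the sum of its items' durations, which can be sketched as:

```python
from datetime import timedelta

def playlist_duration(items):
    """Total duration shown next to a playlist: the sum of its items.
    `items` is a list of timedelta objects, one per playlist item."""
    total = sum(items, timedelta())
    return str(total)

# A 22:30 show plus a 7:30 filler item:
items = [timedelta(minutes=22, seconds=30), timedelta(minutes=7, seconds=30)]
print(playlist_duration(items))  # 0:30:00
```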
Adding a Playlist to the Schedule
The process of adding a playlist to the schedule is exactly the same as adding an individual item (detailed earlier in this document), but you need to change the type from file to playlist. The dropdown selection list will change to show the available playlists rather than the available files.
Any playlists that are in the schedule can be identified by the lighter blue colour.
Editing the Schedule
To modify the existing schedule, use the Edit, Add After or Delete buttons.
Edit brings up the item edit dialog, which allows you to change the start time.
Add After brings up the add item dialog, with the start time set to the end of the associated item.
Delete deletes the item from the schedule (but it does not delete the item from the content storage).
The playout service is fairly straightforward from the customer's perspective. Any item that is still in the schedule when the appropriate time arrives will be played back through the playout service to the customer's satellite channel and broadcast live.
This document describes the Kwiktec AdServer setup approach and gives a function overview.
Companion Ad: Commonly text, display ads, rich media, or skins that wrap around the video experience. These ads come in a number of sizes and shapes and typically run alongside or surrounding the video player.
InLine Ad: VAST document that includes all the elements necessary to display the visual experience of the ad.
Linear Video Ad: The ad is presented before, in the middle of, or after the video content is consumed by the user, in very much the same way a TV commercial can play before, during or after the chosen program.
Non-linear Video Ad: The ad runs concurrently with the video content, so users see the ad while viewing the content. Non-linear video ads can be delivered as text, graphical ads, or video overlays.
Primary Ad Server: First ad serving system called by the Video Player or other framework. It is assumed that in most cases a publisher will make all initial ad requests through their Primary Ad Server (whether homegrown or third party), then redirect to other ad servers as needed.
Secondary Ad Server: Ad server used by an ad network or by the buyer of ads to serve creative, track results and optimize creatives.
VAST (Video Ad Serving Template): XML document format describing an ad to be displayed in, over, or around a Video Player or a Wrapper pointing to a downstream VAST document to be requested.
Video Player: Environment in which in-stream video content is played. The Video Player may be built by the publisher or provided by a vendor.
Wrapper Ad: VAST document that points to another VAST document from a different server.
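To make the Wrapper/InLine distinction concrete, here is a small sketch that follows a wrapper to its downstream VAST URI (the element names come from the VAST 2.0 schema; the ad server URL is made up):

```python
import xml.etree.ElementTree as ET

def next_vast_uri(vast_xml):
    """If the VAST document is a Wrapper, return the downstream
    VAST document URI it points to; return None for an InLine ad."""
    root = ET.fromstring(vast_xml)
    uri = root.find("./Ad/Wrapper/VASTAdTagURI")
    return uri.text.strip() if uri is not None else None

wrapper = """<VAST version="2.0">
  <Ad id="1"><Wrapper>
    <VASTAdTagURI>http://ads.example.com/inline.xml</VASTAdTagURI>
  </Wrapper></Ad>
</VAST>"""
print(next_vast_uri(wrapper))  # http://ads.example.com/inline.xml
```

A real player would fetch that URI, and keep following wrappers until it reaches an InLine document.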
Mezzanine files are compressed videos that take up less space than the full-resolution files, but are high-resolution "enough" to make great-quality, highly compressed videos. Kwiktec uses these files for transcoding.
Transcoding is the direct digital-to-digital conversion of one encoding to another. Kwiktec does this as part of the ingestion process to convert the Mezzanine file to the various encodings required for streaming.
The Baseline profile is generally targeted at light applications such as video conferencing, or playback on mobile devices with limited processing power. It provides the least efficient compression among the three choices, but at the lowest CPU overhead on decoding.
The Main profile has more capabilities than Baseline, which generally translates to better efficiency; yet it comes at the cost of relatively higher CPU overhead (though less than the High profile). This profile is usually used in medium-quality web video applications.
The High profile is the most efficient of the three. It has the most capabilities for packing quality into a given bit rate, yet it is also the hardest to process because of these added operations. Though originally intended only for high-definition applications such as Blu-ray, this profile is increasingly becoming popular for web video applications.
Key Frame Interval (I-frame)
A key frame in filmmaking is a drawing that defines the starting and ending points of any smooth transition.
For dynamic streaming, also known as adaptive streaming:
Keeping a constant video keyframe interval, instead of a variable one, ensures that the keyframes in the streams are never very far apart. Since the server switches streams on a video keyframe, keyframes that are too far apart could delay the switch on the server side.
I-frames are the least compressible but don’t require other video frames to decode.
P-frames can use data from previous frames to decompress and are more compressible than I-frames.
B-frames can use both previous and forward frames for data reference to get the highest amount of data compression.
A sequence of video frames, consisting of two keyframes, one forward-predicted frame and one bi-directionally predicted frame.
There are three types of pictures (or frames) used in video compression: I-frames, P-frames, and B-frames.
An I-frame is an ‘Intra-coded picture’, in effect a fully-specified picture, like a conventional static image file. P-frames and B-frames hold only part of the image information, so they need less space to store than an I-frame, and thus improve video compression rates.
A P-frame (‘Predicted picture’) holds only the changes in the image from the previous frame. For example, in a scene where a car moves across a stationary background, only the car’s movements need to be encoded. The encoder does not need to store the unchanging background pixels in the P-frame, thus saving space. P-frames are also known as delta-frames. A B-frame (‘Bi-predictive picture’) saves even more space by using differences between the current frame and both the preceding and following frames to specify its content.
URL Token Authentication
Token authentication allows you to generate secured links with expiration time.
Secured links provide content only within the desired period, and only to visitors who have links with the secure hash.
It is not possible to download secured content from the CDN resource without a valid (unexpired) hash.
After the expiration time the links become unavailable, and new ones must be generated in order to download the secured content again.
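The exact hashing recipe is CDN-specific, but a typical scheme hashes the content path, an expiry timestamp and a shared secret, then appends the hash and expiry to the link. A sketch (the base URL, path and secret here are hypothetical):

```python
import hashlib
import time

def secure_link(base_url, path, secret, ttl_seconds):
    """One common token-authentication pattern (the exact recipe
    varies by CDN): hash the path, an expiry timestamp and a shared
    secret, then append both token and expiry to the link."""
    expires = int(time.time()) + ttl_seconds
    token = hashlib.md5(f"{secret}{path}{expires}".encode()).hexdigest()
    return f"{base_url}{path}?token={token}&expires={expires}"

# Hypothetical resource; the link stops validating after one hour.
print(secure_link("http://cdn.example.com", "/video/show.mp4", "s3cret", 3600))
```

The edge server recomputes the same hash on each request and rejects it once the `expires` timestamp has passed.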
The Player is the name of the Kwiktec media player, which allows a single implementation to support both iOS streaming in HTML5 and the traditional Flash player.
Each Kwiktec Ad server client is assigned an API Key that is essentially a password used to gain access to the Kwiktec AdServer API.
An Application Alias is a reference to an Application Configuration stored on the Kwiktec platform. This controls the settings (e.g. look and feel, advertising, analytics, security) plus many other features of the application and/or player. The Application Alias is used to generate your session token for all secure interactions with the AdServer API.
An endpoint is the entry point to a service provided by the AdServer API.
VAST (Video Ad Serving Template)
XML document format defined by the IAB for describing an ad to be displayed in, over, or around a Video Player, or a Wrapper pointing to a downstream VAST document to be requested. Kwiktec supports both VAST 1.0 and 2.0 implementations. See: IAB VAST for more information.
VPAID (Video Player-Ad Interface Definition)
The VPAID standard addresses known interoperability issues between publishers' video players and different ads, when the video ad is expressed in innovative formats (such as non-linear and interactive ads) that require a high level of communication and interaction between the ad and the video player. See: IAB VPAID for more information.
Cue Points are invisible markers in a video file which can be used to trigger external events such as:
Synchronising graphics, subtitles, etc.
Providing navigation options.
A tag is metadata comprised of a non-hierarchical keyword or term assigned to content in Kwiktec. Machine tags are metadata tags that use a special syntax to define extra information about a tag.
Machine tags have a namespace, a predicate and a value.
The namespace defines a class or a facet that a tag belongs to (‘music’, ‘movies’, etc.)
The predicate is the name of a property within the namespace ('genre', 'classification', etc.)
The value is the tag's value ('pop', 'R', etc.)
There are no rules for machine tags beyond the syntax that specifies the parts of a machine tag. This gives a very flexible approach to metadata, catering to a diverse range of client needs.
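Parsing a machine tag is straightforward given the namespace:predicate=value syntax. A minimal sketch:

```python
def parse_machine_tag(tag):
    """Split a machine tag of the form namespace:predicate=value.
    Returns None if the tag doesn't follow the machine-tag syntax."""
    if ":" not in tag or "=" not in tag:
        return None  # a plain, non-machine tag
    namespace, rest = tag.split(":", 1)
    predicate, value = rest.split("=", 1)
    return {"namespace": namespace, "predicate": predicate, "value": value}

print(parse_machine_tag("music:genre=pop"))
# {'namespace': 'music', 'predicate': 'genre', 'value': 'pop'}
```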
External Articles on Machine Tags
Flickr: Discussing Machine tags in Flickr API
REST (Representational State Transfer)
REST, or Representational State Transfer, is a standard way of accessing data stored in a remote system over HTTP. Kwiktec uses this standard for applications and developers to access media information in an XML, JSON or JSONP format via the AdServer API.
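A REST call is just an HTTP request whose response body is parsed into data. The endpoint path and parameter names below are illustrative placeholders, not the real AdServer API:

```python
import json
import urllib.request

def fetch_media_info(endpoint, api_key):
    """Sketch of a REST call over HTTP returning JSON. The endpoint
    and query parameters here are hypothetical examples."""
    url = f"{endpoint}?format=json&api_key={api_key}"
    with urllib.request.urlopen(url) as response:
        return parse_media_response(response.read().decode())

def parse_media_response(body):
    """Decode a JSON response body into a Python dict."""
    return json.loads(body)

# Parsing an example response body:
print(parse_media_response('{"title": "clip01", "duration": 30}'))
# {'title': 'clip01', 'duration': 30}
```

The same data could equally be requested as XML or JSONP by changing the `format` parameter, per the formats listed above.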
Recommended Video Ad Ingestion/Mezzanine File Format
Video Type: H.264
Size/Ratio: 1920×1080 (16:9) or 1440×1080 (4:3)
Video Bitrate: 4096 kbps (low motion) or 6144 kbps (high motion), Main or High profile
Key Frame Interval: 75
Audio Type: AAC
Audio Sample Frequency: 44.1 kHz
Audio Bit Rate: 192 kbps
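As one possible way to produce a file matching the settings above, here is a sketch that assembles an ffmpeg command line (these are standard ffmpeg flags; the input and output file names are placeholders, and you should tune the flags for your own source material):

```python
# Settings taken from the recommended mezzanine format above.
settings = {
    "video_codec": "libx264",   # H.264
    "profile": "high",
    "video_bitrate": "6144k",   # high-motion content
    "keyframe_interval": "75",  # -g sets the GOP length in frames
    "audio_codec": "aac",
    "sample_rate": "44100",     # 44.1 kHz
    "audio_bitrate": "192k",
}
cmd = [
    "ffmpeg", "-i", "input.mov",
    "-c:v", settings["video_codec"],
    "-profile:v", settings["profile"],
    "-b:v", settings["video_bitrate"],
    "-g", settings["keyframe_interval"],
    "-c:a", settings["audio_codec"],
    "-ar", settings["sample_rate"],
    "-b:a", settings["audio_bitrate"],
    "-s", "1920x1080",
    "output.mp4",
]
print(" ".join(cmd))
```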
In this post we are going to cover a low-level overview of how live streaming works. It will be very general, covering the end-to-end processes in live streaming. We will not go into Kwiktec products in this blog; this is for readers who want to understand a little more.
First, take a look at the signal flow.
This is the flow of video to the user. The video is received by an encoder to be compressed, then sent to a CDN (content delivery network) for global distribution. From there it is pulled by a video player and received by the end user.
Let’s examine the first step of this flow.
How do you get a source feed to an encoder? The content needs to be captured by your AV gear, that is, your cameras and mics, and from there it can go directly into an encoder. In cases where the encoder is not in the same place as your audio-visual equipment (which we will get into a little more later), the video can be transmitted via satellite, fibre or ethernet to the encoder.
For satellite transmission, you'll need to send the source feed to an uplink antenna (think of the news satellite truck antenna). The antenna sends the feed to a satellite, and it is then downlinked by a receiving antenna at your acquisition point, where your encoder is. For fibre transmission, you send the source feed through a physical fibre cable from your source to the acquisition point. Deals can be done in South Africa for ad-hoc fibre. Redundant LTE is a given on all events.
It's great if you have the opportunity to use ad-hoc fibre, but it can be quite expensive and is better suited to longer-term productions, e.g. IDOLS. Let's take a look at some vocabulary used when we talk about delivering the feed from your source AV equipment to your encoder.
Signal transport, signal flow and signal transmission path are terms used to refer to the path the video travels from source to destination. For example, you will hear a video producer ask what the signal flow is for an event. He's wondering: are we using a satellite truck? Is the encoder encoding the feed directly on site? What path is the video feed taking?
Signal acquisition refers to the method by which the video is received. It's usually a term used when you're using fibre, satellite or LTE.
Uplink and downlink are pretty obvious terms in the satellite game: they refer to a transmission up to, or down from, the satellite in space.
Satellite time and space segment refer to the actual time and node that you are booking on the satellite up in space. All of this time is actually reserved in advance, so you have to be very careful to make sure that you are covered on time. If you have an event that is expected to end at noon and you only book satellite time through noon, and your event runs long, you may not have the opportunity to extend your satellite time ad-hoc, as somebody else may have booked that segment already. So be very careful when scheduling satellite segments. Similarly, fibre time is also booked in advance. Fibre optic cable is physical cable with a data limit: you can only have a certain amount of data travelling through it at any given time.
The terms encoder and encoding software can get confusing. When we talk about an encoder, this is a physical device that has encoding software installed on it and is used to compress the data. Your encoder can be anything from your laptop with encoding software on it, to a consumer-level product running software like Flash Media Live Encoder or Wirecast, to an advanced, sophisticated, dedicated encoding machine used in professional workflows.
The encoding software is the software that actually performs the video compression.
From this point we will start to focus on an online workflow.
The next step is the encoder to the CDN entry point. After the video is compressed and optimised for web playback, it needs to be globally distributed.
With a CDN entry point, you want to make sure you are balancing the load of requests. You don't want 200 users trying to pull the stream directly from your encoder, because it was not designed for this. The CDN ingests the stream at the edge of the internet and distributes it to its internal processes. This transmission from encoder to CDN happens over the internet, so the encoder needs to be located somewhere with reliable internet connectivity where it can continuously stream the data to the CDN entry point.
If there isn't reliable internet where your video source is, that's when you need to consider bringing in a satellite transmission van. This can be used to transfer your video to your encoder at an off-site location where you have reliable connectivity to stream to the CDN.
Vocabulary covered here
Content distribution network
POP is a term you will hear a lot; you will want to stream to your nearest point of presence. This is the physical location of your CDN servers. For example, if you have an event happening in Franconville, your nearest server might be in Paris; you do not want to stream to an entry point sitting in London or Singapore.
CDN Connect info, Ingestion Info, Encoder Output info, CDN entry point and Mount point
The five phrases above mean the same thing. These are the entry points into the CDN infrastructure where you send your traffic from the encoder. In most cases your CDN will give you the address of a server that connects you into its infrastructure.
Next is a process we will go into in detail a little later: the inner circle, or processing. This is the area that does the processing, i.e. adaptive bitrates, transmuxing, encoding, transcoding, DVR, DRM, tokenisation and much more.
This will be the focus of later topics.
After this we have the CDN to the user, through the player. The CDN generates a playback URL, and that playback URL is plugged into the video player that users use to watch the stream. In many cases the player will automatically obtain the playback URL from the CDN. When the user connects, the player will determine the closest POP for optimal playback.
For example, if you are in Gatwick, UK, you may want to connect to the POP in London to watch the stream, rather than connecting through the POP in Singapore.
There is going to be a lot more delay if you're trying to travel that distance.
As illustrated, the CDN can distribute the traffic load to ensure the best delivery to the user. Sometimes a CDN will see issues with a local POP, and it can automatically re-route traffic to another POP in the area, detouring users around the problem delivery servers.
An important term here is the playback URL.
The playback URL is the URL issued by the CDN. This is the URL created to actually view and watch the stream in the player.
Tracing back any issues when a user is having a problem:
1) Eliminate user error.
2) Check the player; make sure the user is actually connected to the internet.
3) Make sure the user's CPU load isn't too high.
4) Make sure they can watch some other stream on the internet.
5) If other users are experiencing the same error, you can probably eliminate user error.
6) If viewers can view from many different locations, you can probably eliminate network issues.
7) If users in the same area can view with no problem, you want to check the player next.
8) Is the user's player actually pointed at the playback URL?
9) Next you want to check the CDN: is the CDN properly delivering your stream? Many times you'll be able to monitor the stream coming right off the CDN.
10) Then you want to check your network. If you are seeing your stream coming in at your CDN entry point but there is an issue with it, it's likely a problem with your encoder.
11) Make sure your encoder is not overloaded: check the CPU, check your source.
12) Check the source network; make sure everything is coming out of your encoder OK.
13) If your encoder looks healthy and you are still seeing errors in the stream, check your source. Did somebody kick a cable loose? An audio cable, or the power cable to one of the cameras, etc.? These are very common problems.
The point is to make sure that everything from source to player is clear and functioning.
Depending on who you are talking to, you will hear the word codec mentioned.
It comes from "compressor-decompressor" or "coder-decoder", and it is simply the technology used to compress the data. Common video codecs are H.264, MPEG-2, VP6 and H.265. For audio codecs you'll hear MP3, AAC and MP2A. There are lots of different codecs, but it comes down to what the player is compatible with. Here we will also touch on wrappers and container formats. You can think of a container format as just that: a container.
For example, Flash video, the technology YouTube preferred, is a container format that utilises the H.264 codec for video and AAC for audio. This combination is commonly used because mobile devices have decoders for these codecs built into the hardware. You can swap out a container, without re-encoding the actual media, if you need to change the way the data is being read.
One of the most important determinants of the quality of your stream is its bitrate. Bitrate is measured in bits per second, and it's the amount of data transferred every second.
Most commonly you will hear kilobits per second or megabits per second. You will often hear people say "megs" or "k"; there are lots of trendy terms around this.
Keep in mind that every stream has its own bitrate, and you want to account for the total of all of those bitrates if you're doing multiple-bitrate or adaptive streaming. Also take into account any redundancies you have prepared for.
We always recommend that you stream with redundant encoders, and always allow 20% overhead to ensure that you have solid network conditions.
You should also make sure that the network is completely dedicated to your streaming service.
You do not want any outside traffic getting on that network and interfering. Remember the higher the bitrate the higher the quality.
The total bitrate of everything you are streaming cannot exceed the amount of available bandwidth!
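That rule, plus the 20% overhead recommended above, is easy to sanity-check before an event. A sketch:

```python
def fits_in_bandwidth(stream_bitrates_kbps, bandwidth_kbps, overhead=0.2):
    """Check that the sum of all stream bitrates, plus a safety
    overhead, fits in the available outbound bandwidth."""
    total = sum(stream_bitrates_kbps)
    return total * (1 + overhead) <= bandwidth_kbps

# Three adaptive renditions on a 4 Mbps (4096 kbps) uplink:
print(fits_in_bandwidth([500, 1000, 1500], 4096))  # True: 3000 * 1.2 = 3600 kbps
# A single 6000k stream on the same 4 meg line:
print(fits_in_bandwidth([6000], 4096))             # False: 7200 kbps needed
```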
This is very important, as often the location where you're filming does not have sufficient bandwidth, and this is where we talk about bringing in satellite trucks or fibre to transmit the feed to an off-site location.
Some golden rules:
Make sure you don't cut it too close; always leave room for overhead!
If you only have 4 megs, don't stream a 6000k stream.
We recommend leaving about 20% for overhead.
Remember to think about what else is on your network. Your office may have a 100 meg line, but if 1000 people are sharing that network, the bandwidth will quickly be used up.
Also consider the bandwidth during your testing versus your live event.
If you're streaming an event from your office and you tested the evening before, you may not have had anybody on that network at the time; then you come in the next day to do the event and all of a sudden you have 8000 people on that network. So you should really always have a dedicated network wherever possible.
Always run a speed test.
Never trust the onsite IT technician when he tells you how much bandwidth there is without double-checking. It's really difficult to increase your bandwidth on the day.
Keep in mind the difference between your inbound bandwidth and your outbound bandwidth: when you are streaming, the outbound bandwidth is what you care about.
Lastly, make sure you are on a reliable network. Remember: an interruption in your connectivity equals a broken or dead stream.
Don't stream on wifi, don't stream on wireless cards! Use a dedicated network!
A couple of other useful definitions you will hear in the industry:
A multicam player is a player capable of playing multiple angles for a single event.
Example: a football game may have the director's cut or line cut, which is what you would see on television, plus an end-zone camera, a sideline camera, a star camera, a back camera, etc. With the multicam player you can choose to cut between them.
The difficulty here is that each of these camera angles needs to be synced, so that if the user switches between the cameras they are all in time and sound sync. Each camera is treated as its own individual event, so for a 3-camera webcast you would need 3 encoders (if you intend all three cameras to be completely different streams).
DVR, just like your home DVR, refers to a feature in the player where you can pause and rewind. The CDN captures the bits coming to it and records them in real time, so that you can seek backwards and pause during your live event. When configuring the DVR, keep in mind how long you want the DVR recording window to be: do you want it to run the full length of the event, or only show the most recent 10 minutes?
Simu-live is a term used when an event is recorded in advance and then streamed back as if it were live at a later stage. This is often done if you have, for example, a workshop that is taped ahead of time but you want it streamed only once and not on demand. Record and edit the show, then play it back using a playout system and use this source for your encoder to encode.
To encode simply means to compress data for transmission, usually to a lower data rate.
To re-encode is to re-save a file while keeping the same format, for example after editing a file.
To transcode is to convert to a different format of a similar quality to gain compatibility.
For example, you can transcode a Windows Media file to a Flash file; this usually involves generational loss.
The more transcoding you do, the worse quality you're going to see.
Transmux is when you swap out the container (the wrapper) of the file.
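Because transmuxing only swaps the container, tools like ffmpeg can do it with stream copy, i.e. without re-encoding and therefore without generational loss (a sketch; the file names are placeholders):

```python
# Transmux an FLV into an MP4 container: "-c copy" copies the
# audio and video streams untouched, so only the wrapper changes.
cmd = ["ffmpeg", "-i", "input.flv", "-c", "copy", "output.mp4"]
print(" ".join(cmd))
```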
TESTING gets a big red flag, because it's the most valuable thing you can do in preparing for your event.
When you're out at a live event, the chaos can often get a little overwhelming, and the more prepared you are, the better off you'll be.
Never go into an event still testing software or fiddling with settings; no matter what you're using, never start from scratch on the day of an event.
Here are a few general production tips:
1) Verify the proper operation of everything you are using and are responsible for.
2) The same goes for every camera, switch, mixer, receiver and DVR, every microphone, and any other source you're going to be using for your event.
3) Check batteries: check their condition and charge. Test everything in a real production scenario well in advance of your event; the last thing you want is for the batteries in a microphone to die in the middle of your event.
4) Next, check that everything is routed properly.
5) Verify the overall quality of your video. Make sure you're not seeing any interlacing, fuzziness, colour changes or picture shift.
6) Check that all cameras are white balanced.
7) Next, check all sound sources and make sure the sound is good on all of them.
8) Often overlooked is cropping in your encoder. Many times when you switch between cameras you see a little black border around one camera that you didn't see on another. You can easily crop this out in most encoding software, and you don't want to be doing it in the middle of an event. Check all the sources of your video playback; likewise ensure the DVR, cameras and all other sources have the same crop, well in advance.
9) Audio sync test: once your audio is running, you want somebody in front of the camera on a microphone to do a clap test and count to 10, to ensure lips are in sync. This applies to each audio source.
10) Lastly, verify the strength and stability of the outbound network, as mentioned. It is very important that your bandwidth matches what you're going to be streaming.
Make sure you configure your encoder at least 2 hours in advance. This is the bare minimum! There is not a lot you can do in 2 hours to fix something if it goes wrong, but it's better than 15 minutes.
Start your encoder at least 15 minutes before the event is scheduled to go live; we really prefer 30 minutes to an hour. You do not want to hit start just as someone is going on stage and find there is a problem.
Make sure you test your backup encoder: stop your primary encoder, or just unplug its ethernet cable, and make sure that the player automatically rolls over to the backup stream. Then start your primary encoder, unplug your backup encoder, and make sure it rolls back to your primary, etc.
You should always, always, always be taking a local archive of your event.
Once you have started recording, open the folder that the file is being recorded into and ensure the file size is growing. Do this every 30 minutes during the event to ensure capture is happening.
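That file-growth check can even be scripted so nobody forgets it mid-event. A minimal sketch in Python, assuming you know the path your encoder is writing the archive to:

```python
import os
import time

def archive_is_growing(path, wait_seconds=30):
    """Sample the archive file size twice, a short interval apart.
    Returns True if the file grew in between, False if it stalled."""
    size_before = os.path.getsize(path)
    time.sleep(wait_seconds)
    size_after = os.path.getsize(path)
    return size_after > size_before

# During the event, e.g. every 30 minutes:
# if not archive_is_growing("/recordings/event.mp4", wait_seconds=30):
#     print("WARNING: local archive has stopped growing!")
```

The path and interval above are placeholders; use whatever your encoder actually writes and a sampling gap long enough for its write buffer to flush.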
Check your channel and your watch pages and make sure you're seeing everything correctly. If you're streaming for mobile, make sure you're seeing it on mobile devices, and throughout your event make sure you continuously monitor. Things can change at the drop of a hat.
Network tips again, because they are very important:
Remember, don't cut it too close; always leave overhead. Remember who else is on the network. Always run a speed test. Ensure you have a reliable, dedicated network.
A few production tips
Backgrounds – make sure that your background provides contrast with your subject's face, clothing and hair. Avoid extremes in contrast or colour, i.e. don't have anything that's very dark next to something very light; this tends to be hard on codecs.
Avoid extraneous detail in lighting or patterns. You have all seen the presenter with the pinstripe shirt and the little squiggly lines that appear across it, because the codec cannot handle the detail.
Avoid highly saturated colours. Avoid busy backgrounds with a lot of detail in them, and make sure all of your light sources are consistent. Avoid doing an indoor shot with fluorescent lighting next to a window; hard colour temperature changes are hard on codecs.
A few wardrobe tips
If you're doing interviews at a convention, remind speakers to remove badges and shiny jewellery. Stick with solid colours; stripes and patterns can be hard on codecs. Avoid wearing white and light blue, because everything will just look white on camera. Have your speakers bring alternative wardrobe options. With jewellery, again, make sure you avoid shiny and detailed surfaces. If you have a subject wearing glasses, keep your lighting in mind to avoid reflections and refractions. When it comes to video, the single most important determinant of quality, no matter what you are creating, is lighting. You need to shoot with zero gain on your camera. Don't mix colour temperatures, as we said.
Soft light really is easier on codecs.
If you can help it when you're shooting, make sure that every motion of the camera is deliberate and makes a point; unnecessary motion causes stress on codecs. Avoid slow dissolves and zooms. If you can do a hard cut, codecs prefer it. Try to avoid handheld cameras when you can use a tripod; all of those little shaky movements again add stress. Where possible, shoot progressive, so your software doesn't have to handle deinterlace filtering.
More audio tips
If your source is mono, keep it as a mono feed. Do not try to force a mono source into an encoder to generate stereo; this just causes the audio to be heard twice.
If there is Q&A, make sure you have microphone runners available and instruct everyone to wait until they have a microphone before asking questions. Remember: if they're not mic'd, they can't be heard.
NB: Make sure all of your speakers leave their cellphones switched off or off stage.
On Red Bull we had a lot of electronic buzzing from phones. We want to avoid that audio interference we all dread.
If you see a problem, you want to make sure that it's not localised to one user on one network. How many users are seeing the problem? Is it just one? That's likely user error, and this is true in most cases.
Is it many users on one network? That's likely network congestion within an area.
If you're seeing problems across various networks, then it might be a source error. Here you want to check the health of your encoder. Does the stream look and sound healthy directly on your encoder? If it doesn't, then something is going bad. Remember the golden rule: garbage in, garbage out. Verify the health of all of the audio and video sources routed to the encoder. If all the sources are good, what's the CPU load? Is the stream healthy on your backup encoder? Be sure to check your archive file; if you are seeing audio or video problems in your local archive, the problems are being introduced locally.
If everything looks good on the encoder, it might be the network. Run a speed test and a ping test to our servers. Check your bandwidth: is it unstable or inadequate? Then you're having local network problems, and you should contact your local on-site IT support to troubleshoot.
If the network is solid and you're able to connect to other external domains but not to our servers, then it's possibly a CDN error. In that case we encourage you to reach out to your Network Administrator, but be prepared to present all of the results of these tests.
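The troubleshooting flow above boils down to a simple decision tree. Purely as an illustration (the category labels are ours, not an official taxonomy), it could be sketched like this:

```python
def diagnose(users_affected, networks_affected, encoder_healthy, can_reach_cdn):
    """Rough triage following the checklist above:
    one user -> user error; one network -> local congestion;
    unhealthy encoder -> source error; healthy encoder but
    only the CDN unreachable -> possible CDN problem."""
    if users_affected <= 1:
        return "user error"
    if networks_affected <= 1:
        return "network congestion within one area"
    if not encoder_healthy:
        return "source error - check inputs, CPU load and local archive"
    if not can_reach_cdn:
        return "possible CDN error - gather your test results first"
    return "local network problem - run speed and ping tests"

print(diagnose(users_affected=1, networks_affected=1,
               encoder_healthy=True, can_reach_cdn=True))  # user error
```

The point of the sketch is the ordering: always rule out the narrow causes (one user, one network) before blaming the source, and rule out the source before blaming the network or the CDN.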