I am currently working with the Microsoft HoloLens. It provides a web interface where you can watch a live 720p, 30 fps video stream including holograms and audio. But the video stream has a 2-3 second delay. Even if I reduce every setting to its minimum, without audio and holograms, it barely changes anything. The HoloLens exposes an HTTP REST API. I've tested it with VLC and with a self-made C# app using Windows Media Player. Still no change. Any ideas?
To reduce lag for a live video stream from a HoloLens, make sure the wireless network the HoloLens is connected to is capable of high-speed transfer. We use one of these TP-Link routers, http://www.tp-link.com/us/products/details/cat-5506_AD7200.html, which works very well for demo presentations and reduces the latency to almost real time.
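Player-side buffering is usually the other big contributor. As a minimal sketch (the IP address is a placeholder, and the Device Portal endpoint and query parameters below are my assumption about the Mixed Reality Capture API, so verify them against your OS build), you could request the low-quality stream with holograms and audio disabled and hand it to VLC with a much smaller network cache than its default:

    using System.Diagnostics;

    class HoloLensStream
    {
        static void Main()
        {
            // IP address of the HoloLens on the local network (placeholder).
            var ip = "192.168.1.50";

            // Device Portal Mixed Reality Capture endpoint (assumed names):
            // live_low.mp4 trades quality for latency, and the query flags
            // turn holograms, microphone and app audio off.
            var url = $"https://{ip}/api/holographic/stream/live_low.mp4"
                    + "?holo=false&pv=true&mic=false&loopback=false";

            // Keep only ~200 ms of network buffer instead of VLC's ~1000 ms
            // default; much of the perceived delay is player-side buffering.
            Process.Start("vlc", $"--network-caching=200 \"{url}\"");
        }
    }

Lowering the cache trades smoothness for latency; if 200 ms stutters on your network, step it back up.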
I am working on a project in C# using DirectShow which mainly focuses on image processing and printing in real time. I have written code that captures the images and displays them in an output window. The code works fine with 4 cameras simultaneously, but the moment I add a 5th camera the live feed from the first 4 cameras starts to glitch, and when I add a 6th camera the video from all the cameras stops. For my project I need to work with 16 cameras, and none of them should suffer low performance.
I have tried to optimize by building all the graphs at once and then using them for their respective cameras, but even then the issue isn't resolved.
I am using this project as a reference.
The live camera feeds should not glitch, since there is more than enough memory and CPU for them to function.
USB bandwidth is the typical problem when several USB cameras are connected simultaneously. That is, it is unlikely to be a CPU or RAM issue. See these related questions:
Unable to display two cameras in DirectShow
2 usb cameras not working with OpenCV
Implications of using many USB web cameras
I doubt you can use 16 at a time (even distributed between USB host controllers). In any event, with 16 video inputs you are limited either to low-resolution capture or to capture of video encoded in hardware on the camera, even with a light codec like JPEG (a lot of webcams, maybe even the majority, nowadays offer on-camera M-JPEG encoding).
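The arithmetic makes the point: an uncompressed 640x480 YUY2 feed at 30 fps is about 17.6 MB/s, so 16 of them would need roughly 280 MB/s, while a USB 2.0 host controller delivers around 35 MB/s in practice. As a small diagnostic sketch (assuming the DirectShowLib NuGet package), you can enumerate the cameras and check from their device paths how they are spread across host controllers:

    using System;
    using DirectShowLib; // NuGet package "DirectShowLib" (assumed available)

    class CameraList
    {
        static void Main()
        {
            // Enumerate all DirectShow video input devices. The device path
            // encodes the USB vendor/product IDs and port, which helps verify
            // how the cameras are distributed across host controllers.
            DsDevice[] cameras = DsDevice.GetDevicesOfCat(FilterCategory.VideoInputDevice);

            Console.WriteLine($"Found {cameras.Length} capture devices:");
            foreach (DsDevice cam in cameras)
            {
                Console.WriteLine($"  {cam.Name}  ->  {cam.DevicePath}");
            }

            // Rough bandwidth budget: uncompressed VGA @ 30 fps in YUY2 is
            // 640 * 480 * 2 bytes * 30 = ~17.6 MB/s per camera, i.e. about
            // two such cameras per USB 2.0 host controller in practice.
            const double perCameraMBs = 640.0 * 480 * 2 * 30 / (1024 * 1024);
            Console.WriteLine($"Uncompressed VGA@30 per camera: {perCameraMBs:F1} MB/s");
        }
    }

Requesting M-JPEG instead of a raw subtype on each camera's output pin (via IAMStreamConfig) cuts that per-camera figure by roughly an order of magnitude, depending on quality settings.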
I am tasked with streaming video and audio (camera and mic) from my Android API 19 devices directly to a server. Displaying it on the device is optional.
Following this link I learned that you can use MediaRecorder with a socket instead of an output file. But they describe this method as too slow (a 3-second delay) because of the encoding. They also show a second method using preview frames (as M-JPEG), but only for video.
Some apps from the store use the same method with an additional audio recorder, which causes a big delay between image and sound. I was hoping someone knows how to live-stream from Android to a server over Wi-Fi. C# code would be much appreciated. Third-party libraries are also welcome, as long as they are free for commercial use. Any help and ideas are welcome, and thanks in advance.
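Since C# is wanted, here is a minimal sketch of the receiving end under the MediaRecorder-to-socket approach from the linked answer; the port number and file name are arbitrary placeholders:

    using System;
    using System.IO;
    using System.Net;
    using System.Net.Sockets;

    class StreamReceiver
    {
        static void Main()
        {
            // Listen on port 5000 (arbitrary choice) for the Android device,
            // which writes MediaRecorder output into a connected TCP socket.
            var listener = new TcpListener(IPAddress.Any, 5000);
            listener.Start();
            Console.WriteLine("Waiting for the Android device...");

            using TcpClient client = listener.AcceptTcpClient();
            using NetworkStream net = client.GetStream();
            using FileStream file = File.Create("incoming.mp4");

            // Copy bytes as they arrive. Note: MediaRecorder's MP4 muxer
            // writes its index (moov atom) only on stop, so the file is
            // playable only after recording ends; for true live playback you
            // would need a streamable container (e.g. MPEG-TS) or an
            // RTSP/RTMP library on the Android side.
            net.CopyTo(file);
            Console.WriteLine("Stream ended.");
        }
    }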
I'm making a program (in C#) to run a livestream from the XSplit Encoder (RTMP server). I need a site where people can watch that stream, and the player needs a button so spectators can choose their video quality; the stream has to be fluid and of good quality. Can someone explain this or send me a link that shows how? Please.
This is a VERY large undertaking, and an impossible question to answer unless you narrow the scope. You need an ingest server that takes in RTMP. You need a machine with enough CPU power to do all the transcodes. You need a site to play back on. You also need enough bandwidth (a CDN) for all your viewers. How many viewers do you need to support? What platforms do you want to play back on? iOS? Then you need HLS. Web? Then you need RTMP. Or you can use DASH if you're OK with limiting it to modern browsers. Do you want it to play in Firefox? Then it MUST be Flash, because Firefox does not have native support for the H.264 codec. But Flash won't play on iOS. You can use JWPlayer Premium, which will play HLS in Flash. Also, is H.264 actually the codec you intend to use? Have you looked into services such as Zencoder live transcoding, or Wowza with the transcoding module? Amazon offers preconfigured Wowza instances. What is your budget for this project? Why not just use Twitch?
Edit: You can probably string something together using ffmpeg:
https://trac.ffmpeg.org/wiki/StreamingGuide
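For example (a sketch only: the RTMP URL, output paths and bitrate ladder are placeholders, and it assumes an ffmpeg build with libx264 on the PATH), a C# wrapper could pull the XSplit ingest and emit two HLS renditions for the player's quality button:

    using System.Diagnostics;

    class HlsTranscoder
    {
        static void Main()
        {
            // Pull the RTMP ingest and write two HLS renditions the web
            // player can switch between.
            var args =
                "-i rtmp://localhost/live/mystream " +
                // 720p rendition, ~2.5 Mbit/s video plus AAC audio:
                "-c:v libx264 -b:v 2500k -s 1280x720 -c:a aac -b:a 128k " +
                "-f hls -hls_time 4 -hls_list_size 5 hi/index.m3u8 " +
                // 360p rendition for slower connections:
                "-c:v libx264 -b:v 600k -s 640x360 -c:a aac -b:a 96k " +
                "-f hls -hls_time 4 -hls_list_size 5 lo/index.m3u8";

            using var ffmpeg = Process.Start("ffmpeg", args);
            ffmpeg.WaitForExit();
        }
    }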
As you already know, when you open up YouTube, there is an option for 1080p videos.
My question is, why doesn't the server treat a client as a media player?
So basically, the client sends ONLY input to the server (as in, what buttons are pressed), and the server does the drawing and sends the result back to the client as video, just as YouTube would.
Maybe it might be hard for the server to do all the drawing for several players in a game, but this would prevent cheating almost 100%. (Color-related hacks won't stop.)
Should I put this on GameDev?
But wouldn't this be a good idea if the server is a supercomputer and there aren't many players?
This is what companies like OnLive are already offering: http://www.onlive.com/
You just have a problem with latency. When you are playing a shooter you want the lowest latency possible. Gamers optimize this with a faster mouse, a faster screen with less ghosting, etc.
When your game is delivered like a video, the latency (ping) becomes very large, because every frame must be rendered on the server, encoded, sent over the network, decoded and displayed before you see the result of your own input. The only difference is that everyone is lagging.
Servers tend to focus on the game logic rather than any rendering; they are responsible for keeping control and synchronising clients, which allows the clients to perform the more complex calculations that are part of the rendering process. If we change that, we face an entirely different problem...
A video stream is simply a series of images; a fully featured game by today's standards is built up of a full 3D environment in which complex interactions and physics take place.
That of course raises the question: how does the server render all those viewpoints at once in order to produce the video streams? Does it need one GPU per player, like your home PC?
You also have to consider that the internet is unreliable when it comes to bandwidth. The average user's home broadband connection gets fits and spurts of bandwidth, which is why video streaming always seems smooth while gaming is laggy.
Think about it this way ...
In a game I hit the forward key and all other players around me need to know this instantly.
So I need on-demand bandwidth as soon as I do something or as soon as something around me happens.
In a normal video stream, as long as the server can get the first 30 seconds to me in a hurry, the rest can come down as bandwidth allows.
So I might be able to get a quick burst of data for 30 seconds' worth of video, but then for the next, say, 25 seconds I need no data at all from the server because I have enough to keep the video player going, something that would rarely happen in a game.
For example ...
How often does someone do something unexpected in Call of Duty?
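A back-of-the-envelope comparison makes the difference concrete (all numbers below are illustrative assumptions, not measurements):

    using System;

    class StreamingBudget
    {
        static void Main()
        {
            // Classic client/server game traffic: a small input/state packet
            // per tick (both numbers are assumptions for illustration).
            const int inputBytesPerPacket = 64;
            const int ticksPerSecond = 60;
            const double gameKbps = inputBytesPerPacket * ticksPerSecond * 8 / 1000.0;

            // Video-streamed gameplay: a 720p30 H.264 feed at a typical bitrate.
            const double videoKbps = 5000;

            Console.WriteLine($"Input-only traffic : {gameKbps:F1} kbit/s, but every packet is needed instantly");
            Console.WriteLine($"Rendered video feed: {videoKbps:F0} kbit/s, sustained, and it cannot be buffered ahead");

            // Per-frame latency budget for the video approach (assumed values):
            const double renderMs = 16, encodeMs = 10, networkMs = 40, decodeMs = 8;
            Console.WriteLine($"Extra delay before you see your own input: ~{renderMs + encodeMs + networkMs + decodeMs} ms");
        }
    }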
I am currently working on a project that requires live video streaming in real time, which could be from a file or a webcam cast, just like a video chat or ustream.tv. I don't quite understand how that works, and how the bitrate changes dynamically depending on the bandwidth of the viewer and the device type.
For example: if I am using a 2 Mbps connection then the video is streamed to me at a high bitrate, and at a lower one for mobile devices with less bandwidth.
I have gone through examples and searched for software like Microsoft's Smooth Streaming, but I want this to be totally custom-made.
The streaming option in VLC media player is an example, if you have used it; it supports online streaming of videos and also live streaming.
Please help.
Look at the Microsoft Media Platform: Player Framework.
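For a custom implementation, the core of every adaptive scheme (Smooth Streaming, HLS, DASH) is the same client-side loop: the video is encoded at several bitrates and cut into short segments, the client times each segment download, and it then picks the highest rendition that fits within the measured throughput. A minimal sketch of that heuristic (the bitrate ladder and URL scheme are hypothetical):

    using System;
    using System.Diagnostics;
    using System.Net.Http;
    using System.Threading.Tasks;

    class AdaptiveClient
    {
        // Available renditions in bits per second (hypothetical ladder).
        static readonly int[] BitrateLadder = { 400_000, 1_200_000, 2_500_000, 5_000_000 };

        static async Task Main()
        {
            using var http = new HttpClient();
            int current = 0; // start conservatively at the lowest rendition

            for (int segment = 0; segment < 10; segment++)
            {
                // Hypothetical segment URL scheme: one folder per bitrate.
                var url = $"http://example.com/video/{BitrateLadder[current]}/seg{segment}.ts";

                var sw = Stopwatch.StartNew();
                byte[] data = await http.GetByteArrayAsync(url);
                sw.Stop();

                // Measured throughput for this segment, in bits per second.
                double bps = data.Length * 8 / sw.Elapsed.TotalSeconds;

                // Pick the highest rendition that fits within ~80% of the
                // measured throughput, leaving headroom against stalls.
                current = 0;
                for (int i = BitrateLadder.Length - 1; i >= 0; i--)
                {
                    if (BitrateLadder[i] <= bps * 0.8) { current = i; break; }
                }
                Console.WriteLine($"seg{segment}: {bps / 1e6:F2} Mbit/s measured -> next segment at {BitrateLadder[current] / 1e6:F1} Mbit/s");
            }
        }
    }

Server-side, all that is required is plain HTTP hosting of the segment files plus a manifest listing the available bitrates; the switching intelligence lives entirely in the client.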