Stream Audio/Video from an iPhone app using HTTP Live Streaming
If you really want to stream from an iPhone app, you can't do it with the iPhone acting as a server. You need a separate server to which the iPhone app sends data. You can use the camera or the microphone in the app to capture live content and then send the data asynchronously to the server, which uses mediastreamsegmenter and variantplaylistcreator to convert the data to .ts segments and append them to the end of the m3u8 file. Meanwhile, another iPhone app can act as a client and watch the live content that you are streaming from the first app. From my experience this is the only way to achieve that. Hope that helps.
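
On the viewing side, the client app only needs to point AVPlayer at the live playlist that the server keeps updating. A minimal Swift sketch, assuming a hypothetical playlist URL published by your server:

import AVFoundation
import UIKit

// Minimal sketch of the viewer side: point AVPlayer at the live playlist that
// mediastreamsegmenter keeps appending to. The URL is a placeholder.
final class LiveViewerController: UIViewController {
    private var player: AVPlayer?

    override func viewDidLoad() {
        super.viewDidLoad()
        let playlistURL = URL(string: "http://example.com/live/stream.m3u8")!  // hypothetical
        let player = AVPlayer(url: playlistURL)
        let layer = AVPlayerLayer(player: player)
        layer.frame = view.bounds
        view.layer.addSublayer(layer)
        player.play()
        self.player = player
    }
}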

Categories : Iphone

Stream video from iPhone to web server - compress and stream step
Take a look at http://www.gdcl.co.uk/2013/02/20/iOS-Video-Encoding.html. This sample uses an AVAssetWriter to encode the video from the camera to H.264 and then uses RTSP/RTP to stream the data to a player.

Categories : Iphone

Live video stream using OGG video chunks
I don't believe there are many popular libraries for OGG encoding out there. However, it would probably be best to create your own. I would start here: http://tools.ietf.org/html/rfc5334

Categories : C#

Stream Live mp3 from Icecast to iPhone app
AVPlayer *player = [[AVPlayer alloc] initWithURL:[NSURL URLWithString:urlString]];
[[NSNotificationCenter defaultCenter] addObserver:self
                                         selector:@selector(playerItemDidReachEnd:)
                                             name:AVPlayerItemDidPlayToEndTimeNotification
                                           object:[player currentItem]];
[player addObserver:self forKeyPath:@"status" options:0 context:nil];
[player play];

- (void)observeValueForKeyPath:(NSString *)keyPath ofObject:(id)object change:(NSDictionary *)change context:(void *)context {
    if (object == player && [keyPath isEqualToString:@"status"]) {
        if (player.status == AVPlayerStatusFailed) {
            NSLog(@"AVPlayer Failed");
        } else if (player.status == AVPlayerStatusReadyToPlay) {
            NSLog(@"AVPlayer Ready to Play");
        }
    }
}

Categories : Iphone

How to convert Live video stream from YUV(HDYC) to RGB
I am not sure why you tagged this h.264, because HDYC is a flavor of the UYVY pixel format (same layout and subsampling), just with the ITU-R Rec. 709 color space. So your question is really how to convert BT.709 YUV to RGB with FFmpeg. FFmpeg's libswscale can do this: its sws_scale does the conversion, and its sws_setColorspaceDetails lets you provide the color space details for the conversion.

/**
 * Scale the image slice in srcSlice and put the resulting scaled
 * slice in the image in dst. A slice is a sequence of consecutive
 * rows in an image. [...]
 */
int sws_scale(struct SwsContext *c, const uint8_t *const srcSlice[],
              const int srcStride[], int srcSliceY, int srcSliceH,
              uint8_t *const dst[], const int dstStride[]);

/** [...]
 * @param table the yuv2rgb coefficients describing the output yuv space, normally ff_yuv2rgb_coeffs[x]
 * [...]
 */

Categories : C++

How can I play Apple HLS live stream using html5 video tag
These are the formats you can play using HTML5 source tags. Think of a video format as a zip file that contains the encoded video stream and the audio stream. The three formats you should care about for the web are webm, mp4 and ogv:

.mp4 = H.264 + AAC
.ogg/.ogv = Theora + Vorbis
.webm = VP8 + Vorbis

Categories : HTML

Smooth video playback with HTTP Live Stream
The basic strategy is more or less as follows. You start by processing the manifest file and downloading the first few segments to fill your buffer. You begin playback once you are happy you have enough data in the buffer, while continuing to download further segments sequentially until you reach the end of the manifest, at which point you request the manifest again. If you find new segments in the refreshed manifest, you add their URLs to your download queue. If you do not, you wait for a predetermined period of time and refresh the manifest again. For example, your client could poll the M3U8 manifest at an interval derived from the segment durations, say (segment duration * number of segments) / 2. I know some commercial players enter a paranoid mode when the playback buffer is getting low and the refreshed manifest does not yet contain new segments, polling more aggressively until new segments appear. The sketch below illustrates the refresh loop.
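
A rough Swift sketch of that polling loop, assuming a hypothetical download queue: the enqueueDownload helper, the poll interval and the URL are placeholders, and real code would also handle relative URIs and #EXT-X-ENDLIST.

import Foundation

// Sketch of the manifest refresh loop described above.
final class ManifestPoller {
    private let manifestURL: URL
    private let pollInterval: TimeInterval
    private var knownSegments = Set<String>()

    init(manifestURL: URL, pollInterval: TimeInterval) {
        self.manifestURL = manifestURL
        self.pollInterval = pollInterval
    }

    func start() {
        refresh()
    }

    private func refresh() {
        URLSession.shared.dataTask(with: manifestURL) { data, _, _ in
            if let data = data, let manifest = String(data: data, encoding: .utf8) {
                // Every non-comment, non-empty line in an M3U8 playlist is a segment URI.
                let segments = manifest.split(separator: "\n")
                    .map(String.init)
                    .filter { !$0.hasPrefix("#") && !$0.isEmpty }
                for segment in segments where !self.knownSegments.contains(segment) {
                    self.knownSegments.insert(segment)
                    self.enqueueDownload(of: segment)
                }
            }
            // Whether or not new segments appeared, poll again after the chosen interval.
            DispatchQueue.main.asyncAfter(deadline: .now() + self.pollInterval) {
                self.refresh()
            }
        }.resume()
    }

    private func enqueueDownload(of segmentURI: String) {
        // Hand the segment URI to the download queue feeding the playback buffer.
        print("queueing segment \(segmentURI)")
    }
}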

Categories : Misc

Live mjpeg video stream from server to android
You are prompted for a login because the webcam is password protected. Normally with webcams you have to pass the username and password as part of the URL, e.g. username:password@ip.address.or.dyndns.com:8085/folder/stream.jpg?size=large, where 8085 is the port number.

Categories : Android

recording live video stream from tv card using ffmpeg in windows
First, be sure that the video label you use is really the label returned by:

ffmpeg -list_devices true -f dshow -i dummy

More info here. Another solution is to use the old "Video for Windows" (VFW) interface. To try that, list your devices with:

ffmpeg -y -f vfwcap -i list

and use your device number as the value of the -i option:

ffmpeg -y -f vfwcap -r 25 -i 0 out.mp4

If you are finally able to record your stream, there are different options, but in your case everything is clearly described here:

ffmpeg -y -f vfwcap -r 25 -i 0 -f image2 -vf fps=fps=1 out%d.jpg

Categories : Windows

Can Weborb be used to do live video streaming from an iPhone through a media server?
The short answer is yes, the RTMP library for iOS can be used with Red5, FMS, WebORB etc. The library is not a server itself but a client: it establishes the RTMP connection to the server and encodes the stream before sending it. As I remember, the library distribution contains some examples that demonstrate how streaming works. Unfortunately, the official site doesn't show any examples related to streaming; the available examples can still be useful to start working with the library (http://www.themidnightcoders.com/products/weborb-for-mobile/ios-integration/rtmp-ios-examples-integration-between-java-net-and-ios.html). The documentation looks up to date - http://www.themidnightcoders.com/fileadmin/docs/ios/.

Categories : IOS

Which protocol should be used to send video stream to a media server for live streaming?
Please refer to these links; they may be useful to you. http://developer.apple.com/library/ios/#documentation/networkinginternet/conceptual/streamingmediaguide/Introduction/Introduction.html http://www.rambla.be/state-play-overview-streaming-protocols http://stackoverflow.com/questions/2719958/how-to-use-http-live-streaming-protocol-in-iphone-sdk-3-0

Categories : Iphone

How can we get H.264 encoded video stream from iPhone Camera?
Short answer: you can't; the sample buffers you receive are uncompressed. The methods for getting hardware-accelerated H.264 compression are AVAssetWriter and AVCaptureMovieFileOutput. As you can see, both write to a file. Writing to a pipe does not work, because the encoder updates header information after a frame or GOP has been fully written. So you'd better not touch the file while the encoder is writing to it, as it rewrites header information at arbitrary times. Without this header information the video file will not be playable (the encoder updates the size field, so the first header written says the file is 0 bytes). Writing directly to a memory area is not currently supported. But you can open the encoded video file and demux the stream to get to the H.264 data (after the encoder has closed the file, of course).
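
As a hedged illustration of the AVAssetWriter route in Swift (the dimensions, output URL and file type are placeholders, and the capture-session plumbing is omitted):

import AVFoundation

// Sketch: hardware H.264 encoding to a file, which can be demuxed afterwards.
func makeWriter(outputURL: URL) throws -> (AVAssetWriter, AVAssetWriterInput) {
    let writer = try AVAssetWriter(outputURL: outputURL, fileType: .mov)
    let settings: [String: Any] = [
        AVVideoCodecKey: AVVideoCodecType.h264,
        AVVideoWidthKey: 1280,
        AVVideoHeightKey: 720
    ]
    let input = AVAssetWriterInput(mediaType: .video, outputSettings: settings)
    input.expectsMediaDataInRealTime = true   // feeding from a live capture session
    writer.add(input)
    return (writer, input)
}

// In the AVCaptureVideoDataOutput delegate you would then call
// input.append(sampleBuffer) while input.isReadyForMoreMediaData is true,
// and writer.finishWriting(completionHandler:) once the capture stops.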

Categories : Iphone

using ffmpeg to display video on iPhone
You should not be using ffmpeg in an iOS app for a number of reasons. First, there are real licensing issues that put including ffmpeg in a legal grey area when it comes to iOS apps. Second, performance will be very poor. Third, iOS already includes APIs that have access to the h.264 hardware on the device. You can find my example Xcode project at AVDecodeEncode; this is an example of using my library to decode from h.264 and then encode back to h.264.

Categories : Iphone

I want to display a videothumbnail image of a video from url in my uiimageview for iphone
An example:

NSURL *videoURL = [NSURL fileURLWithPath:videoPath];
AVURLAsset *asset = [[AVURLAsset alloc] initWithURL:videoURL options:nil];
AVAssetImageGenerator *generate = [[AVAssetImageGenerator alloc] initWithAsset:asset];
generate.appliesPreferredTrackTransform = YES;
NSError *err = NULL;
CMTime time = CMTimeMake(1, 60);
CGImageRef imgRef = [generate copyCGImageAtTime:time actualTime:NULL error:&err];
UIImage *img = [[UIImage alloc] initWithCGImage:imgRef];
CGImageRelease(imgRef); // copyCGImageAtTime returns a retained image
[YourImageView setImage:img];

Hope it helps.

Categories : Iphone

How to display the video captured from iPhone's camera on the screen without saving it?
You want an AVCaptureVideoPreviewLayer. https://developer.apple.com/library/ios/documentation/AudioVideo/Conceptual/AVFoundationPG/Articles/04_MediaCapture.html#//apple_ref/doc/uid/TP40010188-CH5-SW22 As far as what to write in your IBAction method, that will depend on how you want your app to behave. If you want the view to load with the capture session and preview already running, you don't really need a button. If you're starting your capture session via the button, just put the preview-layer code in the same method (after you have initialized your capture session).
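
A minimal Swift sketch of that setup, assuming the default back camera and leaving camera-authorization checks aside:

import AVFoundation
import UIKit

// Wire the default camera into a capture session and show it with
// AVCaptureVideoPreviewLayer. Error handling is kept to a minimum.
final class PreviewViewController: UIViewController {
    private let session = AVCaptureSession()

    override func viewDidLoad() {
        super.viewDidLoad()
        guard let camera = AVCaptureDevice.default(for: .video),
              let input = try? AVCaptureDeviceInput(device: camera),
              session.canAddInput(input) else { return }
        session.addInput(input)

        let previewLayer = AVCaptureVideoPreviewLayer(session: session)
        previewLayer.frame = view.bounds
        previewLayer.videoGravity = .resizeAspectFill
        view.layer.addSublayer(previewLayer)

        session.startRunning()
    }
}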

Categories : Iphone

How to Send and Receive Live audio from one iPhone mic to the another iPhone speaker?
To use the NSData, just call its writeToFile:atomically: method. Pass in the NSTemporaryDirectory() path with a path component such as "temp.mp4" appended. At that point you have a file path with valid data that you can use to load the audio asset; AVAudioPlayer can load from a URL. Or, with iOS 7.0+, you can initialize the AVAudioPlayer directly with the NSData object via initWithData:fileTypeHint:error: (in Swift, init(data:fileTypeHint:) throws).
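
A small Swift sketch of both options; the file name and the file-type hint are assumptions, not something the original answer specifies:

import AVFoundation
import Foundation

// Play received audio bytes either via a temporary file or directly from Data.
func playReceivedAudio(_ audioData: Data) throws -> AVAudioPlayer {
    // Option 1: write the bytes to a temporary file and load it from the URL.
    let fileURL = URL(fileURLWithPath: NSTemporaryDirectory())
        .appendingPathComponent("temp.mp4")   // hypothetical file name
    try audioData.write(to: fileURL, options: .atomic)
    let playerFromFile = try AVAudioPlayer(contentsOf: fileURL)

    // Option 2 (iOS 7+): skip the file and initialize the player from the data directly.
    let playerFromData = try AVAudioPlayer(data: audioData,
                                           fileTypeHint: AVFileType.mp4.rawValue)

    playerFromData.prepareToPlay()
    playerFromData.play()
    _ = playerFromFile   // kept only to show the alternative path
    return playerFromData
}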

Categories : Iphone

Best Route to Stream Music from iPhone to iPhone
To capture and play the streamed audio you can use Audio File Stream Services. To stream the audio data to another iPhone, take a look at the Multipeer Connectivity framework introduced in iOS 7. It allows you to stream data to nearby peers through a multipeer connectivity session.
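
As a hedged Swift sketch of the sending side (peer discovery, session setup and the audio pipeline are out of scope here, and the stream name is arbitrary):

import MultipeerConnectivity

// Open an output stream to a connected peer with Multipeer Connectivity.
func openAudioStream(in session: MCSession, to peer: MCPeerID) -> OutputStream? {
    do {
        let stream = try session.startStream(withName: "live-audio", toPeer: peer)
        stream.schedule(in: .main, forMode: .default)
        stream.open()
        return stream   // write encoded or PCM audio bytes to this stream
    } catch {
        print("could not start stream: \(error)")
        return nil
    }
}

// The receiving peer gets the matching InputStream in the MCSessionDelegate
// callback session(_:didReceive:withName:fromPeer:) and reads the audio from it.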

Categories : Iphone

I want to stream video from my webserver, but I only want it to stream when there is a viewer
Try the Page Visibility API. Supported browsers: http://caniuse.com/#feat=pagevisibility

Categories : Javascript

Live stream extensions in android
Check this answer: Video Streaming over wifi? If you want to view the live stream on an Android phone, you can include the VLC plugin inside your application and connect through the Real Time Streaming Protocol (RTSP):

Intent i = new Intent("org.videolan.vlc.VLCApplication.gui.video.VideoPlayerActivity");
i.setAction(Intent.ACTION_VIEW);
i.setData(Uri.parse("rtsp://10.0.0.179:8086/"));
startActivity(i);

If you have VLC installed on your Android phone, you can start the stream using an intent and pass the IP address and port number as shown.

Categories : Android

(iPhone) Live FFT from iPod
You can't get the currently playing audio (security sandbox prevents this) on iOS, unless your app is the one playing the audio using certain select APIs (Audio Queue, RemoteIO, etc.) 3 bandpass filters (made with IIR biquads) will be faster than an FFT. But even a full FFT will use a very small percentage of CPU time. An app can't convert or play protected music from the iTunes library in a form where samples can be captured. The FFT is in the Accelerate framework, not in the audio session.
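
To give a sense of how cheap this is with the Accelerate framework, here is a rough Swift sketch of a real-to-complex FFT producing a magnitude spectrum. The buffer handling is simplified and the function name is mine, not something from the answer.

import Accelerate
import Foundation

// Hypothetical helper: magnitude spectrum of a mono Float buffer using vDSP's real FFT.
func magnitudeSpectrum(of samples: [Float]) -> [Float] {
    let n = samples.count                          // must be a power of two
    let log2n = vDSP_Length(log2(Float(n)))
    guard let setup = vDSP_create_fftsetup(log2n, FFTRadix(kFFTRadix2)) else { return [] }
    defer { vDSP_destroy_fftsetup(setup) }

    var real = [Float](repeating: 0, count: n / 2)
    var imag = [Float](repeating: 0, count: n / 2)
    var magnitudes = [Float](repeating: 0, count: n / 2)

    real.withUnsafeMutableBufferPointer { realPtr in
        imag.withUnsafeMutableBufferPointer { imagPtr in
            var split = DSPSplitComplex(realp: realPtr.baseAddress!, imagp: imagPtr.baseAddress!)
            // Pack the real input into split-complex form, run the FFT, take squared magnitudes.
            samples.withUnsafeBufferPointer { buf in
                buf.baseAddress!.withMemoryRebound(to: DSPComplex.self, capacity: n / 2) {
                    vDSP_ctoz($0, 2, &split, 1, vDSP_Length(n / 2))
                }
            }
            vDSP_fft_zrip(setup, &split, 1, log2n, FFTDirection(kFFTDirection_Forward))
            vDSP_zvmags(&split, 1, &magnitudes, 1, vDSP_Length(n / 2))
        }
    }
    return magnitudes
}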

Categories : Iphone

Live JPEG stream from webcam without using Flash
This can be done if you have a bunch of JPEG images and some creative JavaScript, but it won't be easy. You can always use a setInterval loop to place an image and swap it with the previous one, but depending on your frame rate this might not go as smoothly as you hope. Also bear in mind that the images have to be loaded, so you might want to delay the display of the images and buffer them before trying to swap them into place (for example, don't make the stream truly "live", but delay it a few seconds so the images can be buffered).

Categories : Javascript

HTML to check if RTMP stream is live
I doubt there's a way of doing this just using HTML. You can use rtmpdump if you want to get a little technical: http://blog.svnlabs.com/how-to-check-rtmp-source-stream-is-live-or-not/ http://www.youtube.com/watch?v=EN-UYi0UVGE

Categories : Misc

WebRTC - scalable live stream broadcasting / multicasting
"Scalable" broadcasting is not possible on the Internet, because the IP UDP multicasting is not allowed there. But in theory it's possible on a LAN. The problem with Websockets is that you don't have access to RAW UDP by design and it won't be allowed. The problem with WebRTC is that it's data channels use a form of SRTP, where each session has own encryption key. So unless somebody "invents" or an API allows a way to share one session key between all clients, the multicast is useless.

Categories : Javascript

Live Playback of PCM (wave) Audio from Network Stream
I don't think you can manage it without popping sounds using SoundPlayer, because there shouldn't be any delay in pushing the buffers. Normally you should always have one extra buffer queued, but SoundPlayer only buffers one. Even when SoundPlayer raises an event saying it is ready, you're already too late to play the next sound. I advise you to check this link: Recording and Playing Sound with the Waveform Audio Interface, http://msdn.microsoft.com/en-us/library/aa446573.aspx. There are some examples of the SoundPlayer (skip those), but also examples of how to use WaveOut; look at the section "Playing with WaveOut". SoundPlayer is normally meant for notification sounds.

Categories : C#

Android live stream never starts after MEDIA_INFO_BUFFERING_START with MediaPlayer
http://developer.android.com/guide/topics/media/mediaplayer.html You must read the section on "Using wake locks". To ensure that the CPU continues running while your MediaPlayer is playing, call the setWakeMode() method when initializing your MediaPlayer. Once you do, the MediaPlayer holds the specified lock while playing and releases it when paused or stopped:

mMediaPlayer = new MediaPlayer();
// ... other initialization here ...
mMediaPlayer.setWakeMode(getApplicationContext(), PowerManager.PARTIAL_WAKE_LOCK);

However, the wake lock acquired in this example guarantees only that the CPU remains awake. If you are streaming media over the network and you are using Wi-Fi, you probably want to hold a WifiLock as well, which you must acquire and release manually: acquire it when you start preparing the MediaPlayer and release it when you pause or stop playback.

Categories : Android

live stream desktop to android tablet and delay
What are you using on the Android device to view the video? The question is quite generic. Are you just sending raw frames to the receiver? In that case they might be quite heavy and take some time to process; see if you can encode them and stream the result over the network instead. Secondly, it also depends on network latency: how good is your network? Give it a try on a WLAN first and then try it between two public IP addresses. What is the size of your jitter buffer at the receiver? If you have a large jitter buffer, players set a percentage limit it must fill before playback actually starts, so a large jitter buffer can take a long time to fill and add initial delay to your video. So, in your test cases, disable the jitter buffer. I could also blame the decoding capabilities of the receiving device.

Categories : Android

Recording live stream from IP camera (MJPEG Compression)
According to the Wikipedia page about MJPEG (http://en.wikipedia.org/wiki/Motion_JPEG#M-JPEG_over_HTTP), an MJPEG stream over HTTP is basically a sequence of JPEG frames accompanied by a special MIME type. In order to capture these and save them to a video file, I am not sure you can simply write the incoming data to an .mpg file and have a working video. To be honest, I am not quite sure why your script does not write anything at all, but I came across the following page, which, although written for specific software, provides examples of how to capture an MJPEG stream and pass it on to a browser: http://www.lavrsen.dk/foswiki/bin/view/Motion/MjpegFrameGrabPHP You could try one of their examples and, instead of passing the frames to the browser, save them to a file. You can see how they read the stream frame by frame, which you could adapt for your purposes.

Categories : PHP

Rails 4 live stream + subscription based updates
You can use ensure to make sure your stream gets closed once writing has failed, the connection has closed, or something else has gone amiss:

def your_action
  response.headers['Content-Type'] = 'text/event-stream'
  begin
    # Do your writing here:
    response.stream.write("data: ... ")
  rescue IOError
    # When the client disconnects, writing will fail, throwing an IOError
  ensure
    response.stream.close
  end
end

Categories : Ruby On Rails

How to use live() to detect when an element attribute switches from display block to display none?
Sounds like you want to "watch" the value of style.display on that div. That's not impossible, but it is impractical and unstable. The easiest solution is to add a new change event handler on the dropdown itself. For that, you can use .live (for that jQuery version), .delegate, or .change (as long as you do it when the DOM is loaded), and make sure to register that event after the original event handler for the dropdown is added. To deal with the timeout, set a timer from your handler, and make sure it's longer than the maximum time used by the other timer. For example:

$(function() {
  $('#dropdown').change(function() {
    setTimeout(function() {
      console.log('checking display property');
      if ($('#myitem').css('display') == 'none') {
        console.log('the item is now hidden');
      }
    }, 600); // illustrative delay - must exceed the other handler's timeout
  });
});

Categories : Jquery

Live streaming video in Qt
Decoding is not a problem: you can use the QMediaPlayer class, which under Linux actually uses the GStreamer libs. Encoding is not implemented; that is a bit beyond the scope of Qt 5. And don't let the classes QMediaRecorder or QVideoEncoderSettings confuse you: they are mainly helper classes for QCamera or QRadioTuner, which provide an already-encoded stream from their devices. If you need to encode single images into video frames and put them into a container, you need a third-party lib such as ffmpeg. Streaming a video from one PC to another is easily implemented with Qt 5; encoding or transcoding cannot be done with Qt 5 alone.

Categories : Qt

How do you create a video live wallpaper
For creating a video live wallpaper you should first have a look at an open-source project; for that you can check this Project, and for a tutorial you can have a look at this site.

Categories : Android

Is it possible to retrieve if a Youtube video is live?
I've figured it out: result.snippet contains a key called tags that holds three tags when the video is a Hangout on Air (#hoa, #HangoutOnAir and something else). That way I can tell whether or not it is a HOA.

Categories : Javascript

upload video clips display video thumb images not working? php
In my opinion you are using the wrong thumbnail dimensions. Echo the values to check what you are getting before calling the function:

<?php
echo $th_width . " " . $th_height; // check their values
$im_new = imagecreate($th_width, $th_height);
?>

Categories : PHP

How to convert youtube video url to embed code for displaying video on our site?
<iframe width="420" height="315" src="//www.youtube.com/embed/UNIQUE_ID?rel=0" frameborder="0" allowfullscreen></iframe>

Just replace UNIQUE_ID with MFjSVGsYXLs. As for the regex, this one should capture the trailing string after v=: [^&v=]+$

Categories : PHP

Capture live video from webcam using Java an jmf
Yes, that works fine, but install JMF and set the path and classpath properly. JMF installation: http://www.stierlen2.homepage.t-online.de/modellbau/3dscanner/jmf_installation.htm

Categories : Java

Video Compression Does not work for a live Capture
ICaptureGraphBuilder::SetOutputFileName is not a good choice of API for setting the graph up. It does the job well for simple graphs, but because it reports errors back without a good description of what failed or at which stage, you have a hard time understanding what went wrong whenever something does. The problem might be caused by the absence of frame rate information in the media type on the output of the video compressor, but at the building stage shown in your screenshot you don't even have this media type available yet, so you cannot troubleshoot it. Use IGraphBuilder::AddFilter, IGraphBuilder::Connect and IFileSinkFilter::SetFileName instead to configure the pipeline reliably.

Categories : C#

Streaming Live Video From Webcam Android
Via file transfer, it is easy to read video files. Here's the answer from another post on SO, and one more. The code from that post is as follows:

public static Bitmap getVideoFrame(FileDescriptor FD) {
    MediaMetadataRetriever retriever = new MediaMetadataRetriever();
    try {
        retriever.setDataSource(FD);
        return retriever.getFrameAtTime();
    } catch (IllegalArgumentException ex) {
        ex.printStackTrace();
    } catch (RuntimeException ex) {
        ex.printStackTrace();
    } finally {
        try {
            retriever.release();
        } catch (RuntimeException ex) {
        }
    }
    return null;
}

Note that this is an offline process. For online real-time transfer, I have no idea.

Categories : Android

How to display a video in a web page with audio and video filters?
You can use the HTML5 audio tag, for example: <audio controls="controls" autoplay="autoplay" src="example.mp3"></audio> To play video, you can use the video tag in the same way; for more information, read up on HTML5.

Categories : PHP

How can I choose which live video to play from red5 server using AS3?
You need to have a list of the published stream IDs and then play one using a NetStream object:

var ns:NetStream = new NetStream(netConnection);
video.attachNetStream(ns);
ns.play("streamId");

Categories : Actionscript

How to streaming live video from iOS to Flash Media Server
Refer to the following links for iOS; they will give you an idea of how to move forward. Hope it helps. 1. http://www.adobe.com/products/flash-media-streaming.html 2. http://www.adobe.com/products/amazon-web-services.html 3. http://www.wowza.com/

Categories : Iphone


