Segfault after 15 seconds while reading a RTSP stream with ffmpeg in Android
I was using the wrong args to sws_scale. It should have been:

    swscale.sws_scale(img_convert_ctx, new PointerPointer(frame), frame.linesize(),
            0, avCodecContext.height(), new PointerPointer(picrgb), picrgb.linesize());

Categories : Android

Android FFMPEG: Could not execute the ffmpeg from Java code
Do you have root on the device? Mount '/data' and then enter your same 'ffmpeg' command in the shell and see whether the error is the same. Try using the shell to test out different command expressions. Try 'ffmpeg' alone and with just one input file. See whether those commands produce expected output. My wild guess would be that there is an issue with calling 'ffmpeg.main()' that relates to the details of your build.
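Testing the command from a shell, as suggested above, can also be scripted from Java with ProcessBuilder. Below is a hedged sketch of a generic command runner; the ffmpeg path in the comment is an assumption for illustration, and the demo in main() uses a harmless echo instead of ffmpeg:

```java
import java.io.BufferedReader;
import java.io.InputStreamReader;

public class CommandRunner {
    // Runs an external command and returns its combined stdout/stderr.
    // On a rooted device you might pass the path to your ffmpeg binary,
    // e.g. runCommand("/data/local/ffmpeg", "-i", "in.mp4", "out.avi")
    // (the binary location is an assumption; adjust to your build).
    public static String runCommand(String... cmd) throws Exception {
        ProcessBuilder pb = new ProcessBuilder(cmd);
        pb.redirectErrorStream(true); // merge stderr into stdout, like a shell does
        Process p = pb.start();
        StringBuilder out = new StringBuilder();
        try (BufferedReader r = new BufferedReader(
                new InputStreamReader(p.getInputStream()))) {
            String line;
            while ((line = r.readLine()) != null) {
                out.append(line).append('\n');
            }
        }
        p.waitFor();
        return out.toString();
    }

    public static void main(String[] args) throws Exception {
        // Demonstrated with a harmless command; substitute your ffmpeg invocation.
        System.out.print(runCommand("echo", "hello"));
    }
}
```

Comparing the output of the same command run this way and run manually in the shell helps isolate whether the failure is in your build or in how the command is invoked.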

Categories : Android

Error in ffmpeg when reading from UDP stream
Incoming frames need to be handled in a callback function, so the mechanism should be such that a callback gets called whenever there is a new frame. That way there is no need to manually fine-tune the delay. Disclaimer: I have not used the ffmpeg APIs.
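The push-style callback mechanism described above can be sketched in plain Java. This is a minimal illustration (the interface and class names are invented for the example, not part of any ffmpeg API); in a real player the decoder loop would drive deliver():

```java
import java.util.ArrayList;
import java.util.List;

// Sketch of a push-style frame callback: the decode thread calls onFrame()
// whenever a frame arrives, so the consumer never polls or sleeps.
interface FrameListener {
    void onFrame(byte[] frameData);
}

class FrameSource {
    private final List<FrameListener> listeners = new ArrayList<>();

    void addListener(FrameListener l) {
        listeners.add(l);
    }

    // In a real player this would be driven by the demuxer/decoder loop.
    void deliver(byte[] frameData) {
        for (FrameListener l : listeners) {
            l.onFrame(frameData);
        }
    }
}

public class CallbackDemo {
    public static int framesSeen = 0;

    public static void main(String[] args) {
        FrameSource src = new FrameSource();
        src.addListener(frame -> framesSeen++);
        src.deliver(new byte[]{1, 2, 3});
        src.deliver(new byte[]{4, 5, 6});
        // each delivered frame triggered the callback exactly once
        System.out.println(framesSeen);
    }
}
```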

Categories : C++

C api on capturing video stream from webcam using ffmpeg
You are basically repeating the question you are referring to. FFmpeg is the name of both the ready-to-use tool (command-line interface) and the project behind it. Its back end is a set of libraries: libavcodec, libavformat, libswscale, etc. There is no comprehensive documentation for these libraries; instead there are samples, a mailing list, and other sparse resources. The libraries are open source, though, and all of this is quite usable once you get the pieces together. Specific questions on ffmpeg are asked and answered on Stack Overflow as well.

Categories : C

How to copy audio stream using FFMpeg API ( not a command line tool )
Your question is vague without some kind of code to go along with it; trust me, there are a lot of things that can go wrong when using ffmpeg's libraries directly (and on Windows there is no debugging). Unfortunately ffmpeg's libraries are not well documented, so it is generally best to read ffmpeg's source code in order to use them. Find the equivalent command-line options that perform what you want and trace them through ffmpeg's source to see the library calls.

Categories : Opencv

recording live video stream from tv card using ffmpeg in windows
First, be sure that the video label you use is really the label returned by:

    ffmpeg -list_devices true -f dshow -i dummy

More info here. Another solution would be to use the old "Video for Windows" (VFW) interface. To try that, list your devices with:

    ffmpeg -y -f vfwcap -i list

and use your device number as the value of the -i option:

    ffmpeg -y -f vfwcap -r 25 -i 0 out.mp4

If you are then able to record your stream, there are different options, but in your case everything is clearly described here:

    ffmpeg -y -f vfwcap -r 25 -i 0 -f image2 -vf fps=fps=1 out%d.jpg

Categories : Windows

Crossdevice encoding static file to stream in browser using FFMPEG (segmented h264 ?)
As far as I know you won't find a setting which works on every device. I'd recommend you to check the user agent and then use different settings for different devices. This way you could also use device optimized settings.

Categories : Node Js

FFMpeg - Merge multiple rtmp stream inputs to a single rtmp output
Copy the video stream and merge the two mono streams. Try the amerge audio filter:

    ffmpeg -i rtmp://ip:1935/live/micMyStream7 -i rtmp://ip:1935/live/MyStream7 -codec:v copy -filter_complex "[0:a][1:a]amerge" -codec:a aac -strict -2 -f flv rtmp://ip:1935/live/bcove7

...or simply use -ac 2:

    ffmpeg -i rtmp://ip:1935/live/micMyStream7 -i rtmp://ip:1935/live/MyStream7 -codec:v copy -ac 2 -codec:a aac -strict -2 -f flv rtmp://ip:1935/live/bcove7

I added -codec:v copy to stream copy the video instead of re-encoding it. I am unable to try the commands right now, so they are untested, and I will probably not be able to reply until Monday.

Categories : Java

How to build FFmpeg 2.0 for android on linux?
Either ./build_android.sh or ./configure contains stray ^M (carriage return) characters. Open the files with vi and remove this garbage; see "./configure : /bin/sh^M : bad interpreter".

Categories : Android

How to reduce mp4 size by using FFMPEG lib into android
1) Yes, you need to call System.load("libavcodec.so"). You can access the methods via JNI.
2) You need to create JNI methods which are implemented in C and call ffmpeg.

JNI tutorial for Android: http://code.google.com/p/awesomeguy/wiki/JNITutorial
ffmpeg tutorial: http://dranger.com/ffmpeg/

Categories : Android

Using ffmpeg to watermark a capturing video on Android
Here is an example of how to capture a webcam video on my Windows system and draw a running timestamp. To list all devices that can be used as input:

    ffmpeg -list_devices true -f dshow -i dummy

To use my webcam as input and draw the timestamp (with -t 00:01:00, one minute is recorded):

    ffmpeg -f dshow -i video="1.3M HD WebCam" -t 00:01:00 -vf "drawtext=fontfile=Arial.ttf: timecode='00:00:00:00': r=25: x=(w-tw)/2: y=h-(2*lh): fontcolor=white: box=1: boxcolor=0x00000000@1" -an -y output.mp4

The font file Arial.ttf was located in the folder my terminal was in. (Sources: http://trac.ffmpeg.org/wiki/How%20to%20capture%20a%20webcam%20input and http://trac.ffmpeg.org/wiki/FilteringGuide) I hope it may help. Have a nice day ;)

Categories : Android

Android encode video with ffmpeg while it is still recording
Your real-time requirement may lead you away from ffmpeg toward WebRTC and/or HTML5. Some resources:

http://dev.w3.org/2011/webrtc/editor/getusermedia.html (section 5)
https://github.com/lukeweber/webrtc-jingle-client
ondello — they have an API

Rather than going native and trying to get at the video stream, or getting at the framebuffer to acquire a copy of what is in the video buffer and then duplicating the stream and managing a connection (socket or chunked HTTP), you may want to look at these API-type alternatives.

Categories : Android

How to use SoxController Class In ffmpeg library In android ? Any Example
Try this one; I think it will suit you because I have already tested it and it works well:

    /**
     * Trim a wave file
     */
    public class TrimFileWave extends AsyncTask<String, Void, Void> {
        private ProgressDialogUtil progressDialogUtil;
        private Context context;

        public TrimFileWave(Context context) {
            this.context = context;
            progressDialogUtil = new ProgressDialogUtil(context, R.string.progressing);
        }

        @Override
        protected void onPreExecute() {
            progressDialogUtil.show();
        }

        @Override
        protected Void doInBackground(String... params) {
            startTrimFile();
            return null;
        }

        @Override
        protected void onPostExecute(Void unused) {
            progressDialogUtil.dismiss();
        }

        private void startTrimFile() {

Categories : Android

writing custom codecs for android using FFmpeg
You can use ffmpeg as a tool or the ffmpeg set of libraries (libavcodec, libavformat, …) on Android. You can add or change ffmpeg codecs in a cross-platform manner, because this project puts a strong emphasis on platform independence. Alternatively, you can use the MediaCodec API. But there is no way to extend the MediaCodec API (update: it is possible to extend MediaCodec; it is documented at http://source.android.com/devices/media.html#codecs) and no easy way to let ffmpeg use that API.

Categories : Android

Rendering Performance ffmpeg of JAVACV on Android
OK, I finally found a solution, though not the underlying problem. To create the video I was using code like:

    FFmpegFrameRecorder recorder = new FFmpegFrameRecorder(name, width, heigth);
    recorder.start();
    while (imagesinfolder) {
        IplImage img = highgui.cvLoadImage("/path/to/image[i].jpg");
        recorder.record(img);
        i++;
    }

This seems to create a memory problem, maybe because IplImage is not cleaned up by the garbage collector. My solution is to use opencv_core.cvLoadImage instead of highgui.cvLoadImage, and then call opencv_core.cvReleaseImage(img); in every iteration:

    FFmpegFrameRecorder recorder = new FFmpegFrameRecorder(name, width, heigth);
    recorder.start();
    while (imagesinfolder) {
        IplImage img = opencv_core.cvLoadImage("/path/to/image[i].

Categories : Android

I got undefined reference to 'avcodec_register_all' Ffmpeg on android
I am not sure whether this will work, but you can try this:

    LOCAL_PATH := $(call my-dir)

    include $(CLEAR_VARS)
    LOCAL_MODULE := libavcodec
    LOCAL_SRC_FILES := ffmpeg/$(TARGET_ARCH_ABI)/lib/$(LOCAL_MODULE).so
    LOCAL_EXPORT_C_INCLUDES := ffmpeg/$(TARGET_ARCH_ABI)/include
    include $(PREBUILT_SHARED_LIBRARY)

    include $(CLEAR_VARS)
    LOCAL_MODULE := libavdevice
    LOCAL_SRC_FILES := ffmpeg/$(TARGET_ARCH_ABI)/lib/$(LOCAL_MODULE).so
    LOCAL_EXPORT_C_INCLUDES := ffmpeg/$(TARGET_ARCH_ABI)/include
    include $(PREBUILT_SHARED_LIBRARY)

    include $(CLEAR_VARS)
    LOCAL_MODULE := libavfilter
    LOCAL_SRC_FILES := ffmpeg/$(TARGET_ARCH_ABI)/lib/$(LOCAL_MODULE).so
    LOCAL_EXPORT_C_INCLUDES := ffmpeg/$(TARGET_ARCH_ABI)/include
    include $(PREBUILT_SHARED_LIBRARY)

    include $(CLEAR_VARS)
    LOCAL_MODULE := libavformat
    LOCAL_SRC_FILES :

Categories : Android

Armv6 + ffmpeg + android = sigill (heap)
I found the solution. This problem only occurs on the emulator, so it may be a problem with the emulator itself. For those who have a similar problem: try your package on a real device. I tried the same package on a real ARMv6 device and it worked.

Categories : Android

RTSP streaming on Android client using FFMpeg
I was in a similar situation some time ago (I wanted to stream an mp3 from an RTMP server) and it was extremely frustrating. However, I managed to scrape together some code that actually did what it was supposed to. Some pointers: You don't want to expose ffmpeg's API to your Java code. Instead, consider creating helper functions like openRTSPStream(String url) and keep the ffmpeg stuff in your C/C++ code. I say this because ffmpeg makes heavy use of pointers and dynamic memory allocation that would make it a pain to try and use it from Java. The script you used to compile the library uses the flag --disable-everything which also means that it probably disables RTSP support. I'd recommend that you either remove that flag or run the configure script with --list-protocol, --list-demuxer, -

Categories : Android

Sevenzip extractFile(String file, Stream stream) method stream parameter. c#
Set up a FileStream that SevenZip can write out to:

    DirectoryInfo directoryInfo = new DirectoryInfo(ApplicationPath);
    FileInfo[] zipFile = directoryInfo.GetFiles("*.7z");
    using (SevenZip.SevenZipExtractor zipExtractor = new SevenZip.SevenZipExtractor(zipFile[0].FullName))
    {
        // replace the empty string with the desired destination
        using (FileStream fs = new FileStream("", FileMode.Create))
        {
            zipExtractor.ExtractFile("ConfigurationStore.xml", fs);
        }
    }

Categories : C#

Android FFMPEG command line for video filter
Your ffmpeg build is too old to support the curves video filter with presets. Presets were added on 2013-03-25 (lavfi/curves: add presets support) bumping libavfilter to version 3.48.103. You will need to update your ffmpeg.

Categories : Android

Encode Android AudioRecord raw pcm data to other format using ffmpeg
You can create a .wav file from your raw bytes and later convert that audio file together with your image directly into a video. 1 image + 1 audio file = 1 video.
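The first step above, turning raw PCM bytes into a .wav file, is mostly a matter of prepending a 44-byte RIFF/WAVE header. This is a minimal sketch (assuming uncompressed 16-bit PCM such as AudioRecord produces; the class name is mine):

```java
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.nio.ByteBuffer;
import java.nio.ByteOrder;

// Sketch: wrap raw PCM in a minimal 44-byte RIFF/WAVE header so that
// tools like ffmpeg will accept it as a .wav input.
public class WavWriter {
    public static byte[] pcmToWav(byte[] pcm, int sampleRate,
                                  int channels, int bitsPerSample) throws IOException {
        int byteRate = sampleRate * channels * bitsPerSample / 8;
        int blockAlign = channels * bitsPerSample / 8;
        ByteBuffer h = ByteBuffer.allocate(44).order(ByteOrder.LITTLE_ENDIAN);
        h.put("RIFF".getBytes());
        h.putInt(36 + pcm.length);      // overall RIFF chunk size
        h.put("WAVE".getBytes());
        h.put("fmt ".getBytes());
        h.putInt(16);                   // fmt sub-chunk size
        h.putShort((short) 1);          // audio format 1 = uncompressed PCM
        h.putShort((short) channels);
        h.putInt(sampleRate);
        h.putInt(byteRate);
        h.putShort((short) blockAlign);
        h.putShort((short) bitsPerSample);
        h.put("data".getBytes());
        h.putInt(pcm.length);           // data sub-chunk size
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        out.write(h.array());
        out.write(pcm);
        return out.toByteArray();
    }
}
```

Write the returned bytes to a file and the image + audio → video step can then be done with a single ffmpeg invocation.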

Categories : Android

android sound overlap command line ffmpeg not working
Check out these resources for audio/video tasks with FFmpeg on Android. Tutorial on compiling FFmpeg for Android: http://www.roman10.net/how-to-build-ffmpeg-with-ndk-r9/ FFmpeg API examples: http://ffmpeg.org/doxygen/trunk/examples.html Also, if you search GitHub for FFmpeg and Android you'll find a lot of resources. Hope this helps, cheers.

Categories : Android

How to execute command line ffmpeg commands programatically in android?
Recently I came across a similar problem. My solution was to simulate a command line in the Java program. First, I added a function to the file "ffmpeg.c":

    int cmd_simulation(int argc, const char** argv)
    {
        OptionsContext o = { 0 };
        // int64_t ti;
        reset_options(&o, 0);
        av_log_set_flags(AV_LOG_SKIP_REPEATED);
        parse_loglevel(argc, argv, options);
        if (argc > 1 && !strcmp(argv[1], "-d")) {
            run_as_daemon = 1;
            av_log_set_callback(log_callback_null);
            argc--;
            argv++;
        }
        avcodec_register_all();
        avfilter_register_all();
        av_register_all();
        avformat_network_init();
        //show_banner(argc, argv, options);
        term_init();
        parse_cpuflags(argc, argv, options);
        /* parse options */
        parse_options(&o, argc, argv, options, opt_output_file);
        if (nb_output_files <= 0 && nb
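On the Java side of this approach, the command string has to be split into an argv array before it is handed to the native entry point over JNI. A minimal sketch of such a splitter (the class name is mine; it keeps quoted arguments such as filter graphs together):

```java
import java.util.ArrayList;
import java.util.List;

// Splits one ffmpeg command string into an argv array suitable for
// passing to a native cmd_simulation(argc, argv)-style function via JNI.
public class ArgvSplitter {
    public static String[] split(String command) {
        List<String> argv = new ArrayList<>();
        StringBuilder cur = new StringBuilder();
        boolean inQuote = false;
        for (char c : command.toCharArray()) {
            if (c == '"') {
                inQuote = !inQuote;  // toggle quoting, drop the quote itself
            } else if (Character.isWhitespace(c) && !inQuote) {
                if (cur.length() > 0) {
                    argv.add(cur.toString());
                    cur.setLength(0);
                }
            } else {
                cur.append(c);
            }
        }
        if (cur.length() > 0) {
            argv.add(cur.toString());
        }
        return argv.toArray(new String[0]);
    }
}
```

For example, `ArgvSplitter.split("ffmpeg -i in.mp4 -vf \"scale=320:240\" out.mp4")` yields six tokens, with the quoted filter kept as one argument.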

Categories : Android

Encode x264 video with ffmpeg for Android with starting offset
Upgrading to ffmpeg 1.2.1 fixed the compatibility issue. After that, my Android phone was able to play the videos just fine. The -ss option is trickier than it first looks: it has a different meaning depending on whether it comes before or after -i. It turns out that to make it work, you have to use both. You put the main offset before -i, which makes ffmpeg skip to that point in the stream. But then you also need a small non-zero offset AFTER -i to make it seek to that point within the stream, so audio and video will be in sync. For reference, the final working command is:

    ffmpeg -ss 00:03:52.00 -i in.mp4 -ss 0.1 -t 01:28:33.00 -c:v libx264 -preset medium -crf 20 -maxrate 400k -bufsize 1835k -c:a libvorbis -sn out.mkv
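If you build such commands programmatically (for example before passing them to an ffmpeg wrapper on Android), the two-stage seek above can be expressed as a small argument-list builder. This is only a sketch with invented names; the essential point is the ordering of the two -ss flags around -i:

```java
import java.util.Arrays;
import java.util.List;

// Builds an ffmpeg argument list using the two-stage seek described above:
// a coarse -ss before -i (fast keyframe-level seek) plus a small -ss after
// -i (accurate decode-and-discard) to keep audio and video in sync.
public class SeekArgs {
    public static List<String> build(String input, String coarse,
                                     String fine, String output) {
        return Arrays.asList(
                "ffmpeg",
                "-ss", coarse,  // before -i: jump near the target position
                "-i", input,
                "-ss", fine,    // after -i: precise offset within the stream
                output);
    }

    public static void main(String[] args) {
        System.out.println(build("in.mp4", "00:03:52.00", "0.1", "out.mkv"));
    }
}
```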

Categories : Android

Is android ffmpeg library able to play video located in the assets folder
Assets are read-only parts of your APK, and there is no way another APK can access these assets. You can use the native Android classes to play the video(s) you embed in your APK (use mp4 format). If you insist on ffmpeg-assisted playback, you can unpack the asset to a private file, or you can write your own I/O handlers for ffmpeg that use AAsset_read().

Categories : Android

Set activity audio stream to android home audio stream
If you mean which volume it is that gets adjusted when you press the volume keys while at the home screen, then it's most likely the ring volume (i.e. STREAM_RING).

Categories : Java

ffmpeg commands to concatenate different type and resolution videos into 1 video and can be played in android
You can use concat to append all the videos one by one after converting them to a single format. You can use the command below to convert differently formatted videos to one format:

    ./ffmpeg -i 1.mp4 -acodec libvo_aacenc -vcodec libx264 -s 1920x1080 -r 60 -strict experimental 1.mp4

Convert everything to mp4 and then follow the instructions given in the link above. This will enable you to join all the videos into a single file.
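The concat demuxer referred to above reads a plain text file listing one input per line. A small sketch for generating that file's content once all inputs share one format (the class name is mine):

```java
import java.util.List;

// Builds the content of an ffmpeg concat-demuxer list file: one
// "file '<name>'" line per input, in playback order.
public class ConcatList {
    public static String build(List<String> files) {
        StringBuilder sb = new StringBuilder();
        for (String f : files) {
            // Single quotes follow the concat demuxer's file-list syntax.
            sb.append("file '").append(f).append("'\n");
        }
        return sb.toString();
    }
}
```

Save the result as, say, list.txt and join the videos without re-encoding: `ffmpeg -f concat -i list.txt -c copy out.mp4`.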

Categories : Android

Android - MediaPlayer's on Prepare Called even before the stream is prepared on Android 4.0+
I don't have much experience with MediaPlayer, but here are a couple of suggestions/queries from my side: There is no call to prepare(); if you are calling it, did you try prepareAsync()? Also, you are not using the MediaPlayer instance that is passed to the onPrepared callback, so there is a chance that you are trying to start the wrong MediaPlayer.

Categories : Java

Android Stream Video from url
Have you tried

    Intent intent = new Intent(Intent.ACTION_VIEW);

instead of

    Intent intent = new Intent(Intent.ACTION_MAIN);

I think this can help you: Play youtube video from URL in android

Categories : Android

Best way to stream Ogg Vorbis on Android
My suggestion would be to use the NDK to get a vorbis decoder running natively (perhaps Tremor), and then utilize that MUCH FASTER native decoder from your app instead of using the much slower jogg/jorbis java decoder. (It will probably be at least an order of magnitude faster than the java decoder) You can then use the AudioTrack class to take the native decoded PCM and write it out to the audio device. I would be surprised if someone hasn't already done this, so google around first!
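One detail of the decode-natively-then-play approach above: AudioTrack's byte[] write path expects 16-bit little-endian PCM, so the short[] samples coming back from a native decoder must be flattened to bytes. A small sketch (class name invented for the example):

```java
// Flattens 16-bit PCM samples (short[]) into the little-endian byte[]
// layout that AudioTrack.write(byte[], ...) expects.
public class PcmBytes {
    public static byte[] shortsToLittleEndian(short[] samples) {
        byte[] out = new byte[samples.length * 2];
        for (int i = 0; i < samples.length; i++) {
            out[2 * i]     = (byte) (samples[i] & 0xff);        // low byte first
            out[2 * i + 1] = (byte) ((samples[i] >> 8) & 0xff); // then high byte
        }
        return out;
    }
}
```

On Android you could also use AudioTrack's short[] write overload directly and skip the conversion; the byte[] form is shown because native decoders often hand back raw buffers.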

Categories : Android

Gstreamer in Android. UDP stream
Well, I've been able to resolve the issue. The caps should be like this:

    udpsrc port=5000 caps="application/x-rtp, media=video, clock-rate=90000, encoding-name=H264, sprop-parameter-sets=\"J2QAFKwrQLj/LwDxImo\\=\\,KO4fLA\\=\\=\"", payload=96" ! ...

The video is shown with almost no delay.

Categories : Android

How to get video frames from mp4 video using ffmpeg in android?
Compile ffmpeg.c and invoke its main() via JNI. For more details see how to use FFmpeg on Android. Also refer to this GitHub project, and this tutorial will help you too.

Categories : Android

Android MP4 stream - Video cannot be played
Try this; it played the video for me:

    <LinearLayout xmlns:android="http://schemas.android.com/apk/res/android"
        android:layout_width="match_parent"
        android:layout_height="match_parent"
        android:orientation="vertical" >

        <VideoView
            android:id="@+id/videoViewa"
            android:layout_width="match_parent"
            android:layout_gravity="center"
            android:layout_height="match_parent" />

    </LinearLayout>

    VideoView videoView;
    videoView = (VideoView) findViewById(R.id.videoViewa);
    MediaController mediaController = new MediaController(this);
    mediaController.setAnchorView(videoView);
    // URI either from net
    Uri video = Uri.parse("http://www.fieldandrurallife.tv/videos/Benltey%20Mulsanne.mp4");
    videoView.setMediaController(mediaController);
    vid

Categories : Android

Play Stream in Android from link
m3u8 is not supported by Android by default. If you want to do live streaming, you can either use the HLS format (which is only supported in newer versions, with no backward support) or use Vitamio, which supports live streaming via m3u8 and a couple of other formats as well: http://www.vitamio.org/en/

Categories : Android

Live stream extensions in android
Check this answer: Video Streaming over wifi? If you want to watch the live stream on an Android phone, include the VLC plugin in your application and connect through the Real Time Streaming Protocol (RTSP):

    Intent i = new Intent("org.videolan.vlc.VLCApplication.gui.video.VideoPlayerActivity");
    i.setAction(Intent.ACTION_VIEW);
    i.setData(Uri.parse("rtsp://10.0.0.179:8086/"));
    startActivity(i);

If you have VLC installed on your Android phone, you can stream using this intent, passing the IP address and port number as shown.

Categories : Android

How to stream video across LAN in android VideoView
I've been able to play smb:// shares over the network in a VideoView by: Using JCIFS to scan for and "see" the share: http://jcifs.samba.org/ Implementing a simple HTTP server (NanoHttpd) to stream the content via http: https://github.com/NanoHttpd/nanohttpd Passing the http://localhost/myvideo link to the VideoView I realise this seems convoluted (and I agree) but it's the only way I've managed to get it working (and working well, with seeking, etc.). I'd be interested if there are better solutions.
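The "local HTTP bridge" idea above can be illustrated with the desktop JDK's built-in HTTP server (com.sun.net.httpserver is not available on Android itself, where NanoHttpd fills this role; this is a sketch of the concept only). Bytes that would come from the JCIFS-backed SMB stream are stubbed out as an in-memory array here:

```java
import com.sun.net.httpserver.HttpServer;
import java.io.OutputStream;
import java.net.InetSocketAddress;

// Sketch: serve video bytes over a localhost HTTP endpoint so a player
// that only speaks http:// (like VideoView) can consume them.
public class LocalBridge {
    public static HttpServer start(int port, byte[] videoBytes) throws Exception {
        HttpServer server = HttpServer.create(new InetSocketAddress(port), 0);
        server.createContext("/myvideo", exchange -> {
            exchange.getResponseHeaders().set("Content-Type", "video/mp4");
            exchange.sendResponseHeaders(200, videoBytes.length);
            try (OutputStream os = exchange.getResponseBody()) {
                // Real code would stream from the JCIFS SmbFile input here,
                // ideally honoring Range requests so seeking works.
                os.write(videoBytes);
            }
        });
        server.start();
        return server;
    }
}
```

The player is then pointed at http://localhost:<port>/myvideo, exactly as described above. Supporting HTTP Range requests is what makes seeking work well in this setup.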

Categories : Android

Android: decode bitmap from stream
Use the static method below to get a bitmap from external storage; make sure you pass the filePath correctly:

    public static Bitmap decodeSampleBitmapFromFile(String filePath, int reqWidth, int reqHeight) {
        // First decode with inJustDecodeBounds=true to check dimensions
        final BitmapFactory.Options options = new BitmapFactory.Options();
        options.inJustDecodeBounds = true;
        BitmapFactory.decodeFile(filePath, options);

        // Calculate inSampleSize
        options.inSampleSize = calculateInSampleSize(options, reqWidth, reqHeight);

        // Decode bitmap with inSampleSize set
        options.inJustDecodeBounds = false;
        return BitmapFactory.decodeFile(filePath, options);
    }
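The snippet above relies on a calculateInSampleSize helper that is not shown. The version below mirrors the approach in the Android developer documentation, but takes raw width/height ints instead of BitmapFactory.Options so it can run off-device; it picks the largest power-of-two sample size whose downscaled dimensions still cover the requested size:

```java
// Helper for the decode snippet above: choose the largest power-of-two
// inSampleSize such that (rawWidth/inSampleSize, rawHeight/inSampleSize)
// is still at least the requested size.
public class SampleSize {
    public static int calculateInSampleSize(int rawWidth, int rawHeight,
                                            int reqWidth, int reqHeight) {
        int inSampleSize = 1;
        if (rawHeight > reqHeight || rawWidth > reqWidth) {
            int halfHeight = rawHeight / 2;
            int halfWidth = rawWidth / 2;
            // Double inSampleSize while both halved dimensions still
            // meet or exceed the requested dimensions.
            while ((halfHeight / inSampleSize) >= reqHeight
                    && (halfWidth / inSampleSize) >= reqWidth) {
                inSampleSize *= 2;
            }
        }
        return inSampleSize;
    }
}
```

For example, decoding a 2048x1536 image for a 512x384 target yields an inSampleSize of 4, so the decoder loads a 512x384 bitmap and uses a fraction of the memory.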

Categories : Android

Creating video from stream bitmap in android
I'm a little late here, but yes, there is a library for this. You can find it here: OpenCV for Android. To integrate it with your Android project, follow the instructions under "Quick Start for OpenCV and FFmpeg". The best part about this library is that all of the JNI wrapping is done for you. Although documentation is lacking, there are a few links to projects similar to yours and how they accomplished it using this library, which you can find here. Hope this helps.

Categories : Android

Read input stream onMainThreadException Android
I think the solution for android.os.NetworkOnMainThreadException is to add:

    StrictMode.ThreadPolicy policy = new StrictMode.ThreadPolicy.Builder().permitAll().build();
    StrictMode.setThreadPolicy(policy);

in your class, and add this to the manifest file:

    <uses-permission android:name="android.permission.INTERNET"/>

Categories : Android

Android/Java, How to connect and get stream from IPCamera
Try MJpegView: https://github.com/michogar/MjpegView. I tried it and it works fine for video. Now I'm searching for a solution for the asf stream.

Categories : Java



© Copyright 2017 w3hello.com Publishing Limited. All rights reserved.