
android-gpuimage-plus's Introduction

Android-GPUImage-Plus

A C++ & Java library for image/camera/video filters. PRs are welcome.

New Feature

See the image deform demo.


Gradle dependency

allprojects {
    repositories {
        maven {
            // Use github hosted maven repo for now.
            // Will be uploaded to maven central later.
            url 'https://maven.wysaid.org/'
        }
    }
}

//Choose only one of them
dependencies {
    //All arch: armeabi-v7a, arm64-v8a, x86, x86_64 with video module (ffmpeg bundled)
    implementation 'org.wysaid:gpuimage-plus:3.0.0'

    //All arch: armeabi-v7a, arm64-v8a, x86, x86_64 without video module (no ffmpeg)
    implementation 'org.wysaid:gpuimage-plus:3.0.0-min'
}

The jcenter package is out of date; please build from source for now. The latest prebuilt versions will be provided soon.

To compile other versions of ffmpeg, see: https://github.com/wysaid/FFmpeg-Android.git

Build

  • Options to know in local.properties (see the sample below):

    • usingCMakeCompile=true: Compile the native library with CMake. (Defaults to false, so the prebuilt libs are used)
    • usingCMakeCompileDebug=true: Compile the native library in debug mode. (Defaults to false)
    • disableVideoModule=true: Disable the video recording feature (useful for image-only scenarios); the whole JNI module becomes much smaller. (Defaults to false)
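
    A minimal sample local.properties (a sketch only; sdk.dir is the standard entry Android Studio writes, the other keys are the project-specific options listed above):

    # local.properties
    sdk.dir=/path/to/android/sdk
    usingCMakeCompile=true
    usingCMakeCompileDebug=true
    disableVideoModule=true
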
  • Build with Android Studio and CMake: (Recommended)

    • Put usingCMakeCompile=true in your local.properties
    • Open the repo with the latest version of Android Studio
    • Wait for the initialization to finish (NDK/CMake install).
    • Done.
  • Using Visual Studio Code: (On Windows, requires WSL (recommended), MinGW, or Cygwin.)

    • Set the environment variable ANDROID_HOME to your Android SDK installation directory.
    • Open the repo with Visual Studio Code
    • Press ⌘ + Shift + B (Mac) or Ctrl + Shift + B (Win/Linux) and choose the option Enable CMake And Build Project With CMake.
    • Done.
  • Build with preset tasks: (On Windows, requires WSL (recommended), MinGW, or Cygwin.)

    # Define the environment variable "ANDROID_HOME".
    # On Windows, define ANDROID_HOME in the system environment settings instead.
    export ANDROID_HOME=/path/to/android/sdk
    
    # Setup Project
    bash tasks.sh --setup-project
    
    # Compile with CMake Debug
    bash tasks.sh --debug --enable-cmake --build
    # Compile with CMake Release
    bash tasks.sh --release --enable-cmake --build
    
    # Start Demo By Command
    bash tasks.sh --run
  • Build JNI part with ndk-build: (Not recommended)

    export NDK=path/of/your/ndk
    cd android-gpuimage-plus/library/src/main/jni
    
    #This will build all archs: armeabi, armeabi-v7a, arm64-v8a, x86, mips
    ./buildJNI
    #Or use "sh buildJNI"
    
    #Try this if you failed to run the shell above
    export CGE_USE_VIDEO_MODULE=1
    $NDK/ndk-build
    
    #If you don't want anything except the image filter,
    #build only the cge module (no ffmpeg, opencv or faceTracker) as below:
    #leave CGE_USE_VIDEO_MODULE unset (or unset it first),
    #and remove the loading part of ffmpeg & faceTracker (see below).
    $NDK/ndk-build
    
    #For Windows users, append the `.cmd` extension to `ndk-build` like this:
    cd <your\path\to\this\repo>\library\src\main\jni
    <your\path\to\ndk>\ndk-build.cmd
    
    #Also remember to comment out these lines in NativeLibraryLoader:
    //System.loadLibrary("ffmpeg");
    //CGEFFmpegNativeLibrary.avRegisterAll();

You can find precompiled libs here: android-gpuimage-plus-libs (The precompiled '.so' files are generated with NDK-r23b)

Note that the generated file "libFaceTracker.so" is optional, so just remove it if you don't need the face tracking feature.

Manual

1. Usage

Sample code for applying a filter to a Bitmap:

//Simply apply a filter to a Bitmap.
@Override
protected void onCreate(Bundle savedInstanceState) {
    super.onCreate(savedInstanceState);
    setContentView(R.layout.activity_main);

    Bitmap srcImage = ...;

    //HSL Adjust (hue: 0.02, saturation: -0.31, luminance: -0.17)
    //Please see the manual for more details.
    String ruleString = "@adjust hsl 0.02 -0.31 -0.17";

    Bitmap dstImage = CGENativeLibrary.filterImage_MultipleEffects(srcImage, ruleString, 1.0f);

    //Then the dstImage is applied with the filter.

    //Save the result image to /sdcard/libCGE/rec_???.jpg.
    ImageUtil.saveBitmap(dstImage);
}
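
Some filter rules reference extra image textures (for example @adjust lut late_sunset.png). A minimal sketch, assuming the asset-loading callback used by the demo (CGENativeLibrary.setLoadImageCallback / LoadImageCallback; verify the exact interface in org.wysaid.nativePort.CGENativeLibrary) and that this code runs inside an Activity or other Context:

//Sketch: provide a callback so native code can resolve texture names like "late_sunset.png".
CGENativeLibrary.setLoadImageCallback(new CGENativeLibrary.LoadImageCallback() {
    @Override
    public Bitmap loadImage(String name, Object arg) {
        //Called from native code when a filter rule needs a texture.
        try {
            return BitmapFactory.decodeStream(getAssets().open(name));
        } catch (IOException e) {
            return null;
        }
    }

    @Override
    public void loadImageOK(Bitmap bmp, Object arg) {
        //Called once the native side has finished with the bitmap.
        bmp.recycle();
    }
}, null);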

2. Custom Shader Filter

2.1 Write your own filter

Your filter must inherit CGEImageFilterInterfaceAbstract or one of its child classes. Most filters inherit from CGEImageFilterInterface because it provides many useful helpers.

// A simple customized filter to do a color reversal.
class MyCustomFilter : public CGE::CGEImageFilterInterface
{
public:
    
    bool init()
    {
        CGEConstString fragmentShaderString = CGE_SHADER_STRING_PRECISION_H
        (
        varying vec2 textureCoordinate;  //defined in 'g_vshDefaultWithoutTexCoord'
        uniform sampler2D inputImageTexture; // same as above.

        void main()
        {
            vec4 src = texture2D(inputImageTexture, textureCoordinate);
            src.rgb = 1.0 - src.rgb;  //Simply reverse all channels.
            gl_FragColor = src;
        }
        );

        //m_program is defined in 'CGEImageFilterInterface'
        return m_program.initWithShaderStrings(g_vshDefaultWithoutTexCoord, fragmentShaderString);
    }

    //void render2Texture(CGE::CGEImageHandlerInterface* handler, GLuint srcTexture, GLuint vertexBufferID)
    //{
    //  //Put your own render logic here.
    //  //If you don't override this function, the default from 'CGEImageFilterInterface' is used.
    //}
};

Note: to add your own shader filter in C++, please see the demo for further details.

2.2 Run your own filter

In C++, you can use a CGEImageHandler to do that:

//Assume the gl context already exists:
//JNIEnv* env = ...;
//jobject bitmap = ...;
CGEImageHandlerAndroid handler;
CustomFilterType* customFilter = new CustomFilterType();

//You should check the return value (false is returned on failure).
customFilter->init();
handler.initWithBitmap(env, bitmap);

//The customFilter will be released when the handler's destructor is called,
//so you don't have to call 'delete customFilter' after adding it to the handler.
handler.addImageFilter(customFilter);

handler.processingFilters(); //Run the filters.

jobject resultBitmap = handler.getResultBitmap(env);

If no gl context exists, the class CGESharedGLContext may be helpful.

In Java, you can simply follow the sample:

See: CGENativeLibrary.cgeFilterImageWithCustomFilter

Or use a CGEImageHandler.

3. Filter Rule String

Doc: https://github.com/wysaid/android-gpuimage-plus/wiki

En: https://github.com/wysaid/android-gpuimage-plus/wiki/Parsing-String-Rule-(EN)

Ch: https://github.com/wysaid/android-gpuimage-plus/wiki/Parsing-String-Rule-(ZH)
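
As a rough example (a sketch only; see the wiki above for the exact chaining syntax, and note that the lut image is resolved through a texture-loading callback such as the one shown earlier), several steps can be combined into one rule string and applied in a single call:

//Sketch: chain a LUT filter and the HSL adjustment from the usage sample.
String ruleString = "@adjust lut late_sunset.png @adjust hsl 0.02 -0.31 -0.17";
Bitmap dstImage = CGENativeLibrary.filterImage_MultipleEffects(srcImage, ruleString, 1.0f);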

Tool

Some utils are available for creating filters: https://github.com/wysaid/cge-tools


License

MIT License

Donate

Alipay:


Paypal:


android-gpuimage-plus's People

Contributors

cassgenerator, eyu0415, jamesxnelson, jw20082009, mafanwei, niekakerboom, vxhviet, wysaid


android-gpuimage-plus's Issues

A problem with the javacv library

The latest javacv has moved to GitHub, but the .so libraries I built from the GitHub source do not include the ffmpegInvoke and neno .so files; your library code seems to be based on the googlecode version.
After building the GitHub version, the app crashes during recording. How did you build your libraries?

Hello, a few questions about cgeVideoUtils.cpp

In cgeGenerateVideoWithFilter ("A simple-slow offscreen video rendering function"), I want to create my own shader and draw an image, but it never works.

Is there a related demo for this offscreen rendering?

// Setup code
float texCoor[] = {
    0.0f, 1.0f,
    1.0f, 1.0f,
    1.0f, 0.0f,
    0.0f, 0.0f
};
float tableVerticesWithTriangles[] = {
    -0.8f,  0.3f, 0,
     0.8f,  0.3f, 0,
     0.8f, -0.3f, 0,
    -0.8f, -0.3f, 0
};
const char vertex_shader[] =
    "attribute vec4 a_Position;\n"
    "attribute vec2 u_Texture;\n"
    "varying vec2 vTextureCoord;\n"
    "void main() {\n"
    "    gl_Position = a_Position;\n"
    "    vTextureCoord = u_Texture;\n"
    "}\n";

const char fragment_shader[] =
    "precision mediump float;\n"
    "varying vec2 vTextureCoord;\n"
    "uniform sampler2D sTexture;\n"
    "void main() {\n"
    "    gl_FragColor = texture2D(sTexture, vTextureCoord);\n"
    "}\n";

GLuint vertexShader = glCreateShader(GL_VERTEX_SHADER);
glShaderSource(vertexShader, 1, (const GLchar**)&vertex_shader, NULL);
glCompileShader(vertexShader);

GLuint fragmentShader = glCreateShader(GL_FRAGMENT_SHADER);
glShaderSource(fragmentShader, 1, (const GLchar**)&fragment_shader, NULL);
glCompileShader(fragmentShader);

GLuint m_programID = glCreateProgram();
glAttachShader(m_programID, vertexShader);
glAttachShader(m_programID, fragmentShader);
glLinkProgram(m_programID);

GLuint aPositionLocation = glGetAttribLocation(m_programID, "a_Position");
GLuint uTextureLocation = glGetAttribLocation(m_programID, "u_Texture");

Rendering code:

// Test: render the image
glUseProgram(m_programID);
glVertexAttribPointer(aPositionLocation, 3, GL_FLOAT, GL_FALSE, 0, tableVerticesWithTriangles);
glEnableVertexAttribArray(aPositionLocation);

glVertexAttribPointer(uTextureLocation, 2, GL_FLOAT, GL_FALSE, 0, texCoor);
glEnableVertexAttribArray(uTextureLocation);

glActiveTexture(GL_TEXTURE0);
glBindTexture(GL_TEXTURE_2D, texID);
glDrawArrays(GL_TRIANGLE_STRIP, 0, 4);
// End of the test rendering

Xiaomi device fails to load x264.142

couldn't find "libx264.142.so"
at java.lang.Runtime.loadLibrary(Runtime.java:366)
at java.lang.System.loadLibrary(System.java:988)
at org.wysaid.nativePort.NativeLibraryLoader.load(NativeLibraryLoader.java:15)
at org.wysaid.nativePort.CGEFrameRenderer.(CGEFrameRenderer.java:14)
at org.wysaid.view.VideoPlayerGLSurfaceView$9$1.run(VideoPlayerGLSurfaceView.java:487)
at android.opengl.GLSurfaceView$GLThread.guardedRun(GLSurfaceView.java:1462)
at android.opengl.GLSurfaceView$GLThread.run(GLSurfaceView.java:1239)
Please help, any reply is appreciated!

Hello!

Adding filters while recording video would be very helpful for me, but your current source does not include that part of the code. Could you share it?

Invalid Filter Config

Hello,
I'm taking a look at your library and have a problem with filters that use an image texture.
For example, when I use the filter that uses late_sunset.png, the log says:
Invalid Filter Config @adjust lut late_sunset.png
I did copy late_sunset.png into the project, and everything else works fine.
Could you point out what I missed?
Thanks so much. :)

Referencing the library module succeeds, but using org.wysaid.view.CameraRecordGLSurfaceView in XML crashes

IDE: Android Studio
Adding the android-gpuimage-plus library as a dependency works fine, but when the layout XML uses org.wysaid.view.CameraRecordGLSurfaceView from the library, the app crashes at runtime as follows:

E/AndroidRuntime: FATAL EXCEPTION: GLThread 1340
Process: com.rocky.TestApp, PID: 26679
java.lang.UnsatisfiedLinkError: dalvik.system.PathClassLoader[DexPathList[[zip file "/data/app/com.rocky.TestApp-1/base.apk"],nativeLibraryDirectories=[/data/app/com.rocky.TestApp-1/lib/arm64, /data/app/com.rocky.TestApp-1/base.apk!/lib/arm64-v8a, /vendor/lib64, /system/lib64]]] couldn't find "libx264.142.so"
at java.lang.Runtime.loadLibrary(Runtime.java:367)
at java.lang.System.loadLibrary(System.java:1076)
at org.wysaid.nativePort.NativeLibraryLoader.load(NativeLibraryLoader.java:15)
at org.wysaid.nativePort.CGEFrameRenderer.(CGEFrameRenderer.java:11)
at org.wysaid.view.CameraGLSurfaceView.onSurfaceCreated(CameraGLSurfaceView.java:413)
at android.opengl.GLSurfaceView$GLThread.guardedRun(GLSurfaceView.java:1503)
at android.opengl.GLSurfaceView$GLThread.run(GLSurfaceView.java:1240)

The crashing line, NativeLibraryLoader.java:15:
System.loadLibrary("x264.142");

Later I tried changing the application package name in this library's demo and got the same crash; changing the package name back makes it run normally.

Could you explain why?

Efficiency problem with glReadPixels

Hi, I want to draw a sticker image onto every filtered frame while recording (adding it during recording), but the project does not currently provide the native implementation of the Recorder (looking forward to it), so I have to read each frame back in onDrawFrame and process it. The main processing is:

GLES20.glViewport(0, 0, mRecordWidth, mRecordHeight);
mFrameRecorder.drawCache();
mFrameBuffer.position(0);
GLES20.glReadPixels(0, 0, mRecordWidth, mRecordHeight, GLES20.GL_RGBA, GLES20.GL_UNSIGNED_BYTE, mFrameBuffer);
mFrameBmp.copyPixelsFromBuffer(mFrameBuffer);

The width and height here are 480*480. The frame read back here already has your native filter applied; after getting the bitmap I draw the sticker image onto mFrameBmp with a Canvas and then encode it (I did not rewrite the ffmpeg encode; I used javacv).
This logic basically works, but these few lines slow down the camera preview: before the change the preview ran at 30 fps, and afterwards it drops to about 22 fps. After several tests, the delay mainly comes from GLES20.glReadPixels.

Any suggestions?

How do I merge audio and video?

Thanks first of all; I've been using your library to build a video recording app.
At the moment I don't know how to merge an audio file into the recorded video, and I couldn't find a related API.
Is there an API for this, or another library I could refer to?
Any advice would be appreciated.

Hello, a few questions about cgeVideoUtils


1. I want to add my own code after handler.processingFilters(), for example drawing an image logo. Will it end up in the data read by glReadPixels?
2. When I add the following code, it fails to compile with:
error:
no matching function for call to 'glShaderSource'
glShaderSource(fragmentShader,1,&fragment_shader,NULL);

But the same code compiles fine in a standalone project.

ffmpeg library conflict

We currently depend on a third-party library that also ships a libffmpeg.so, which conflicts with yours. Could you rename libffmpeg.so and rebuild libCGE.so? Thanks a lot.

Is there a more efficient method than readPixels?

Hi, I'm building a filter effect with GPUImage, but my project requires reading the filtered frame data back from the native layer for some software encoding. An earlier issue also mentioned the efficiency of readPixels, but that user could work around the call; in my case I must get the pixel data in the Android app layer, so there is no way around readPixels. I've seen suggestions online such as using a PBO to improve efficiency. Do you have any workable and effective suggestions?
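
For reference, a minimal sketch of the PBO idea mentioned above (plain GLES 3.0 calls, not an API of this library; width and height stand in for your own record size): binding a pixel pack buffer lets glReadPixels return without stalling, and the buffer is mapped on the CPU a frame or two later.

//Sketch: asynchronous readback with a Pixel Buffer Object (requires a GLES 3.0 context).
//import android.opengl.GLES30;
int[] pbo = new int[1];
GLES30.glGenBuffers(1, pbo, 0);
GLES30.glBindBuffer(GLES30.GL_PIXEL_PACK_BUFFER, pbo[0]);
GLES30.glBufferData(GLES30.GL_PIXEL_PACK_BUFFER, width * height * 4, null, GLES30.GL_STREAM_READ);

//With the PBO bound, glReadPixels writes into the buffer instead of client memory.
GLES30.glReadPixels(0, 0, width, height, GLES30.GL_RGBA, GLES30.GL_UNSIGNED_BYTE, 0);

//Later (ideally after one or two more frames), map the buffer and copy the pixels.
java.nio.Buffer mapped = GLES30.glMapBufferRange(GLES30.GL_PIXEL_PACK_BUFFER, 0,
        width * height * 4, GLES30.GL_MAP_READ_BIT);
//... copy from 'mapped' into a Bitmap or byte[] ...
GLES30.glUnmapBuffer(GLES30.GL_PIXEL_PACK_BUFFER);
GLES30.glBindBuffer(GLES30.GL_PIXEL_PACK_BUFFER, 0);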

What do the parameters passed to cgeGenerateVideoWithFilter mean, for example mute?

cgeGenerateVideoWithFilter(const char* outputFilename, const char* inputFilename, const char* filterConfig, float filterIntensity, GLuint texID, CGETextureBlendMode blendMode, float blendIntensity, bool mute, CGETexLoadArg* loadArg)

What do the following parameters represent?

filterConfig
filterIntensity
blendMode
blendIntensity
mute
CGETexLoadArg

Taking a video snapshot shows a black screen on the Huawei Mate7

Hello!
I'm using your code for video recording on Android devices. On a Huawei Mate7 I can play videos and apply filters to them, but I cannot take a screenshot with the takeshot method of VideoPlayerGLSurfaceView; on other Android devices there is no problem. While debugging I found that the IntBuffer is all zeros after GLES20.glReadPixels. The problem can also be reproduced with the release APK of your demo. Could you check whether this is a bug?

How do I use @special? Could you give an example?

The explanation of the @special parameters says: the format is "@special N", where the parameter N is the number of the effect. This category handles all effects that are not generic; such effects are implemented by writing a new processor directly.

Does this number refer to GLES20.glCreateProgram()?
I tried that, but it doesn't work.
specialParser - unresolved index: 12  The effect command "@special 12" cannot generate any effect!

Looking forward to your reply, thanks!

Hello, could you provide the JNI code?

Hello, could you provide the JNI code?
Implementing camera filters with android-gpuimage seems to be very slow, so I'd like to study a JNI-based filter implementation.

Video recording problem

Hello, video recording works in the demo APK I downloaded, but why is the video recording method disabled/missing in the source code? I couldn't find it.

java.lang.UnsatisfiedLinkError: dlopen failed: /data/app/com.bhtc.huajuan-1/lib/arm/libx264.142.so: has text relocations

java.lang.UnsatisfiedLinkError: dlopen failed: /data/app/com.bhtc.huajuan-1/lib/arm/libx264.142.so: has text relocations
at java.lang.Runtime.loadLibrary(Runtime.java:372)
at java.lang.System.loadLibrary(System.java:1076)
at org.wysaid.nativePort.NativeLibraryLoader.load(NativeLibraryLoader.java:15)
at org.wysaid.nativePort.CGEFrameRenderer.(CGEFrameRenderer.java:11)
at org.wysaid.view.CameraGLSurfaceView.onSurfaceCreated(CameraGLSurfaceView.java:413)
at android.opengl.GLSurfaceView$GLThread.guardedRun(GLSurfaceView.java:1549)
at android.opengl.GLSurfaceView$GLThread.run(GLSurfaceView.java:1286)

A small idea about the audio/video sync problem

I've been reading your source code recently and noticed that the frame rate in the project is an int, but on some phones the frame rate is dynamic, i.e. it has a fractional part, so precision is lost. I suggest trying a float for that type.
