
android-yolo-v2's People

Contributors

szaza, tehtea


android-yolo-v2's Issues

Problem using a new .pb file

I use a new .pb file and a label file, but when I print the BoundingBox it shows: BoundingBox{x=NaN, y=NaN, width=NaN, height=NaN, confidence=NaN, classes=[NaN, NaN, NaN, NaN, NaN, NaN, NaN, NaN, NaN]}
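NaN in every BoundingBox field can be narrowed down by checking whether NaN already appears in the raw output tensor before it is parsed; the small diagnostic below is only an illustrative sketch (class and method names are assumptions, not the repository's code):

import java.util.Arrays;

// Illustrative diagnostic: count NaN values in the raw network output before
// it is parsed into BoundingBox objects. If NaN already appears here, the
// problem lies in the frozen graph or the input preprocessing rather than in
// the box-parsing code.
public final class NanCheck {

    public static int countNaN(float[] rawOutput) {
        int nanCount = 0;
        for (float value : rawOutput) {
            if (Float.isNaN(value)) {
                nanCount++;
            }
        }
        return nanCount;
    }

    public static void main(String[] args) {
        float[] demo = {0.1f, Float.NaN, 0.3f};
        System.out.println(countNaN(demo) + " NaN value(s) in " + Arrays.toString(demo));
    }
}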

Unable to load a custom .pb (protobuf)

Your code works fine with your provided .pb file for 20 objects.
I replaced your .pb file with mine, keeping the same name, and replaced the label content with my 3 object names.
When I try to run the application with my .pb file (created from tiny YOLO VOC, i.e. YOLO v2) for 3 objects, I am unable to launch the application and get the error: Failed to load model.

Please suggest.

Regards,
Purohit

Logs from the app:

java.lang.RuntimeException: Failed to load model from 'file:///android_asset/tiny-yolo-voc-graph.pb'
at org.tensorflow.contrib.android.TensorFlowInferenceInterface.<init>(TensorFlowInferenceInterface.java:100)
at org.tensorflow.yolo.TensorFlowImageRecognizer.create(TensorFlowImageRecognizer.java:42)
at org.tensorflow.yolo.view.ClassifierActivity.onPreviewSizeChosen(ClassifierActivity.java:55)
at org.tensorflow.yolo.view.CameraActivity.lambda$setFragment$0$CameraActivity(CameraActivity.java:116)
at org.tensorflow.yolo.view.CameraActivity$$Lambda$0.onPreviewSizeChosen(Unknown Source)
at org.tensorflow.yolo.view.CameraConnectionFragment.setUpCameraOutputs(CameraConnectionFragment.java:291)
at org.tensorflow.yolo.view.CameraConnectionFragment.openCamera(CameraConnectionFragment.java:298)
at org.tensorflow.yolo.view.CameraConnectionFragment.access$000(CameraConnectionFragment.java:52)
at org.tensorflow.yolo.view.CameraConnectionFragment$1.onSurfaceTextureAvailable(CameraConnectionFragment.java:163)
at android.view.TextureView.getHardwareLayer(TextureView.java:368)
at android.view.View.updateDisplayListIfDirty(View.java:15191)
at android.view.View.draw(View.java:15987)
at android.view.ViewGroup.drawChild(ViewGroup.java:3612)
at android.view.ViewGroup.dispatchDraw(ViewGroup.java:3402)
at android.view.View.updateDisplayListIfDirty(View.java:15209)
at android.view.View.draw(View.java:15987)
at android.view.ViewGroup.drawChild(ViewGroup.java:3612)
at android.view.ViewGroup.dispatchDraw(ViewGroup.java:3402)
at android.view.View.draw(View.java:16220)
at android.view.View.updateDisplayListIfDirty(View.java:15214)
at android.view.View.draw(View.java:15987)
at android.view.ViewGroup.drawChild(ViewGroup.java:3612)
at android.view.ViewGroup.dispatchDraw(ViewGroup.java:3402)
at android.view.View.updateDisplayListIfDirty(View.java:15209)
at android.view.View.draw(View.java:15987)
at android.view.ViewGroup.drawChild(ViewGroup.java:3612)
at android.view.ViewGroup.dispatchDraw(ViewGroup.java:3402)
at android.view.View.updateDisplayListIfDirty(View.java:15209)
at android.view.View.draw(View.java:15987)
at android.view.ViewGroup.drawChild(ViewGroup.java:3612)
at android.view.ViewGroup.dispatchDraw(ViewGroup.java:3402)
at android.view.View.draw(View.java:16220)
at com.android.internal.policy.PhoneWindow$DecorView.draw(PhoneWindow.java:2690)
at android.view.View.updateDisplayListIfDirty(View.java:15214)
at android.view.ThreadedRenderer.updateViewTreeDisplayList(ThreadedRenderer.java:283)
at android.view.ThreadedRenderer.updateRootDisplayList(ThreadedRenderer.java:289)
at android.view.ThreadedRenderer.draw(ThreadedRenderer.java:324)
at android.view.ViewRootImpl.draw(ViewRootImpl.java:2651)
at android.view.ViewRootImpl.performDraw(ViewRootImpl.java:2470)
at android.view.ViewRootImpl.performTraversals(ViewRootImpl.java:2103)
at android.view.ViewRootImpl.doTraversal(ViewRootImpl.java:1139)
at android.view.ViewRootImpl$TraversalRunnable.run(ViewRootImpl.java:6064)
at android.view.Choreographer$CallbackRecord.run(Choreographer.java:860)
at android.view.Choreographer.doCallbacks(Choreographer.java:672)
at android.view.Choreographer.doFrame(Choreographer.java:608)
at android.view.Choreographer$FrameDisplayEventReceiver.run(Choreographer.java:846)
at android.os.Handler.handleCallback(Handler.java:742)
at android.os.Handler.dispatchMessage(Handler.java:95)
at android.os.Looper.loop(Looper.java:157)
at android.app.ActivityThread.main(ActivityThread.java:5571)
at java.lang.reflect.Method.invoke(Native Method)
at com.android.internal.os.ZygoteInit$MethodAndArgsCaller.run(ZygoteInit.java:745)
at com.android.internal.os.ZygoteInit.main(ZygoteInit.java:635)
Caused by: java.io.IOException: Not a valid TensorFlow Graph serialization: NodeDef mentions attr 'dilations' not in Op<name=Conv2D; signature=input:T, filter:T -> output:T; attr=T:type,allowed=[DT_HALF, DT_FLOAT]; attr=strides:list(int); attr=use_cudnn_on_gpu:bool,default=true; attr=padding:string,allowed=["SAME", "VALID"]; attr=data_format:string,default="NHWC",allowed=["NHWC", "NCHW"]>; NodeDef: 0-convolutional = Conv2D[T=DT_FLOAT, data_format="NHWC", dilations=[1, 1, 1, 1], padding="VALID", strides=[1, 1, 1, 1], use_cudnn_on_gpu=true](Pad, 0-convolutional/filter). (Check whether your GraphDef-interpreting binary is up to date with your GraphDef-generating binary.).
at org.tensorflow.contrib.android.TensorFlowInferenceInterface.loadGraph(TensorFlowInferenceInterface.java:392)
at org.tensorflow.contrib.android.TensorFlowInferenceInterface.<init>(TensorFlowInferenceInterface.java:96)
... 52 more

PB file is too huge

I placed both the .pb file (180,000+ KB) and labels.txt in my assets folder. However, when I run the project in Android Studio, it throws an error.

java.lang.StackOverflowError: stack size 8MB

03-14 17:43:09.973 4017-4030/org.tensorflow.yolo I/art: Background sticky concurrent mark sweep GC freed 0(0B) AllocSpace objects, 0(0B) LOS objects, 0% free, 227MB/227MB, paused 304.711ms total 453.143ms
03-14 17:43:09.986 4017-4017/org.tensorflow.yolo E/AndroidRuntime: FATAL EXCEPTION: main
Process: org.tensorflow.yolo, PID: 4017
java.lang.StackOverflowError: stack size 8MB
at org.tensorflow.yolo.YOLOClassifier.getInstance(YOLOClassifier.java:39)
at org.tensorflow.yolo.YOLOClassifier.getInstance(YOLOClassifier.java:39)
at org.tensorflow.yolo.YOLOClassifier.getInstance(YOLOClassifier.java:39)
... (the same frame repeats until the truncated end of the trace)

03-14 17:43:12.696 4017-4017/org.tensorflow.yolo D/Error: ERR: TOTAL BYTES WRITTEN: 39199384
03-14 17:43:12.696 4017-4017/org.tensorflow.yolo E/JavaBinder: !!! FAILED BINDER TRANSACTION !!! (parcel size = 39199476)
03-14 17:43:12.697 4017-4017/org.tensorflow.yolo E/AndroidRuntime: Error reporting crash
android.os.TransactionTooLargeException: data parcel size 39199476 bytes
at android.os.BinderProxy.transactNative(Native Method)
at android.os.BinderProxy.transact(Binder.java:511)
at android.app.ActivityManagerProxy.handleApplicationCrash(ActivityManagerNative.java:4617)
at com.android.internal.os.RuntimeInit$UncaughtHandler.uncaughtException(RuntimeInit.java:109)
at java.lang.ThreadGroup.uncaughtException(ThreadGroup.java:693)
at java.lang.ThreadGroup.uncaughtException(ThreadGroup.java:690)
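The trace shows getInstance() at YOLOClassifier.java:39 re-entering itself until the 8 MB stack is exhausted. A minimal sketch of a non-recursive lazy singleton is shown below; the field and constructor details are assumptions for illustration, not the repository's actual code:

// Sketch of a lazily initialized singleton whose getInstance() constructs the
// instance directly instead of calling itself, which avoids the unbounded
// recursion seen in the trace above.
public class YOLOClassifier {

    private static YOLOClassifier instance;  // assumed field name

    private YOLOClassifier() {
        // initialization must not call getInstance() again
    }

    public static synchronized YOLOClassifier getInstance() {
        if (instance == null) {
            instance = new YOLOClassifier();
        }
        return instance;
    }
}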

Put the .pb and .txt files in the assets folder

I followed the instructions and put the .pb and .txt files in the assets folder. After the app opens, it immediately crashes and closes. I trained YOLO v2 to do face detection; the output is 30 = 5*6. What is the reason?

Not getting predictions for custom model.

Hi,
I have trained a custom model using darkflow and converted my checkpoint file to a .pb file using the command you mentioned. I get predictions when I use the .pb file in darkflow, but I do not get any predictions when I use it in the mobile app. There are no crashes in the app either; I simply get no predictions.
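A common first debugging step in this situation (not taken from the repository) is to log the range of the raw output tensor before any threshold is applied: an all-zero or NaN-filled tensor points at a conversion or preprocessing problem, while plausible values suggest the confidence threshold or class count is filtering everything out. A hedged sketch with hypothetical names:

// Illustrative debugging helper: log the smallest and largest values in the
// raw output tensor produced by the network, before thresholding.
public final class OutputRangeLogger {

    public static void log(float[] rawOutput) {
        float min = Float.POSITIVE_INFINITY;
        float max = Float.NEGATIVE_INFINITY;
        for (float value : rawOutput) {
            min = Math.min(min, value);
            max = Math.max(max, value);
        }
        android.util.Log.d("YOLO-DEBUG", "raw output range: [" + min + ", " + max + "]");
    }
}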

Unable to locate .pb file in custom model

I have trained my model with a single object, 'pothole', and here is the .pb file: https://drive.google.com/file/d/10Y6w9tlHvpKCg-wXcLYhmteN9Va7r6X8/view?usp=sharing
After making the changes in the config file and running it, I am facing this issue.
05/24 21:11:56: Launching 'android-yolo-v2-master' on Pixel 3 API 23.
$ adb shell am start -n "org.tensorflow.yolo/org.tensorflow.yolo.view.ClassifierActivity" -a android.intent.action.MAIN -c android.intent.category.LAUNCHER
Connected to process 6183 on device 'Pixel_3_API_23 [emulator-5554]'.
Capturing and displaying logcat messages from application. This behavior can be disabled in the "Logcat output" section of the "Debugger" settings page.
D/AndroidRuntime: Shutting down VM
E/AndroidRuntime: FATAL EXCEPTION: main
Process: org.tensorflow.yolo, PID: 6183
java.lang.RuntimeException: Failed to load model from 'file:///android_asset/tiny-yolo-voc-graph.pb'
at org.tensorflow.contrib.android.TensorFlowInferenceInterface.<init>(TensorFlowInferenceInterface.java:113)
at org.tensorflow.yolo.TensorFlowImageRecognizer.create(TensorFlowImageRecognizer.java:42)
at org.tensorflow.yolo.view.ClassifierActivity.onPreviewSizeChosen(ClassifierActivity.java:55)
at org.tensorflow.yolo.view.CameraActivity.lambda$setFragment$0$CameraActivity(CameraActivity.java:116)
at org.tensorflow.yolo.view.CameraActivity$$Lambda$0.onPreviewSizeChosen(Unknown Source)
at org.tensorflow.yolo.view.CameraConnectionFragment.setUpCameraOutputs(CameraConnectionFragment.java:291)
at org.tensorflow.yolo.view.CameraConnectionFragment.openCamera(CameraConnectionFragment.java:298)
at org.tensorflow.yolo.view.CameraConnectionFragment.access$000(CameraConnectionFragment.java:52)
at org.tensorflow.yolo.view.CameraConnectionFragment$1.onSurfaceTextureAvailable(CameraConnectionFragment.java:163)
at android.view.TextureView.getHardwareLayer(TextureView.java:368)
at android.view.View.updateDisplayListIfDirty(View.java:15151)
at android.view.View.draw(View.java:15948)
at android.view.ViewGroup.drawChild(ViewGroup.java:3609)
at android.view.ViewGroup.dispatchDraw(ViewGroup.java:3399)
at android.view.View.updateDisplayListIfDirty(View.java:15169)
at android.view.View.draw(View.java:15948)
at android.view.ViewGroup.drawChild(ViewGroup.java:3609)
at android.view.ViewGroup.dispatchDraw(ViewGroup.java:3399)
at android.view.View.draw(View.java:16181)
at android.view.View.updateDisplayListIfDirty(View.java:15174)
at android.view.View.draw(View.java:15948)
at android.view.ViewGroup.drawChild(ViewGroup.java:3609)
at android.view.ViewGroup.dispatchDraw(ViewGroup.java:3399)
at android.view.View.updateDisplayListIfDirty(View.java:15169)
at android.view.View.draw(View.java:15948)
at android.view.ViewGroup.drawChild(ViewGroup.java:3609)
at android.view.ViewGroup.dispatchDraw(ViewGroup.java:3399)
at android.view.View.updateDisplayListIfDirty(View.java:15169)
at android.view.View.draw(View.java:15948)
at android.view.ViewGroup.drawChild(ViewGroup.java:3609)
at android.view.ViewGroup.dispatchDraw(ViewGroup.java:3399)
at android.view.View.draw(View.java:16181)
at com.android.internal.policy.PhoneWindow$DecorView.draw(PhoneWindow.java:2690)
at android.view.View.updateDisplayListIfDirty(View.java:15174)
at android.view.ThreadedRenderer.updateViewTreeDisplayList(ThreadedRenderer.java:281)
at android.view.ThreadedRenderer.updateRootDisplayList(ThreadedRenderer.java:287)
at android.view.ThreadedRenderer.draw(ThreadedRenderer.java:322)
at android.view.ViewRootImpl.draw(ViewRootImpl.java:2615)
at android.view.ViewRootImpl.performDraw(ViewRootImpl.java:2434)
at android.view.ViewRootImpl.performTraversals(ViewRootImpl.java:2067)
at android.view.ViewRootImpl.doTraversal(ViewRootImpl.java:1107)
at android.view.ViewRootImpl$TraversalRunnable.run(ViewRootImpl.java:6013)
at android.view.Choreographer$CallbackRecord.run(Choreographer.java:858)
at android.view.Choreographer.doCallbacks(Choreographer.java:670)
at android.view.Choreographer.doFrame(Choreographer.java:606)
at android.view.Choreographer$FrameDisplayEventReceiver.run(Choreographer.java:844)
at android.os.Handler.handleCallback(Handler.java:739)
at android.os.Handler.dispatchMessage(Handler.java:95)
at android.os.Looper.loop(Looper.java:148)
at android.app.ActivityThread.main(ActivityThread.java:5417)
at java.lang.reflect.Method.invoke(Native Method)
at com.android.internal.os.ZygoteInit$MethodAndArgsCaller.run(ZygoteInit.java:726)
at com.android.internal.os.ZygoteInit.main(ZygoteInit.java:616)
Caused by: java.io.IOException: Not a valid TensorFlow Graph serialization: NodeDef mentions attr 'explicit_paddings' not in Op<name=Conv2D; signature=input:T, filter:T -> output:T; attr=T:type,allowed=[DT_HALF, DT_BFLOAT16, DT_FLOAT, DT_DOUBLE]; attr=strides:

I tried changing the TensorFlow version and running it again, but it didn't work.
Can you please help resolve it?

Unable to find method 'org.gradle.api/Lorg/gradle/api/tasks/TaskInputs;'.

Unable to find method 'org.gradle.api.internal.TaskInputsInternal.property(Ljava/lang/String;Ljava/lang/Object;)Lorg/gradle/api/tasks/TaskInputs;'.
Possible causes for this unexpected error include:
Gradle's dependency cache may be corrupt (this sometimes occurs after a network connection timeout.)
Re-download dependencies and sync project (requires network)


It seems that I'm using Gradle v5 and your code targets the Gradle v4 API. Am I right?
I'm a beginner... can you update your code to the Gradle v5 API?

Cannot run app in API level 23

Hi. I tried to change the minSdkVersion to 23, but the app crashes. The Camera2 API is available from SDK version 21, so why is the app crashing? The error message is as follows:

java.lang.RuntimeException: Unable to start activity ComponentInfo{org.tensorflow.yolo/org.tensorflow.yolo.view.ClassifierActivity}: android.view.InflateException: Binary XML file line #0: Binary XML file line #0: Error inflating class org.tensorflow.yolo.view.OverlayView
at android.app.ActivityThread.performLaunchActivity(ActivityThread.java:3319)
at android.app.ActivityThread.handleLaunchActivity(ActivityThread.java:3415)
at android.app.ActivityThread.access$1100(ActivityThread.java:229)
at android.app.ActivityThread$H.handleMessage(ActivityThread.java:1821)
at android.os.Handler.dispatchMessage(Handler.java:102)
at android.os.Looper.loop(Looper.java:148)
at android.app.ActivityThread.main(ActivityThread.java:7325)
at java.lang.reflect.Method.invoke(Native Method)
at com.android.internal.os.ZygoteInit$MethodAndArgsCaller.run(ZygoteInit.java:1230)
at com.android.internal.os.ZygoteInit.main(ZygoteInit.java:1120)
Caused by: android.view.InflateException: Binary XML file line #0: Binary XML file line #0: Error inflating class org.tensorflow.yolo.view.OverlayView
at android.view.LayoutInflater.inflate(LayoutInflater.java:551)
at android.view.LayoutInflater.inflate(LayoutInflater.java:429)
at org.tensorflow.yolo.view.CameraConnectionFragment.onCreateView(CameraConnectionFragment.java:135)
at android.app.Fragment.performCreateView(Fragment.java:2281)
at android.app.FragmentManagerImpl.moveToState(FragmentManager.java:984)
at android.app.FragmentManagerImpl.moveToState(FragmentManager.java:1164)
at android.app.BackStackRecord.run(BackStackRecord.java:793)
at android.app.FragmentManagerImpl.execPendingActions(FragmentManager.java:1557)
at android.app.FragmentController.execPendingActions(FragmentController.java:326)
at android.app.Activity.performStart(Activity.java:6942)
at android.app.ActivityThread.performLaunchActivity(ActivityThread.java:3276)
at android.app.ActivityThread.handleLaunchActivity(ActivityThread.java:3415) 
at android.app.ActivityThread.access$1100(ActivityThread.java:229) 
at android.app.ActivityThread$H.handleMessage(ActivityThread.java:1821) 
at android.os.Handler.dispatchMessage(Handler.java:102) 
at android.os.Looper.loop(Looper.java:148) 
at android.app.ActivityThread.main(ActivityThread.java:7325) 
at java.lang.reflect.Method.invoke(Native Method) 
at com.android.internal.os.ZygoteInit$MethodAndArgsCaller.run(ZygoteInit.java:1230) 
at com.android.internal.os.ZygoteInit.main(ZygoteInit.java:1120) 
Caused by: android.view.InflateException: Binary XML file line #0: Error inflating class org.tensorflow.yolo.view.OverlayView
at android.view.LayoutInflater.createView(LayoutInflater.java:657)
at android.view.LayoutInflater.createViewFromTag(LayoutInflater.java:776)
at android.view.LayoutInflater.createViewFromTag(LayoutInflater.java:716)
at android.view.LayoutInflater.rInflate(LayoutInflater.java:847)
at android.view.LayoutInflater.rInflateChildren(LayoutInflater.java:810)
at android.view.LayoutInflater.inflate(LayoutInflater.java:527)
at android.view.LayoutInflater.inflate(LayoutInflater.java:429) 
at org.tensorflow.yolo.view.CameraConnectionFragment.onCreateView(CameraConnectionFragment.java:135) 
at android.app.Fragment.performCreateView(Fragment.java:2281) 
at android.app.FragmentManagerImpl.moveToState(FragmentManager.java:984) 
at android.app.FragmentManagerImpl.moveToState(FragmentManager.java:1164) 
at android.app.BackStackRecord.run(BackStackRecord.java:793) 
at android.app.FragmentManagerImpl.execPendingActions(FragmentManager.java:1557) 
at android.app.FragmentController.execPendingActions(FragmentController.java:326) 
at android.app.Activity.performStart(Activity.java:6942) 
at android.app.ActivityThread.performLaunchActivity(ActivityThread.java:3276) 
at android.app.ActivityThread.handleLaunchActivity(ActivityThread.java:3415) 
at android.app.ActivityThread.access$1100(ActivityThread.java:229) 
at android.app.ActivityThread$H.handleMessage(ActivityThread.java:1821) 
at android.os.Handler.dispatchMessage(Handler.java:102) 
at android.os.Looper.loop(Looper.java:148) 
at android.app.ActivityThread.main(ActivityThread.java:7325) 
at java.lang.reflect.Method.invoke(Native Method) 
at com.android.internal.os.ZygoteInit$MethodAndArgsCaller.run(ZygoteInit.java:1230) 
at com.android.internal.os.ZygoteInit.main(ZygoteInit.java:1120) 
Caused by: java.lang.reflect.InvocationTargetException
at java.lang.reflect.Constructor.newInstance(Native Method)
at android.view.LayoutInflater.createView(LayoutInflater.java:631)
at android.view.LayoutInflater.createViewFromTag(LayoutInflater.java:776) 
at android.view.LayoutInflater.createViewFromTag(LayoutInflater.java:716) 
at android.view.LayoutInflater.rInflate(LayoutInflater.java:847) 
at android.view.LayoutInflater.rInflateChildren(LayoutInflater.java:810) 
at android.view.LayoutInflater.inflate(LayoutInflater.java:527) 
at android.view.LayoutInflater.inflate(LayoutInflater.java:429) 
at org.tensorflow.yolo.view.CameraConnectionFragment.onCreateView(CameraConnectionFragment.java:135) 
at android.app.Fragment.performCreateView(Fragment.java:2281) 
at android.app.FragmentManagerImpl.moveToState(FragmentManager.java:984) 
at android.app.FragmentManagerImpl.moveToState(FragmentManager.java:1164) 
at android.app.BackStackRecord.run(BackStackRecord.java:793) 
at android.app.FragmentManagerImpl.execPendingActions(FragmentManager.java:1557) 
at android.app.FragmentController.execPendingActions(FragmentController.java:326) 
at android.app.Activity.performStart(Activity.java:6942) 
at android.app.ActivityThread.performLaunchActivity(ActivityThread.java:3276) 
at android.app.ActivityThread.handleLaunchActivity(ActivityThread.java:3415) 
at android.app.ActivityThread.access$1100(ActivityThread.java:229) 
at android.app.ActivityThread$H.handleMessage(ActivityThread.java:1821) 
at android.os.Handler.dispatchMessage(Handler.java:102) 
at android.os.Looper.loop(Looper.java:148) 
at android.app.ActivityThread.main(ActivityThread.java:7325) 
at java.lang.reflect.Method.invoke(Native Method) 
at com.android.internal.os.ZygoteInit$MethodAndArgsCaller.run(ZygoteInit.java:1230) 
at com.android.internal.os.ZygoteInit.main(ZygoteInit.java:1120) 
Caused by: java.lang.NoSuchMethodError: No virtual method lines()Ljava/util/stream/Stream; in class Ljava/io/BufferedReader; or its super classes (declaration of 'java.io.BufferedReader' appears in /system/framework/core-libart.jar)
at org.tensorflow.yolo.util.ClassAttrProvider.init(ClassAttrProvider.java:40)
at org.tensorflow.yolo.util.ClassAttrProvider.<init>(ClassAttrProvider.java:27)
at org.tensorflow.yolo.util.ClassAttrProvider.newInstance(ClassAttrProvider.java:32)
at org.tensorflow.yolo.view.OverlayView.<init>(OverlayView.java:40)
at java.lang.reflect.Constructor.newInstance(Native Method) 
at android.view.LayoutInflater.createView(LayoutInflater.java:631) 
at android.view.LayoutInflater.createViewFromTag(LayoutInflater.java:776) 
at android.view.LayoutInflater.createViewFromTag(LayoutInflater.java:716) 
at android.view.LayoutInflater.rInflate(LayoutInflater.java:847) 
at android.view.LayoutInflater.rInflateChildren(LayoutInflater.java:810) 
at android.view.LayoutInflater.inflate(LayoutInflater.java:527) 
at android.view.LayoutInflater.inflate(LayoutInflater.java:429) 
at org.tensorflow.yolo.view.CameraConnectionFragment.onCreateView(CameraConnectionFragment.java:135) 
at android.app.Fragment.performCreateView(Fragment.java:2281) 
at android.app.FragmentManagerImpl.moveToState(FragmentManager.java:984) 
at android.app.FragmentManagerImpl.moveToState(FragmentManager.java:1164) 
at android.app.BackStackRecord.run(BackStackRecord.java:793) 
at android.app.FragmentManagerImpl.execPendingActions(FragmentManager.java:1557) 
at android.app.FragmentController.execPendingActions(FragmentController.java:326) 
at android.app.Activity.performStart(Activity.java:6942) 
at android.app.ActivityThread.performLaunchActivity(ActivityThread.java:3276) 
at android.app.ActivityThread.handleLaunchActivity(ActivityThread.java:3415) 
at android.app.ActivityThread.access$1100(ActivityThread.java:229) 
at android.app.ActivityThread$H.handleMessage(ActivityThread.java:1821) 
at android.os.Handler.dispatchMessage(Handler.java:102) 
at android.os.Looper.loop(Looper.java:148) 
at android.app.ActivityThread.main(ActivityThread.java:7325) 
at java.lang.reflect.Method.invoke(Native Method) 
at com.android.internal.os.ZygoteInit$MethodAndArgsCaller.run(ZygoteInit.java:1230) 
at com.android.internal.os.ZygoteInit.main(ZygoteInit.java:1120) 
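The root cause in the last frames is that BufferedReader.lines() (java.util.stream) is only available on Android API 24 and above, so it is missing on an API 23 device. A hedged, API-23-safe way to read the label file is sketched below; the class and method names are illustrative, not the repository's ClassAttrProvider:

import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStream;
import java.io.InputStreamReader;
import java.nio.charset.StandardCharsets;
import java.util.ArrayList;
import java.util.List;

// Reads labels line by line with readLine(), which exists on every Android
// API level, instead of BufferedReader.lines(), which requires API 24+.
public final class LabelReader {

    public static List<String> readLabels(InputStream in) throws IOException {
        List<String> labels = new ArrayList<>();
        try (BufferedReader reader =
                     new BufferedReader(new InputStreamReader(in, StandardCharsets.UTF_8))) {
            String line;
            while ((line = reader.readLine()) != null) {
                if (!line.trim().isEmpty()) {
                    labels.add(line.trim());
                }
            }
        }
        return labels;
    }
}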

Accuracy

Hi,
Thanks for your work.
I just gave it a run to get a feeling for how it works.
The network is not really giving the right object: it predicts "tv monitor" while I am showing it a bottle.
I tried it on an Android phone without a GPU and on another one with a GPU; they needed almost the same time. I also noticed that the phone with the GPU heats up quickly. Maybe capturing a single image and sending it for prediction would be a better idea than continuously sending frames.

One last thing: any recommendations for the network size, i.e. the .pb file? I have around 1800 images with many objects inside them.

Conversion of .weights file to .pb file by darkflow

Hi,
I converted the .weights file trained with darknet into a .pb file using darkflow and installed the APK on an Android phone, but the application fails to launch.
Did the conversion fail?

procedure:

  1. $ cd ~/darknet
  2. $ ./darknet detector train cfg/voc.data cfg/yolov2-tiny-voc.cfg bin/darknet19_448.conv.23
  3. $ cd ~/darkflow
  4. $ ./flow --model cfg/yolov2-tiny-voc.cfg --load bin/yolov2-tiny-voc-100.weights --savepb

result:
/home/keides2/darkflow/darkflow/dark/darknet.py:54: UserWarning: ./cfg/yolov2-tiny-voc_100.cfg not found, use cfg/yolov2-tiny-voc.cfg instead
  cfg_path, FLAGS.model))
Parsing cfg/yolov2-tiny-voc.cfg
Loading bin/yolov2-tiny-voc-100.weights ...
Successfully identified 63082056 bytes
Finished in 0.0044057369232177734s

Building net ...
Source | Train? | Layer description              | Output size
-------+--------+--------------------------------+---------------
       |        | input                          | (?, 416, 416, 3)
 Load  | Yep!   | Conv 3x3p1_1 +bnorm leaky      | (?, 416, 416, 16)
 Load  | Yep!   | Maxp 2x2p0_2                   | (?, 208, 208, 16)
 Load  | Yep!   | Conv 3x3p1_1 +bnorm leaky      | (?, 208, 208, 32)
 Load  | Yep!   | Maxp 2x2p0_2                   | (?, 104, 104, 32)
 Load  | Yep!   | Conv 3x3p1_1 +bnorm leaky      | (?, 104, 104, 64)
 Load  | Yep!   | Maxp 2x2p0_2                   | (?, 52, 52, 64)
 Load  | Yep!   | Conv 3x3p1_1 +bnorm leaky      | (?, 52, 52, 128)
 Load  | Yep!   | Maxp 2x2p0_2                   | (?, 26, 26, 128)
 Load  | Yep!   | Conv 3x3p1_1 +bnorm leaky      | (?, 26, 26, 256)
 Load  | Yep!   | Maxp 2x2p0_2                   | (?, 13, 13, 256)
 Load  | Yep!   | Conv 3x3p1_1 +bnorm leaky      | (?, 13, 13, 512)
 Load  | Yep!   | Maxp 2x2p0_1                   | (?, 13, 13, 512)
 Load  | Yep!   | Conv 3x3p1_1 +bnorm leaky      | (?, 13, 13, 1024)
 Load  | Yep!   | Conv 3x3p1_1 +bnorm leaky      | (?, 13, 13, 1024)
 Load  | Yep!   | Conv 1x1p0_1 linear            | (?, 13, 13, 30)
-------+--------+--------------------------------+---------------
Running entirely on CPU
2018-07-02 13:26:24.956143: I tensorflow/core/platform/cpu_feature_guard.cc:140] Your CPU supports instructions that this TensorFlow binary was not compiled to use: AVX2 FMA
2018-07-02 13:26:25.064272: I tensorflow/stream_executor/cuda/cuda_gpu_executor.cc:898] successful NUMA node read from SysFS had negative value (-1), but there must always be at least one NUMA node, so returning NUMA node zero
2018-07-02 13:26:25.065139: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1356] Found device 0 with properties:
name: GeForce GTX 1080 major: 6 minor: 1 memoryClockRate(GHz): 1.759
pciBusID: 0000:01:00.0
totalMemory: 7.93GiB freeMemory: 7.22GiB
2018-07-02 13:26:25.065176: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1435] Adding visible gpu devices: 0
2018-07-02 13:26:26.025166: I tensorflow/core/common_runtime/gpu/gpu_device.cc:923] Device interconnect StreamExecutor with strength 1 edge matrix:
2018-07-02 13:26:26.025209: I tensorflow/core/common_runtime/gpu/gpu_device.cc:929]      0
2018-07-02 13:26:26.025235: I tensorflow/core/common_runtime/gpu/gpu_device.cc:942] 0:   N
Finished in 2.7535111904144287s

Rebuild a constant version ...
2018-07-02 13:26:26.655822: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1435] Adding visible gpu devices: 0
2018-07-02 13:26:26.655875: I tensorflow/core/common_runtime/gpu/gpu_device.cc:923] Device interconnect StreamExecutor with strength 1 edge matrix:
2018-07-02 13:26:26.655886: I tensorflow/core/common_runtime/gpu/gpu_device.cc:929]      0
2018-07-02 13:26:26.655896: I tensorflow/core/common_runtime/gpu/gpu_device.cc:942] 0:   N
2018-07-02 13:26:26.663997: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1435] Adding visible gpu devices: 0
2018-07-02 13:26:26.664021: I tensorflow/core/common_runtime/gpu/gpu_device.cc:923] Device interconnect StreamExecutor with strength 1 edge matrix:
2018-07-02 13:26:26.664029: I tensorflow/core/common_runtime/gpu/gpu_device.cc:929]      0
2018-07-02 13:26:26.664034: I tensorflow/core/common_runtime/gpu/gpu_device.cc:942] 0:   N
Created TensorFlow device (/job:localhost/replica:0/task:0/device:GPU:0 with 6972 MB memory) -> physical GPU (device: 0, name: GeForce GTX 1080, pci bus id: 0000:01:00.0, compute capability: 6.1)
Done
(tensorflow) [keides2@bslpc ~/darkflow] $

.tflite

First, thank you.
My question: Does it work with .tflite?

Crashing when using yolo v2

I have trained a YOLO v2 model on my own dataset. Its weights file is around 270 MB. When I try to run it in my app, it keeps crashing. What do I do?

App crash on the simulator

Hello, when I try to build and run the repo on the emulator (Genymotion, Nexus 5, API 23, Android 6.0), the app can get the camera image but then quickly quits. I wonder whether it is because of the emulator or the code itself?

The error looks like:
dequeueBuffer: createGraphicBuffer failed
can't dequeue multiple buffers without setting the buffer count
at android.hardware.camera2.legacy.SurfaceTextureRenderer.checkEglError(SurfaceTextureRenderer.java:487)
at android.hardware.camera2.legacy.SurfaceTextureRenderer.swapBuffers(SurfaceTextureRenderer.java:480)
at android.hardware.camera2.legacy.SurfaceTextureRenderer.drawIntoSurfaces(SurfaceTextureRenderer.java:681)
at android.hardware.camera2.legacy.GLThreadManager$1.handleMessage(GLThreadManager.java:103)
at android.os.Handler.dispatchMessage(Handler.java:98)
at android.os.Looper.loop(Looper.java:135)
at android.os.HandlerThread.run(HandlerThread.java:61)

Object Detection Not working in Landscape

Hi
Object detection is not working in landscape, but it is working in portrait. Can anybody please tell me why object detection does not work in landscape on Android?
Thanks!

Unable to load the pb file from Custom training model

Hi Szaza,

Thanks a lot for your contribution. I am using Windows 10, and I have trained a custom model using tiny-yolo to identify 2 classes.
Now I would like to load it into an Android app; however, I am unable to use the app on my phone because it keeps crashing.
Is it because I trained my custom model using TensorFlow 1.14.0?
As I am new to Android Studio and app development, I would really appreciate some guidance. Thanks.
Here is the error that I am receiving when I start the app on my mobile device.

2019-12-29 18:41:56.072 28172-28172/org.tensorflow.yolo E/AndroidRuntime: FATAL EXCEPTION: main
Process: org.tensorflow.yolo, PID: 28172
java.lang.RuntimeException: Failed to load model from 'file:///android_asset/tiny-yolo-voc-graph.pb'
at org.tensorflow.contrib.android.TensorFlowInferenceInterface.<init>(TensorFlowInferenceInterface.java:113)
at org.tensorflow.yolo.TensorFlowImageRecognizer.create(TensorFlowImageRecognizer.java:42)
at org.tensorflow.yolo.view.ClassifierActivity.onPreviewSizeChosen(ClassifierActivity.java:55)
at org.tensorflow.yolo.view.CameraActivity.lambda$setFragment$0$CameraActivity(CameraActivity.java:116)
at org.tensorflow.yolo.view.-$$Lambda$CameraActivity$SAOsaS8oaXCIIH441aIwhMnb5HU.onPreviewSizeChosen(Unknown Source:2)
at org.tensorflow.yolo.view.CameraConnectionFragment.setUpCameraOutputs(CameraConnectionFragment.java:291)
at org.tensorflow.yolo.view.CameraConnectionFragment.openCamera(CameraConnectionFragment.java:298)
at org.tensorflow.yolo.view.CameraConnectionFragment.access$000(CameraConnectionFragment.java:52)
at org.tensorflow.yolo.view.CameraConnectionFragment$1.onSurfaceTextureAvailable(CameraConnectionFragment.java:163)
at android.view.TextureView.getTextureLayer(TextureView.java:390)
at android.view.TextureView.draw(TextureView.java:339)
at android.view.View.updateDisplayListIfDirty(View.java:20761)
at android.view.View.draw(View.java:21614)
at android.view.ViewGroup.drawChild(ViewGroup.java:4558)
at android.view.ViewGroup.dispatchDraw(ViewGroup.java:4333)
at android.view.View.updateDisplayListIfDirty(View.java:20747)
at android.view.View.draw(View.java:21614)
at android.view.ViewGroup.drawChild(ViewGroup.java:4558)
at android.view.ViewGroup.dispatchDraw(ViewGroup.java:4333)
at android.view.View.draw(View.java:21891)
at android.view.View.updateDisplayListIfDirty(View.java:20761)
at android.view.View.draw(View.java:21614)
at android.view.ViewGroup.drawChild(ViewGroup.java:4558)
at android.view.ViewGroup.dispatchDraw(ViewGroup.java:4333)
at android.view.View.updateDisplayListIfDirty(View.java:20747)
at android.view.View.draw(View.java:21614)
at android.view.ViewGroup.drawChild(ViewGroup.java:4558)
at android.view.ViewGroup.dispatchDraw(ViewGroup.java:4333)
at android.view.View.updateDisplayListIfDirty(View.java:20747)
at android.view.View.draw(View.java:21614)
at android.view.ViewGroup.drawChild(ViewGroup.java:4558)
at android.view.ViewGroup.dispatchDraw(ViewGroup.java:4333)
at android.view.View.draw(View.java:21891)
at com.android.internal.policy.DecorView.draw(DecorView.java:1082)
at android.view.View.updateDisplayListIfDirty(View.java:20761)
at android.view.ThreadedRenderer.updateViewTreeDisplayList(ThreadedRenderer.java:725)
at android.view.ThreadedRenderer.updateRootDisplayList(ThreadedRenderer.java:731)
at android.view.ThreadedRenderer.draw(ThreadedRenderer.java:840)
at android.view.ViewRootImpl.draw(ViewRootImpl.java:3935)
at android.view.ViewRootImpl.performDraw(ViewRootImpl.java:3709)
at android.view.ViewRootImpl.performTraversals(ViewRootImpl.java:3017)
at android.view.ViewRootImpl.doTraversal(ViewRootImpl.java:1876)
at android.view.ViewRootImpl$TraversalRunnable.run(ViewRootImpl.java:8499)
at android.view.Choreographer$CallbackRecord.run(Choreographer.java:949)
at android.view.Choreographer.doCallbacks(Choreographer.java:761)
at android.view.Choreographer.doFrame(Choreographer.java:696)
at android.view.Choreographer$FrameDisplayEventReceiver.run(Choreographer.java:935)
at android.os.Handler.handleCallback(Handler.java:873)
at android.os.Handler.dispatchMessage(Handler.java:99)
at android.os.Looper.loop(Looper.java:214)
at android.app.ActivityThread.main(ActivityThread.java:7037)
at java.lang.reflect.Method.invoke(Native Method)
at com.android.internal.os.RuntimeInit$MethodAndArgsCaller.run(RuntimeInit.java:494)
at com.android.internal.os.ZygoteInit.main(ZygoteInit.java:965)
2019-12-29 18:41:56.072 28172-28172/org.tensorflow.yolo E/AndroidRuntime: Caused by: java.io.IOException: Not a valid TensorFlow Graph serialization: NodeDef mentions attr 'explicit_paddings' not in Op<name=Conv2D; signature=input:T, filter:T -> output:T; attr=T:type,allowed=[DT_HALF, DT_BFLOAT16, DT_FLOAT, DT_DOUBLE]; attr=strides:list(int); attr=use_cudnn_on_gpu:bool,default=true; attr=padding:string,allowed=["SAME", "VALID"]; attr=data_format:string,default="NHWC",allowed=["NHWC", "NCHW"]; attr=dilations:list(int),default=[1, 1, 1, 1]>; NodeDef: {{node 0-convolutional}}. (Check whether your GraphDef-interpreting binary is up to date with your GraphDef-generating binary.).
at org.tensorflow.contrib.android.TensorFlowInferenceInterface.loadGraph(TensorFlowInferenceInterface.java:561)
at org.tensorflow.contrib.android.TensorFlowInferenceInterface.<init>(TensorFlowInferenceInterface.java:105)
... 53 more

Node 'output' does not exist in model

This is my custom .pb file for single-object detection. I'm getting this error:
Process: org.tensorflow.yolo, PID: 23066
java.lang.RuntimeException: Node 'output' does not exist in model 'file:///android_asset/tiny-yolo-voc-graph.pb'
at org.tensorflow.contrib.android.TensorFlowInferenceInterface.graphOperation(TensorFlowInferenceInterface.java:257)
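One way to diagnose this is to print the operation names the frozen graph actually contains and compare them with the output node name the app expects; darkflow-exported graphs often name the final node something other than 'output'. The sketch below assumes the contrib TensorFlowInferenceInterface exposes its Graph via graph(); adapt it if your version differs:

import android.content.res.AssetManager;
import android.util.Log;

import org.tensorflow.Graph;
import org.tensorflow.Operation;
import org.tensorflow.contrib.android.TensorFlowInferenceInterface;

import java.util.Iterator;

// Hedged diagnostic: list every operation name in the frozen graph so the
// expected output node name can be verified against the actual graph.
public final class GraphInspector {

    public static void logOperationNames(AssetManager assets, String modelFile) {
        TensorFlowInferenceInterface inference =
                new TensorFlowInferenceInterface(assets, modelFile);
        Graph graph = inference.graph();  // assumed accessor on the contrib class
        Iterator<Operation> ops = graph.operations();
        while (ops.hasNext()) {
            Log.d("GraphInspector", "op: " + ops.next().name());
        }
    }
}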

Change camera source

Hello,

I am developing a drone object detection application with a DJI drone. I have my trained model, and it works after changing the assets as you explained in the guide.

I am now trying to change the camera source to the DJI camera, whose minimum resolution is 1280x720. I am able to run the detector on the drone video source, but the bitmap on which the detection runs is distorted and the bounding boxes displayed are meaningless.

I would like to better understand how the detection process works in this application and how you correctly produce the 416x416 bitmap from the Android camera, so that I can process the video stream coming from the drone camera in the same way. Thanks a lot for your time.

BR
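As a hedged illustration (not the repository's actual code path), one common way to avoid distortion with a 1280x720 source is to center-crop the frame to a square before scaling it to the 416x416 network input; bounding boxes then have to be mapped back to the cropped region rather than to the full frame:

import android.graphics.Bitmap;

// Center-crop a wide frame (e.g. 1280x720) to a square and scale it to the
// network input size so the aspect ratio is not distorted. INPUT_SIZE = 416
// matches tiny YOLO v2; everything else is illustrative.
public final class FramePreprocessor {

    private static final int INPUT_SIZE = 416;

    public static Bitmap toNetworkInput(Bitmap frame) {
        int side = Math.min(frame.getWidth(), frame.getHeight());
        int xOffset = (frame.getWidth() - side) / 2;
        int yOffset = (frame.getHeight() - side) / 2;
        Bitmap square = Bitmap.createBitmap(frame, xOffset, yOffset, side, side);
        return Bitmap.createScaledBitmap(square, INPUT_SIZE, INPUT_SIZE, true);
    }
}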

Can't Compile it

Unable to find method 'org.gradle.api.internal.TaskInputsInternal.property(Ljava/lang/String;Ljava/lang/Object;)Lorg/gradle/api/tasks/TaskInputs;'.
Possible causes for this unexpected error include:
Gradle's dependency cache may be corrupt (this sometimes occurs after a network connection timeout.)
Re-download dependencies and sync project (requires network)

The state of a Gradle build process (daemon) may be corrupt. Stopping all Gradle daemons may solve this problem.
Stop Gradle build processes (requires restart)

Your project may be using a third-party plugin which is not compatible with the other plugins in the project or the version of Gradle requested by the project.

In the case of corrupt Gradle processes, you can also try closing the IDE and then killing all Java processes.

Black Screen

When I run the app it is just showing a black screen and nothing else.

About misjudgment measures

@szaza

Hi,
The results learned with darknet are very good (loss = 0.15, avg loss = 0.13, precision = 0.87, recall = 0.97, F1-score = 0.92, mAP = 0.96). However, after converting the weights file to a .pb file and loading it into android-yolo-v2, detection reacts too quickly and classifies the photographed object instantly, so it sometimes misjudges.

Can I change the code so that object detection starts after the same object appears for 1 second or more?

Alternatively, I would like to change it so that a detection is only reported after the same result occurs three consecutive times. Where should I fix the code?

Thank you in advance,

keides2
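A hedged sketch of the second idea (only report a result after it has been the top detection three frames in a row) is shown below; where it hooks into the app, and all names, are assumptions rather than the repository's code:

// Only report a label once it has been the top detection for N consecutive
// frames; intermediate, unstable results are suppressed.
public final class DetectionStabilizer {

    private final int requiredStreak;
    private String lastLabel;
    private int streak;

    public DetectionStabilizer(int requiredStreak) {
        this.requiredStreak = requiredStreak;
    }

    /** Returns the label once seen requiredStreak times in a row, otherwise null. */
    public String onDetection(String label) {
        if (label != null && label.equals(lastLabel)) {
            streak++;
        } else {
            lastLabel = label;
            streak = 1;
        }
        return streak >= requiredStreak ? label : null;
    }
}

For example, new DetectionStabilizer(3) would suppress a detection until the classifier has returned the same label three times in a row.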

java.lang.ArrayIndexOutOfBoundsException

Hi, I made my own .pb file. However, I got an error that I do not understand.

The .pb file itself seems to load fine:

10/15 10:23:57: Launching android-yolo-v2-master
$ adb install-multiple -r -t -p org.tensorflow.yolo C:\Users\pine\Desktop\Tensorflow Object Dection\Yolo\android-yolo-v2-master\gradleBuild\intermediates\split-apk\debug\dep\dependencies.apk C:\Users\pine\Desktop\Tensorflow Object Dection\Yolo\android-yolo-v2-master\gradleBuild\intermediates\split-apk\debug\slices\slice_5.apk C:\Users\pine\Desktop\Tensorflow Object Dection\Yolo\android-yolo-v2-master\gradleBuild\intermediates\split-apk\debug\slices\slice_8.apk C:\Users\pine\Desktop\Tensorflow Object Dection\Yolo\android-yolo-v2-master\gradleBuild\outputs\apk\debug\android-yolo-v2-master-debug.apk
Split APKs installed
$ adb shell am start -n "org.tensorflow.yolo/org.tensorflow.yolo.view.ClassifierActivity" -a android.intent.action.MAIN -c android.intent.category.LAUNCHER
Client not ready yet..Waiting for process to come online
Connected to process 29722 on device samsung-sm_g950n-ce02171295db6c2501
Capturing and displaying logcat messages from application. This behavior can be disabled in the "Logcat output" section of the "Debugger" settings page.
E/Zygote: isWhitelistProcess - Process is Whitelisted
W/SELinux: SELinux selinux_android_compute_policy_index : Policy Index[2], Con:u:r:zygote:s0 RAM:SEPF_SM-G950N_8.0.0_0016, [-1 -1 -1 -1 0 1]
I/SELinux: SELinux: seapp_context_lookup: seinfo=untrusted, level=s0:c512,c768, pkgname=org.tensorflow.yolo
I/zygote64: Late-enabling -Xcheck:jni
D/ActivityThread: Added TimaKeyStore provider
I/vndksupport: sphal namespace is not configured for this process. Loading /vendor/lib64/egl/libGLES_mali.so from the current namespace instead.
D/libEGL: loaded /vendor/lib64/egl/libGLES_mali.so
I/InstantRun: starting instant run server: is main process
I/zygote64: Do partial code cache collection, code=30KB, data=17KB
After code cache collection, code=30KB, data=17KB
Increasing code cache capacity to 128KB
I/TextToSpeech: Sucessfully bound to com.samsung.SMT
D/OpenGLRenderer: HWUI GL Pipeline
D/ViewRootImpl@334c0e0[ClassifierActivity]: setView = DecorView@d7a8f99[ClassifierActivity] TM=true MM=false
I/TextToSpeech: Connected to ComponentInfo{com.samsung.SMT/com.samsung.SMT.SamsungTTSService}
D/ViewRootImpl@334c0e0[ClassifierActivity]: dispatchAttachedToWindow
I/TextToSpeech: Set up connection to ComponentInfo{com.samsung.SMT/com.samsung.SMT.SamsungTTSService}
V/Surface: sf_framedrop debug : 0x4f4c, game : false, logging : 0
D/ViewRootImpl@334c0e0[ClassifierActivity]: Relayout returned: old=[0,0][0,0] new=[0,0][1080,2220] result=0x7 surface={valid=true 544431398912} changed=true
D/ViewRootImpl@334c0e0[ClassifierActivity]: MSG_RESIZED_REPORT: frame=Rect(0, 0 - 1080, 2220) ci=Rect(0, 0 - 0, 144) vi=Rect(0, 0 - 0, 144) or=1
D/ViewRootImpl@334c0e0[ClassifierActivity]: MSG_WINDOW_FOCUS_CHANGED 1
V/InputMethodManager: Starting input: tba=android.view.inputmethod.EditorInfo@45b8c55 nm : org.tensorflow.yolo ic=null
I/InputMethodManager: startInputInner - mService.startInputOrWindowGainedFocus
I/OpenGLRenderer: Initialized EGL, version 1.4
D/OpenGLRenderer: Swap behavior 2
D/libGLESv1: STS_GLApi : DTS, ODTC are not allowed for Package : org.tensorflow.yolo
D/mali_winsys: EGLint new_window_surface(egl_winsys_display *, void *, EGLSurface, EGLConfig, egl_winsys_surface **, egl_color_buffer_format *, EGLBoolean) returns 0x3000, [1080x2220]-format:1
D/OpenGLRenderer: eglCreateWindowSurface = 0x7ed0a0fd10
V/Surface: sf_framedrop debug : 0x4f4c, game : false, logging : 0
I/CameraManagerGlobal: Connecting to camera service
D/VendorTagDescriptor: addVendorDescriptor: vendor tag id 3854507339 added
I/TensorFlowInferenceInterface: Checking to see if TensorFlow native methods are already loaded
E/zygote64: No implementation found for long org.tensorflow.contrib.android.RunStats.allocate() (tried Java_org_tensorflow_contrib_android_RunStats_allocate and Java_org_tensorflow_contrib_android_RunStats_allocate__)
I/TensorFlowInferenceInterface: TensorFlow native methods not found, attempting to load via tensorflow_inference
I/TensorFlowInferenceInterface: Successfully loaded TensorFlow native methods (RunStats error may be ignored)
I/TensorFlowInferenceInterface: Model load took 13ms, TensorFlow version: 1.6.0-rc1
Successfully loaded model from 'file:///android_asset/five_fist_20000_yoloGraph.pb'
I/YOLO: Sensor orientation: 90, Screen orientation: 0
I/YOLO: Initializing at size 640x480
V/Surface: sf_framedrop debug : 0x4f4c, game : false, logging : 0
I/YOLO: Opening camera preview: 640x480
V/Surface: sf_framedrop debug : 0x4f4c, game : false, logging : 0
V/InputMethodManager: Starting input: tba=android.view.inputmethod.EditorInfo@c89f10d nm : org.tensorflow.yolo ic=null
I/Choreographer: Skipped 41 frames! The application may be doing too much work on its main thread.
I/zygote64: Deoptimizing org.tensorflow.yolo.model.BoundingBox org.tensorflow.yolo.YOLOClassifier.getModel(float[], int, int, int, int, int) due to block bounds check elimination
E/AndroidRuntime: FATAL EXCEPTION: inference
Process: org.tensorflow.yolo, PID: 29722
java.lang.ArrayIndexOutOfBoundsException: length=3549; index=3549
at org.tensorflow.yolo.YOLOClassifier.getModel(YOLOClassifier.java:88)
at org.tensorflow.yolo.YOLOClassifier.classifyImage(YOLOClassifier.java:71)
at org.tensorflow.yolo.TensorFlowImageRecognizer.recognizeImage(TensorFlowImageRecognizer.java:50)
at org.tensorflow.yolo.view.ClassifierActivity.lambda$onImageAvailable$1$ClassifierActivity(ClassifierActivity.java:107)
at org.tensorflow.yolo.view.ClassifierActivity$$Lambda$1.run(Unknown Source:0)
at android.os.Handler.handleCallback(Handler.java:789)
at android.os.Handler.dispatchMessage(Handler.java:98)
at android.os.Looper.loop(Looper.java:164)
at android.os.HandlerThread.run(HandlerThread.java:65)
Application terminated.
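For context: the exception at YOLOClassifier.getModel means the parsing loop indexes past the end of the raw output array (length 3549 here), which usually indicates that the grid size, anchor count, or class count assumed by the app does not match what the custom graph produces. A hedged sanity check, with every constant treated as an assumption about a tiny-YOLO-v2-style head rather than the repository's actual values:

// Verify the raw output length matches what the parser expects before
// indexing into it. GRID_SIZE and ANCHORS are assumptions, not the app's
// actual constants.
public final class OutputShapeCheck {

    private static final int GRID_SIZE = 13;
    private static final int ANCHORS = 5;

    public static void check(float[] rawOutput, int classCount) {
        int expected = GRID_SIZE * GRID_SIZE * ANCHORS * (5 + classCount);
        if (rawOutput.length != expected) {
            throw new IllegalStateException("Raw output length " + rawOutput.length
                    + " does not match expected " + expected
                    + "; check the class count, anchor count and grid size.");
        }
    }
}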
