Comments (2)
Hi, thanks for your question!
If you are asking how we trained the model, the answer is that for our dataset (Table I in the paper) we only extract training/validation/test samples while the vehicle is either still occluded behind the first corner (e.g., the left one) while approaching, or within line of sight. No samples are extracted after the vehicle has disappeared again behind the other (e.g., right) corner. This means we do not explicitly train the model to give a particular response after the vehicle has crossed the view, and this situation is also not explicitly evaluated in the initial ablation study (Table II).
However, we do see some occurrence of this effect in our online experiments, where the trained model is applied in a sliding window. See, e.g., Figure 8(a): after the crossing, when the probability of "front" goes down, the probability of "right" indeed increases, as one might expect. Vice versa, the probability of "left" increases in Figure 8(b) after the vehicle has moved from right to left. So, while we did not explicitly train for this, in practice we observe that our model will indeed recognize that the vehicle has moved behind the other corner when applied in an online filter. It appears that the reflection patterns we classify are sufficiently similar, even though the vehicle is now moving away.
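For readers unfamiliar with the sliding-window setup mentioned above, here is a minimal sketch of how a per-window classifier could be applied to an audio stream to obtain a sequence of class probabilities. The window length, hop size, class names, and the dummy uniform classifier are all illustrative assumptions, not the paper's actual model or parameters.

```python
import numpy as np

# Illustrative class set; the real model's labels may differ.
CLASSES = ["none", "left", "front", "right"]

def sliding_windows(signal, window_len, hop):
    """Yield successive fixed-length windows over a 1-D signal."""
    for start in range(0, len(signal) - window_len + 1, hop):
        yield signal[start:start + window_len]

def classify_stream(signal, window_len, hop, classifier):
    """Apply a per-window classifier over the stream.

    Returns an (n_windows, n_classes) array of class probabilities,
    one row per window position."""
    probs = [classifier(w) for w in sliding_windows(signal, window_len, hop)]
    return np.array(probs)

# Dummy stand-in for a trained model: uniform probabilities per window.
def dummy_classifier(window):
    return np.full(len(CLASSES), 1.0 / len(CLASSES))

signal = np.zeros(48000)  # 1 s of silence at an assumed 48 kHz rate
probs = classify_stream(signal, window_len=4800, hop=2400,
                        classifier=dummy_classifier)
```

In an online filter, `dummy_classifier` would be replaced by the trained model, and the per-window probability rows could then be tracked over time, as in Figures 8(a) and 8(b).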
Hope this answers your question.
from occluded_vehicle_acoustic_detection.
Thanks for your prompt answer.
First of all, there have already been some methods using millimeter-wave radar and computer vision to detect non-line-of-sight vehicles. Your idea of using multi-channel acoustic information for non-line-of-sight vehicle detection is impressive.
Your explanation is also very detailed, and you give a plausible reason for the classification results. I fully agree: due to the similarity of the reflection patterns, the model also classifies vehicles that are moving away again. My question has been well answered.
Thank you for sharing your code on GitHub.