nubots / robocup
The NUbots' RoboCup Code
License: GNU General Public License v3.0
This repository's issue tracker covers multiple topics:
Our RANSAC implementation is remarkably good at what it does, and we should capitalise on it by continuing to improve it in various ways. List any ideas you have on this issue; it will help us find better approaches to RANSAC-style model fitting.
One paper I came across that looks like it could be useful is this one:
http://citeseerx.ist.psu.edu/viewdoc/summary?doi=10.1.1.134.4816
It already has an example of circle-finding, and fits closely with RANSAC's sample-and-see methods.
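For reference, the sample-and-see idea can be sketched in a few lines: repeatedly pick a minimal sample (three points determine a circle), fit a model, and keep the fit with the most inliers. This is a minimal illustrative sketch, not our actual implementation; parameter values are assumptions.

```python
import math
import random

def circle_from_3_points(p1, p2, p3):
    """Circumcircle of three non-collinear points; returns (cx, cy, r) or None."""
    ax, ay = p1
    bx, by = p2
    cx, cy = p3
    d = 2 * (ax * (by - cy) + bx * (cy - ay) + cx * (ay - by))
    if abs(d) < 1e-9:
        return None  # collinear sample; caller should just try again
    ux = ((ax**2 + ay**2) * (by - cy) + (bx**2 + by**2) * (cy - ay)
          + (cx**2 + cy**2) * (ay - by)) / d
    uy = ((ax**2 + ay**2) * (cx - bx) + (bx**2 + by**2) * (ax - cx)
          + (cx**2 + cy**2) * (bx - ax)) / d
    return ux, uy, math.hypot(ax - ux, ay - uy)

def ransac_circle(points, iterations=200, tolerance=2.0):
    """Sample-and-see: pick 3 points, fit a circle, keep the fit with most inliers."""
    best_model, best_inliers = None, []
    for _ in range(iterations):
        model = circle_from_3_points(*random.sample(points, 3))
        if model is None:
            continue
        cx, cy, r = model
        inliers = [p for p in points
                   if abs(math.hypot(p[0] - cx, p[1] - cy) - r) < tolerance]
        if len(inliers) > len(best_inliers):
            best_model, best_inliers = model, inliers
    return best_model, best_inliers
```

The same skeleton generalises to lines and other shapes by swapping the minimal sample size and the fit/distance functions.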
Currently, even given a perfectly classified ball, the detection method can fail to correctly fit the bottom. This is a big issue, as it leads to large errors in distance calculations.
This may be related to fixing the head pan job too.
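To see why a poorly fitted ball bottom hurts distance estimates, here is a pinhole-model sketch of the ground distance implied by the image row of the ball's bottom. All parameter values (camera height, focal length, image size) are illustrative assumptions, not our robot's calibration:

```python
import math

def ground_distance(v_pixel, camera_height_m=0.45, tilt_rad=0.0,
                    focal_px=550.0, principal_v=240.0):
    """Distance along the ground to a point imaged at row v_pixel,
    assuming a pinhole camera at the given height and tilt.
    (Illustrative parameters: 0.45 m camera height, 550 px focal length.)"""
    angle_below_axis = math.atan((v_pixel - principal_v) / focal_px)
    return camera_height_m / math.tan(tilt_rad + angle_below_axis)

# A 5-pixel error in the fitted ball bottom, for a ball a few metres away:
d_true = ground_distance(300.0)   # correct bottom row
d_off = ground_distance(295.0)    # bottom fitted 5 px too high
```

With these assumed numbers the 5-pixel error inflates the distance estimate from about 4.1 m to 4.5 m, roughly a 9% error, and the sensitivity grows the closer the bottom row is to the horizon.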
Using this issue system seems fairly worthwhile. It could help keep track of important things that more than one or two team members need to know about, and seems more efficient and reliable than leaving post-it notes on things, the lab's blackboard, or Facebook (although they all have their uses).
Any thoughts?
This is a bit of a phantom issue that I can't reproduce very easily, so I don't have a gdb dump of it yet. Everyone watch out for the phantom segfaults!
These classes are kind of bloated. There's a fair bit here I'm not too fond of.
(A bunch of the Actionator code in general looks a little questionable at the moment.)
I feel that we should review these classes after RoboCup, and look into making them neater and clearer.
A configurable object may only need to subscribe to changes in small sets of parameters stored in separate parts of the config tree.
Currently, the Configurable object would be forced to subscribe to all changes below some mutual ancestor of the parameters it needs (which, in the worst case, is the root node), causing it to be updated unnecessarily often (every time any parameter is changed, in the worst case).
A performance improvement would be to allow a set of base paths, so that the Configurable can subscribe only to what it needs.
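The base-path idea can be sketched as follows. All names here are hypothetical illustrations, not the real ConfigSystem API: each subscriber registers a set of base paths, and a parameter change only notifies subscribers whose base paths cover that parameter.

```python
class ConfigTree:
    """Sketch: notify a subscriber only when a changed parameter falls
    under one of its registered base paths (hypothetical API)."""

    def __init__(self):
        self._subscribers = []  # list of (base_paths, callback)

    def subscribe(self, base_paths, callback):
        self._subscribers.append((list(base_paths), callback))

    def set_param(self, path, value):
        for base_paths, callback in self._subscribers:
            # Notify only if the change is at or below one of the base paths.
            if any(path == b or path.startswith(b + ".") for b in base_paths):
                callback(path, value)

tree = ConfigTree()
updates = []
tree.subscribe(["vision.goal", "behaviour.kick"], lambda p, v: updates.append(p))
tree.set_param("vision.goal.min_width", 4)  # under a base path: notified
tree.set_param("localisation.noise", 0.1)   # no base path matches: ignored
```

Compared with subscribing at a mutual ancestor, the subscriber above is never woken by unrelated branches such as `localisation`.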
Occasionally (always?) gets the green horizon segfault (i.e. when the camera is not connected). Probably needs a new camera.
Upgrade the OS on all NUbots before we leave.
Sometimes the camera settings are not set properly.
Robots 5 and 6 have had their firmware updated/downgraded, but they don't work correctly.
We now have more complicated shapes detected by vision, but the field-object classes only handle basic descriptors (screen point and size), making it difficult for NUbugger to display exactly what vision outputs.
As discovered today, the CPU/Ubuntu throttles itself when taken off mains power (presumably to save battery). This caused a severe drop in the number of frames received by NUbugger per second (from ~25-30 down to 3-6).
This needs further investigation: how to disable the throttling, and how long the robot can last at full speed.
The Config System currently stores configurations on disk in a single colossal file.
This means that saving the current configuration to disk takes a long time - no matter how few parameters have actually changed.
A planned extension to the Config System is to allow users to choose entire branches of the tree that should be stored in their own file.
This would allow faster saving (and potentially, loading) of configurations when only a small number of parameters in the tree have changed - which could be useful for e.g. machine learning applications.
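A minimal sketch of the per-branch storage idea, under assumed names and an assumed JSON layout (not the real Config System format): branches the user marks for splitting are written to their own files, and only dirty files would need rewriting on save.

```python
import json
import os
import tempfile

def save_config(tree, split_paths, out_dir):
    """Write each branch named in split_paths to its own JSON file;
    everything else goes into a shared root file.
    (Hypothetical layout for illustration only.)"""
    root = dict(tree)  # shallow copy so we can pop split branches out
    for path in split_paths:
        branch = root.pop(path, None)
        if branch is not None:
            with open(os.path.join(out_dir, path + ".json"), "w") as f:
                json.dump(branch, f)
    with open(os.path.join(out_dir, "root.json"), "w") as f:
        json.dump(root, f)

out_dir = tempfile.mkdtemp()
save_config({"vision": {"min_width": 4}, "walk": {"speed": 2}}, ["walk"], out_dir)
```

With this split, changing only walk-engine parameters (e.g. during machine learning runs) would require rewriting just `walk.json` rather than the colossal single file.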
Fix delay and data corruption issues with motor communication, and give informative errors when something is wrong.
You can test this by going into zombie mode, moving the head, and watching the localisation estimate of the ball move in NUbugger/NUview.
Very important for this to be fixed before RoboCup!
error.log
DarwinCamera::loadCameraOffset(). Unable to load camera offset.
ColourTransitionRule istream operator: UNKOWN_COLOUR match ignored.
NUCameraData::LoadFromConfigFile(). Unable to load camera specifications.
TERMINATION HANDLER: SIGSEGV
./nubotbin(_ZN5NUbot18terminationHandlerEi+0xbf) [0x80cc8af]
[0xb7751400]
./nubotbin(_ZN5Robot17SensorReadManager24FilterLimbSensorFailuresERSt6vectorIiS$
./nubotbin(_ZN5Robot17SensorReadManager31GetFilteredLikelySensorFailuresEPSt6ve$
./nubotbin(_ZN13DarwinSensors30copyFromHardwareCommunicationsEv+0x113) [0x82c13$
./nubotbin(_ZN9NUSensors6updateEv+0x2f) [0x80d107f]
./nubotbin(_ZN15SenseMoveThread3runEv+0x2d) [0x80f62ed]
./nubotbin(_ZN6Thread9runThreadEPv+0xf) [0x81556af]
/lib/i386-linux-gnu/libpthread.so.0(+0x6d4c) [0xb7604d4c]
/lib/i386-linux-gnu/libc.so.6(clone+0x5e) [0xb740adde]
When the code is run on robot 1, the robot often stops responding and a reconnection is required.
This is a test issue.
When the camera is broken, disconnected or otherwise inaccessible we get the following error in Vision.
Program received signal SIGSEGV, Segmentation fault.
[Switching to Thread 0xb23cbb70 (LWP 3108)]
0x081a4f76 in GreenHorizonCH::calculateHorizon() ()
The DarwinCamera class needs to detect these faults more reliably and give useful warning messages, instead of allowing Vision to receive unallocated pointers.
Important to fix, but not urgent for RC2013: most members know the root cause, and it doesn't occur in playable scenarios (if this segfault happens we couldn't have been playing anyway, as we have no camera stream).
Camera needs a new cable.
Fix up our scripts so we can script using IK, to make balancing easier.
The base of the goal is quite jittery. After detecting a goal, vision should examine the base at pixel level to determine a better base position.
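One possible refinement, sketched here with hypothetical names and an illustrative classified-image representation (a rows x cols grid of colour labels), is to rescan a small window of one column around the coarse base estimate and take the lowest row still labelled goal-coloured:

```python
def refine_goal_base(classified, col, approx_row, goal_label, max_search=10):
    """Return the lowest row near approx_row in column col that is still
    labelled goal-coloured. classified[row][col] holds colour labels.
    (Illustrative sketch, not the actual vision pipeline.)"""
    rows = len(classified)
    best = approx_row
    start = max(0, approx_row - max_search)
    stop = min(rows, approx_row + max_search + 1)
    for row in range(start, stop):
        if classified[row][col] == goal_label:
            best = row  # keep descending while the colour matches
    return best
```

Because the refinement works per column, averaging the refined base rows over several columns of the post should also damp frame-to-frame jitter.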
Jo tells me the main Ubuntu speech synthesisers are eSpeak and Festival.
The ambiguousFieldObjects vector in FieldObjects is emptied every frame, so all objects added are in fact visible. However, the isObjectVisible() method on some field objects (such as yellow goal posts) still returns false.
The speaker volume should be set to maximum at startup.
Fix the issue with our cameras not adjusting/balancing properly. Make the image not change!
We need to see more above the horizon for both our visual compass and our goal detection. This needs motion/NUHead.cpp to be updated.
There should be a wiki page for the ConfigSystem that outlines how to use it / how it should be used.
I just noticed the nubots account is actually just a normal user account.
I suggest turning it into an organisation! That way we can have administrative rights, teams, and all the other good things.
It would let us grant (or revoke) specific permissions for individual users, rather than everyone logging in under the nubots account and making changes there.
There is a button that turns an account into an organisation, so it isn't a big deal.
Thoughts?
Could not SSH into the robot.
Connected a monitor via the HDMI port to see what was happening.
Was presented with the GRUB rescue prompt:
GRUB loading.
error: out of partition
grub rescue>
This may only be a personal issue, or an issue for only a few people, but it renders NUview unusable and information on it is terribly difficult to find.
Our robots were really dumb last year. We need at least better lost-robot behaviour and team coordination. This probably depends on vision/localisation.
Currently images are stored in raw binary format, which costs us 307200 bytes per frame, leading to very large logs. We should implement some form of lossless compression when we move to off-robot logging, as per Brendan and Jo's plan.
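As a baseline, even general-purpose lossless compression shrinks raw frame buffers considerably; this sketch uses Python's zlib as a stand-in (an image-aware codec such as PNG, or delta-coding against the previous frame, would likely do better on real camera data):

```python
import zlib

def compress_frame(raw_bytes):
    """Losslessly compress one raw frame buffer with zlib.
    (Stand-in choice; not a committed design decision.)"""
    return zlib.compress(raw_bytes, 6)

def decompress_frame(blob):
    """Recover the exact original bytes."""
    return zlib.decompress(blob)

frame = bytes(307200)  # one raw frame's worth of (here, all-zero) pixel data
blob = compress_frame(frame)
```

The round trip is exact, which is the property that matters for logging frames we later replay through vision.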
The Config System is missing a number of validation checks.
Until these (and probably others) are implemented, it will be possible for users to corrupt the robot's configuration state just by calling the Config System's interface methods.
Missing checks include:
(If additional necessary checks are identified, they should be added to this issue)
The robots tend to keep the ball at the very edge of their view when they're trying to look at it.
(This is thought to be caused by a change in the way that localisation reports the ball position.)
The robot should keep the ball centred horizontally within its view when it is attempting to track it.
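The desired behaviour amounts to driving the horizontal pixel error of the ball toward zero. A minimal proportional-control sketch, with an assumed image width, gain, and sign convention (none of these are the real NUHead values):

```python
def head_pan_correction(ball_x_px, image_width=640, gain=0.002):
    """Proportional pan correction (radians) to re-centre the ball.
    A ball right of centre gives a positive correction toward it.
    Gain and sign convention are illustrative assumptions."""
    error_px = ball_x_px - image_width / 2.0
    return gain * error_px
```

Applied each frame, this drives the ball toward the image centre rather than letting it sit at the edge of the field of view.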
Ensure we can localise properly with two sets of same-coloured goals.
Probable cause is a combination of lens distortion and sensor/image mismatch.
This needs to be fixed for lines to work.
Robot 3 will now run the binary fine; however, there is something wrong with at least one motor in his left leg.
The torque is on, but the leg is at a completely wrong angle (probably ~30-40 degrees out), suggesting that the motor offsets are wrong.
The rest of him seems to function perfectly fine.
Changes have been made to Robot 1's configuration that have improved its performance.
The other robots need to have the newer configuration.
The robots' configuration should be put under source control.
We need to do walk learning on the robots to stop them from falling over.
Being well behind the latest release, I'm sure everyone would love these to be updated to a recent OS, if only so we can use package managers.
Current candidate is Xubuntu. I think Gentoo was also suggested.
If anyone else has a suggestion, please say!
Upgrade strategy:
Motors id#1 (R_SHOULDER_PITCH) and id#11 (R_HIP_PITCH) of robot No. 4 aren't working.
They're not stuck, and I can move them around fine, but they return errors and dxl_monitor fails to connect to them:
[ID:200(SUB_BOARD)] scan
Check ID:1(R_SHOULDER_PITCH) ... FAIL
Check ID:2(L_SHOULDER_PITCH) ... OK
Check ID:3(R_SHOULDER_ROLL) ... OK
Check ID:4(L_SHOULDER_ROLL) ... OK
Check ID:5(R_ELBOW) ... OK
Check ID:6(L_ELBOW) ... OK
Check ID:7(R_HIP_YAW) ... OK
Check ID:8(L_HIP_YAW) ... OK
Check ID:9(R_HIP_ROLL) ... OK
Check ID:10(L_HIP_ROLL) ... OK
Check ID:11(R_HIP_PITCH) ... FAIL
Check ID:12(L_HIP_PITCH) ... OK
Check ID:13(R_KNEE) ... OK
Check ID:14(L_KNEE) ... OK
Check ID:15(R_ANKLE_PITCH) ... OK
Check ID:16(L_ANKLE_PITCH) ... OK
Check ID:17(R_ANKLE_ROLL) ... OK
Check ID:18(L_ANKLE_ROLL) ... OK
Check ID:19(HEAD_PAN) ... OK
Check ID:20(HEAD_TILT) ... OK
Check ID:111(UNKNOWN) ... OK
Check ID:112(UNKNOWN) ... OK
Check ID:200(SUB_BOARD) ... OK
This is not urgent, as the checks being done are valid (i.e. balls inside the quadrilateral found for a goal are correctly removed), but it should be done more cleanly, perhaps with a post-processing step that considers all detected object types collectively rather than finding each one in isolation.