
getting-started's Introduction

This repository is only used for the GitHub issue tracker.

Contributing via University or other programs

The TimVideos project has a dedicated issue tracker containing tasks (both small and large) suitable for contributing as part of organised programs such as Google Summer of Code.

We are very happy to work with students who wish to contribute to the TimVideos project as part of their university course work (and happy to fill out the required paper work).

Smaller Bugs

If you are looking for smaller bugs to get started with, try the global TimVideos issue tracker and look for bugs marked with the Good First Bug label.

Using the Issue Tracker

Using Labels to Filter

Issue Filter

Issue Information

Issue Block / Issue Page

Understanding Labels

All tasks in the tracker should have labels assigned to them. The labels have specific meanings which are listed here;

Difficulty Label

The difficulty label indicates how much knowledge or ability is needed in the skills indicated by the attached skill labels.

Easy tasks are suitable for people who might not have the skill(s) / language(s) listed but are willing to learn them. Easy tasks should be comprehensively specified.

Medium tasks are suitable for people who already have some experience with the skill(s) / language(s) and are reasonably confident with them. Medium tasks might require some research or independent development.

Challenging tasks are suitable for people who have extensive experience with the skill(s) / language(s) and are confident in their ability with them. These tasks might also be underspecified and require a lot of independent research.

While tasks at any level can be completed by anyone, the levels generally map to the following experience levels;

  • Easy tasks are ideally suited to students who are currently starting out at University or wishing to learn new skills.
  • Medium tasks are better suited to students who are at the end of their undergraduate degrees.
  • Challenging tasks are better suited to people who are doing postgraduate work.

More Information

More information about the TimVideos project and contributing can be found on the TimVideos developer website at http://code.timvideos.us

getting-started's People

Contributors

froggiebecky, graphshark, iiie, luke-john, mithro, parx


getting-started's Issues

[HDMI2USB] Get Milkymist "Video DJing" functionality on the Numato Opsis board

Get Milkymist "Video DJing" functionality on the Numato Opsis board

Brief explanation

The Milkymist board provided a bunch of cool Video DJing output options (see following pictures), we want the same type of thing working on the Numato Opsis board.

image

Expected results

The Numato Opsis when combined with a Milkymist Expansion board is able to reproduce all functionality of the original Milkymist device.

Detailed Explanation

Very detailed explanation of the options around implementation can be found in the following Google Doc;
https://docs.google.com/document/d/1YyhCqTaUrI_vQjNGHJ34TsJCoDwpMzd2XcavAEllrUg/edit#

Further reading

Knowledge Prerequisites

  • C coding experience
  • Verilog experience very useful

Contacts

[Veyepar] Use avahi / zeroconf to find our hosts on the network

Brief explanation

We'd like to worry about fewer things; who wouldn't?
If we're setting up a room with a couple of computers to record a talk, we shouldn't have to configure the network; it should "just work".
Avahi makes this easy; hosts that are farther away simply take longer to respond.
Make a setup program that updates the configs for either DVSwitch or GSTSwitch.
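
As a rough sketch of the idea, the python-zeroconf library can browse for services advertised on the local network. The service type "_dvswitch._tcp.local." below is an assumption made up for illustration; the real hosts would need to advertise such a service first.

    import time
    from zeroconf import Zeroconf, ServiceBrowser

    class Listener:
        def add_service(self, zc, type_, name):
            # A host appeared; a setup program could now rewrite the
            # DVSwitch / GSTSwitch configs to point at it.
            info = zc.get_service_info(type_, name)
            if info:
                print("found", name, info.server, info.port)

        def remove_service(self, zc, type_, name):
            print("lost", name)

        def update_service(self, zc, type_, name):
            pass

    zc = Zeroconf()
    ServiceBrowser(zc, "_dvswitch._tcp.local.", Listener())
    time.sleep(5)   # browse for a few seconds
    zc.close()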

Expected results

Detailed Explanation

A much longer description of what you want to do.

Further reading

Knowledge Prerequisites

Contacts

LKV373 HDMI framegrabber

There's an HDMI extender device, Lenkeng LKV373, which captures HDMI, transcodes the video into MJPEG (much like HDMI2USB) and outputs the stream over multicast UDP.

They're designed to run as a pair of sender + receiver, but someone has reverse engineered the wire protocol, and written some notes on reverse engineering the units:

http://danman.eu/blog/?p=110

In the comments there seem to be notes about a TTL serial interface which allows control of encoder parameters. It streams 1080p video at about 18fps -- more than adequate for capturing slides.

In a real-world setup, you would still use the receiver hardware in order to output the video to a projector. Since the devices output to Ethernet, there would be no need for the "projector laptop" required in a typical capture scenario or with HDMI2USB. With some gstreamer trickery, this could appear as a source.
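
As a starting point, the multicast stream can be sanity-checked with a few lines of Python before any gstreamer work. The multicast group and port below are assumptions for illustration; the actual values are documented in the reverse engineering notes linked above.

    import socket
    import struct

    GROUP, PORT = "226.2.2.2", 2068   # assumed multicast group / port

    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM, socket.IPPROTO_UDP)
    sock.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    sock.bind(("", PORT))
    # Join the multicast group on all interfaces.
    mreq = struct.pack("4sl", socket.inet_aton(GROUP), socket.INADDR_ANY)
    sock.setsockopt(socket.IPPROTO_IP, socket.IP_ADD_MEMBERSHIP, mreq)

    for _ in range(10):
        data, addr = sock.recvfrom(2048)
        print(len(data), "bytes from", addr)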

I can see these on places like AliExpress for about 80 USD in single quantities (for both sender + receiver). I haven't ordered one yet but it looks promising.

[gst-switch] Track the speaker! Zoom! Pan! Tilt!

Brief explanation

Create and test a system for keeping a camera focused on the action.

Detailed Explanation

Make a tripod attachment that controls pan and tilt.

Track where the action is in a video stream, follow the action. But don't make us seasick!

Make a controller for zoom.

Contacts

  • Potential Mentors: TimVideos/Veyepar team

[streaming-system #56] Dockerize flumotion and support components

More technical details at Streaming System Issue #56

Brief explanation

The streaming system uses flumotion for streaming; we want to put it into a Docker container to make distribution and bring-up easy.

This project is about making distribution and bring-up easy, not about sandboxing / isolation.

Expected results

  • Docker image for each type of flumotion system.

Detailed Explanation

Currently we have one flumotion instance per stream. This currently uses one machine per stream, as managing the configuration of multiple streams on one machine is too hard. If we had a Docker image we could start up multiple Docker containers on a single machine while still having a simple configuration setup.
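
As a sketch of what that could look like, the Docker SDK for Python can start one container per stream from a shared base image. The image name and config paths below are assumptions for illustration only.

    import docker

    client = docker.from_env()
    for stream in ["room1", "room2", "room3"]:
        client.containers.run(
            "timvideos/flumotion-collector",   # assumed image name
            name="flumotion-" + stream,
            # Mount a per-stream config directory into the container.
            volumes={"/etc/flumotion/" + stream: {"bind": "/etc/flumotion", "mode": "ro"}},
            detach=True,
        )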

Docker


Docker is an open-source project to easily create lightweight, portable, self-sufficient
containers from any application. The same container that a developer builds and
tests on a laptop can run at scale, in production, on VMs, bare metal, OpenStack
clusters, public clouds and more.


Portable deployment across machines.
Docker defines a format for bundling an application and all its dependencies into a single
object which can be transferred to any Docker-enabled machine, and executed there with
the guarantee that the execution environment exposed to the application will be the same.


Application-centric.
Docker is optimized for the deployment of applications, as opposed to machines.
This is reflected in its API, user interface, design philosophy and documentation.

http://docs.docker.io/en/latest/faq/#what-does-docker-add-to-just-plain-lxc

Flumotion in the TimVideos streaming system

There are 3 types of flumotion configurations; they should share a base docker image but each should also be provided as a separate, correctly configured docker image.

  • Collector - runs on site at a conference and does light encode to compress data so you can send it over the internet to an encoder.
    • Requires dvswitch support.
  • Encoder - runs in the cloud and converts video from the light encode format to something suitable for people to view in their browser.
  • Repeater/Amplifier - runs in the cloud and streams a copy of the encoder output. Allows the system to scale past the limits of a single machine.

There are a couple of support applications we use with flumotion; these should be installed inside the docker image and started as part of the container.

  • watchdog - Restarts failed components and does a full system restart when things go totally wrong.
  • register - Registers with the tracker website (so the stream can be found) and sends statistics and logs for reporting and load balancing support.

Further reading

Knowledge Prerequisites

  • Python coding skills.
  • Some system administration experience very useful.

Contacts

[LiteX] Add support for j2 (j-core) to LiteX

Brief explanation

LiteX currently supports the lm32, mor1k & risc-v architectures. It would be nice if it also supported the J2 open processor.

Expected results

LiteX is able to use the J2 core as an option when building SoC components.

Detailed Explanation

J2 open processor

This page describes the j-core processor, a clean-room open source processor and SOC design using the SuperH instruction set, implemented in VHDL and available royalty and patent free under a BSD license.

The current j-core generation, j2, is compatible with the sh2 instruction set, plus two backported sh3 barrel shift instructions (SHAD and SHLD) and a new cmpxchg (mnemonic CAS.L Rm, Rn, @r0 opcode 0010-nnnn-mmmm-0011) based on the IBM 360 instruction. Because it uses an existing instruction set, Linux and gcc and such require only minor tweaking to support this processor.

Further reading

[HDMI2USB] Port Dashio USB3.0 core to use FPGA high speed transceivers

More technical details at;

Brief explanation

The aim of this project is to make it possible for the "Dashio USB3.0 Core", which currently requires an external USB 3.0 transceiver such as the TUSB1310A, to instead use the built-in high speed transceivers found in newer Cyclone V parts and most Arria / Stratix parts. This will be done by developing a ULPI and PIPE compatible core using the high speed transceivers.

Detailed Explanation

See the following Google Doc for more information;
https://docs.google.com/document/d/1wbfhdiOPFJV1wKde0blN5f62rWpPOamCfmz2aHgz3l4/pub#

Further reading

Contacts

[LiteX] Create a generic debug interface for soft-CPU cores and connect to GDB

[LiteX] Create a generic debug interface for soft-CPU cores and connect to GDB

Brief explanation

While doing software development, being able to use gdb to find out what the CPU is doing is really useful. We would like that for LiteX's multiple soft-CPU implementations, running both on real hardware and in simulation.

Expected results

GDB is able to control the soft-CPU implementations both on real hardware and in simulation.

Detailed Explanation

The student will need to implement the parts shown in blue in the following diagram;

Block Diagram

Further reading

Knowledge Prerequisites

  • List of
  • what knowledge is required
  • to complete this project.

Contacts

[VideoBrick] capture HDMI video and audio at multiple resolutions

[VideoBrick] capture HDMI video and audio at multiple resolutions

More technical details at Link to bug in the

Brief explanation

VideoBrick is an ongoing OSHW project aiming to provide an HDMI input interface for a small SBC called the Olimex LIME/LIME2 (which is an OSHW project itself). It is designed as a daughter board (shield) containing an HDMI receiver IC (Analog Devices ADV7611). It converts the HDMI signal to a parallel signal, which can be captured via the CSI interface of the SoC. Connecting an HDMI receiver to a CSI interface is a bit unusual, but tests on the alpha prototype have shown that this is a feasible approach. The desired outcome of this project is an OSHW, low-cost, standalone, single-channel HDMI capture device running a complete Linux stack.

Expected results

  • stable HDMI-Capture (video and audio) using the V4L2 API
  • working Linux V4L2 driver module supporting multiple input resolutions

Detailed Explanation

VideoBrick hardware prototypes (Rev. Alpha4) have been produced and software development started in 2014 (but paused in early 2015). We have prototype hardware and proof-of-concept code that verifies our assumption that the chosen HDMI receiver IC with parallel output can be connected to the parallel CSI interface. The software is currently not stable, doesn't support multiple resolutions, lacks audio support and, because of its sketchy nature, needs a complete rewrite. Nevertheless, the electrical signal chain for video input has been tested. Work on the audio signal chain has not started yet and troubleshooting at the hardware level can be expected.

The task of getting video/audio capture can be subdivided:

  • prototype and/or troubleshooting audio input support
  • designing or borrowing a test environment for multiple HDMI input resolutions/formats
  • implementing/modifying a Linux V4L2 kernel module (support for the ADV7611 and the Allwinner CSI interface is already available); a quick userspace sanity-check sketch follows this list
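
Once a driver exposes the device as a V4L2 node, a quick userspace check (not the kernel-module work itself) could look like the sketch below. OpenCV and the /dev/video0 device node are assumptions for illustration.

    import cv2

    # Open the capture device through the V4L2 backend and request one of
    # the target resolutions.
    cap = cv2.VideoCapture("/dev/video0", cv2.CAP_V4L2)
    cap.set(cv2.CAP_PROP_FRAME_WIDTH, 1280)
    cap.set(cv2.CAP_PROP_FRAME_HEIGHT, 720)
    ok, frame = cap.read()
    print("captured:", ok, None if frame is None else frame.shape)
    cap.release()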

Further reading

Knowledge Prerequisites

  • Linux, C, Shell
  • Understanding of V4L2, HDMI (EDID), I2S
  • Electrical Engineering Troubleshooting
  • Linux Kernel Driver programming

Contacts

[streaming-system] Streaming Web Site?

Hi, so I was looking at your GSoC ideas page and was interested in the Streaming Web Site portion of the flow chart, but I couldn't really find much about it, so I was wondering if I could get a description of the site.

[gstreamer] Create a gstreamer plugin for Lenkeng HDMI over IP extender

Brief explanation

Create a gstreamer plugin for sending and receiving from Lenkeng LKV373 (and compatible) devices.

You'll need to get a Lenkeng LKV373 as part of this project!

Expected results

  • Two gstreamer plugins;
    • lkv373-sink -- Sends data from gstreamer to the output part of the device.
    • lkv373-src -- Receives data from the device and sends it into gstreamer.
  • Documentation on using the two gstreamer plugins
  • Technical documentation of LKV373 protocol

Detailed Explanation

FIXME: Add more information here.

You may need to get out a soldering iron so you can access the serial port on the device.


There's an HDMI extender device, Lenkeng LKV373, which captures HDMI, transcodes the video into MJPEG and outputs the stream over multicast UDP.

They're designed to run as a pair of sender + receiver, but someone has reverse engineered the wire protocol, and written some notes on reverse engineering the units:

http://danman.eu/blog/?p=110

Further reading

Knowledge Prerequisites

  • Strong coding skills. (C experience preferred!)
  • (Good to have) Some multimedia coding experience.
  • (Good to have) gstreamer coding experience.
  • (Good to have) JPEG / MJPEG understanding.
  • (Good to have) Wireshark / Reverse engineering experience.

Contacts

[HDMI2USB] Create a serial port extension board - Support both RS232 and RS485 modes.

Brief explanation

Create a serial port extension board - Support both RS232 and RS485 modes

Should also have lots of blinking lights.

Firmware should be modified so that the ports appear as USB-CDC ports to the computer.

Detailed Explanation

This can be achieved using the SP331 transceiver.

The SP331 is a programmable RS-232 and/or RS-485 transceiver IC. The SP331 contains four drivers and four receivers when selected in RS-232 mode; and two drivers and two receivers when selected in RS-485 mode. The SP331 also contains a dual mode which has two RS-232 drivers/receivers plus one differential RS-485 driver/receiver.

The RS-232 transceivers can typically operate at 230kbps while adhering to the RS-232 specifications. The RS-485 transceivers can operate up to 10Mbps while adhering to the RS-485 specifications. The SP331 includes a self-test loopback mode where the driver outputs are internally configured to the receiver inputs. This allows for easy diagnostic serial port testing without using an external loopback plug. The RS-232 and RS-485 drivers can be disabled (High-Z output) by controlling a set of four select pins.

Further reading

Contacts

Hdmi2UsbAPI

[{{reference.repo}} #{{reference.number}}] {{title}}

More technical details at Link to bug in the

Brief explanation

A short description of what you want to do.

Expected results

What the expected outcome from the project should be.

Detailed Explanation

A much longer description of what you want to do.

Further reading

Knowledge Prerequisites

  • List of
  • what knowledge is required
  • to complete this project.

Contacts

[LiteX] Create a litescope based "Integrated Bit Error Ratio Tester" (iBERT) clone

[LiteX] Create a litescope based "Integrated Bit Error Ratio Tester" (iBERT) clone

Brief explanation

Xilinx has a LogiCORE IP called IBERT for testing error rates on high speed channels. The task is to create a similar tool based on LiteX and LiteScope.

Expected results

Gateware can be generated for a given board with high speed transceivers and a GUI tool on the computer can be used to examine error rates over the transceivers using different settings.

Detailed Explanation

This project has three parts;

a) Data sequence generators + checkers. These generate a given bit data stream and then, after transmission and reception, check that the received bit data stream is correct.

b) Data channel wrappers. These give you a common interface for controlling the parameters of a channel used for transmission and receiving. For simple data channels this might just provide clock control. For more advanced channels, like the high speed transceivers, this provides control over parameters such as pre-emphasis, equalisation, etc.

c) Host computer control GUI / console. This gives a nice interface for controlling all the parameters and seeing the results of various tests, equivalent to the Xilinx IBERT console.
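
As an illustration of part (a), a PRBS-7 (x^7 + x^6 + 1) sequence generator is only a few lines of Migen; the checker would run the same LFSR and compare against the received bits. This is only a rough sketch, not the expected final implementation.

    from migen import Module, Signal, Cat

    class PRBS7Generator(Module):
        def __init__(self):
            self.o = Signal()                    # serial PRBS output
            state = Signal(7, reset=0b1111111)   # LFSR state, must not start at 0
            self.comb += self.o.eq(state[6])
            # Shift every cycle, feeding back the XOR of taps 7 and 6.
            self.sync += state.eq(Cat(state[6] ^ state[5], state[:6]))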

The student is expected to create all three parts of this tool, reusing litescope and litex for the FPGA<->Host communication and development. The work can be seen in the following diagram; the parts in blue need to be developed by the student.

LiteX iBERT Diagram

Further reading

Knowledge Prerequisites

  • This project requires access to FPGA hardware with high speed transceivers.

Contacts

[HDMI2USB] Convert the JPEG encoder from VHDL to Migen/MiSoC

Brief explanation

The HDMI2USB firmware currently has a JPEG encoder written in VHDL. We would prefer it be written in Migen / MiSoC to allow better integration into the firmware.

Expected results

The HDMI2USB MiSoC firmware correctly generates JPEG output using the new Migen / MiSoC JPEG encoder.

The new Migen / MiSoC JPEG encoder has a strong test suite.

Detailed Explanation

The current JPEG encoder is based on mkjpeg from opencores but has been slightly modified by @enjoy-digital and @ajitmathew. The current encoder has the following problems;

  • It is written in VHDL, which means;
    • It isn't easy to simulate using FOSS tools,
    • An adapter between it and migen/misoc is needed,
  • It has been the cause of numerous bugs.
  • It doesn't have a strong test suite making modification hard.

Using Migen / MiSoC would allow much better testing.

As there is an existing implementation of the encoder, the conversion process is simpler. cfelton's JPEG encoder example might be a useful source too.

Further reading

Contacts

[LiteX] Finish support for RISC-V in LiteX

Brief explanation

LiteX currently supports the lm32 and mor1k architectures and has some preliminary support for RISC-V. It would be nice if it supported the RISC-V architecture fully.

Expected results

MiSoC is able to use a CPU core which supports the RISC-V architecture as an option when building SoC components.

Detailed Explanation

RISC-V

RISC-V (pronounced "risk-five") is a new instruction set architecture (ISA) that was originally designed to support computer architecture research and education and is now set to become a standard open architecture for industry implementations under the governance of the RISC-V Foundation. RISC-V was originally developed in the Computer Science Division of the EECS Department at the University of California, Berkeley.

Process

Steps would be;

  • Figure out the best RISC-V compatible core to use.
  • Figure out the toolchain needed.
  • Import into MiSoC.

Some potential options are;

Further reading

[streaming-system] Create a mobile theme for the website

Brief explanation

Make the website responsive so it looks nice on mobile devices.

Detailed Explanation

The website should have 2 "cards", one with the video, one with the chat. Along the bottom should be a marquee which scrolls twitter information. When going to the chat card the video should keep playing and the audio can still be heard.

Should support both iPod/iPad and Android ecosystems.

Contacts

  • Potential Mentors: TimVideos Streaming System Hackers team

[HDMI2USB] Allow HDMI2USB devices to act as HDMI extenders via the Gigabit Ethernet port

More technical details at HDMI2USB-misoc-firmware#133: Allow HDMI2USB devices to act as HDMI extenders via the Gigabit Ethernet port

Brief explanation

Allow two HDMI2USB devices which are connected by Gigabit Ethernet to stream video from one device to the other.

Detailed Explanation

It should be possible to stream the pixel data from one board to another board via Ethernet.

Gigabit Ethernet should be fast enough to transfer 720p60 video as raw pixel data (when using YUV encoding).
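
A back-of-envelope check of that claim, assuming 16 bits per pixel (YUV 4:2:2) and ignoring blanking and framing overhead:

    width, height, fps, bits_per_pixel = 1280, 720, 60, 16
    print(width * height * fps * bits_per_pixel / 1e6, "Mbit/s")   # ~884.7, under 1000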

Both the Numato Opsis and Digilent Atlys boards have Gigabit Ethernet interfaces. The HDMI2USB-misoc-firmware already supports both these interfaces using LiteEth.

As the two devices will be directly connected, you can use raw Ethernet frames rather than UDP/IP, though it might be easier for debugging to support UDP/IP.

Further reading

Contacts

[HDMI2USB] Create a 3G-SDI extension board for HDMI2USB

More technical details at

Brief explanation

Add support for the 3G Serial Digital Interface (3G-SDI) to the HDMI2USB.

This project will let us push and pull lots of data over a wider range of protocols / cables / system.

Detailed Explanation

The SDI interface is used by a large number of high end devices. It would be good for the HDMI2USB firmware to support these standards and protocols.

This will probably require development of a TOFE expansion board which has SDI connectors, or some other way to adapt the DisplayPort connector to SDI.

Further reading

Contacts

[streaming-system #4] Add bug / feedback reporting from end users system

More technical details at Streaming System #4

Brief Explanation

Create a website/webpage which records bug / feedback information from an end user. You can't assume that the users know anything about bug reporting, their own system or the TimVideos streaming system.

Expected results

The tool will collect details about the conditions and outcomes of any bugs through its user interface.

Detailed Explanation

A user may face issues in watching streams; some possibilities include;

  • Low internet bandwidth
  • Outdated browser or operating system
  • Problems at TimVideos server end

Create a website/webpage which records information as requested below;

If you have other problems like the sound being quiet or the display being wrong in anyway, or if you can't get the stream to work, please send me an email with the following details;
* Browser Type (Chrome, Safari, IE, etc)
* Exact Browser Version (a screenshot of the about page would be awesome!)
* Operating System - as much detail as possible (e.g. Ubuntu Lucid 64bit)
* The speed and type of Internet connection you have (e.g. ADSL, Cable, ADSL2 and 256k, 6M). You can find this information in your ADSL modem.
* Please go to http://www.speedtest.net/ and send me the numbers the widget reports.
* Anything else you think is important.

Take inspiration from tools like;

Further reading

Related issues;

Knowledge Prerequisites

  • Javascript
  • HTML

Contacts

  • Potential Mentors: TimVideos Streaming System Hackers team

[SymbiFlow] Add support for Spartan 6 parts

[SymbiFlow #10] Add support for Spartan 6 parts

More technical details at SymbiFlow Idea #10

Brief explanation

Spartan 6 is a hugely popular part which would be awesome to have support for in the open source FPGA toolchain, SymbiFlow.

Expected results

  • Documentation of the bitstream for Spartan 6 parts.
  • Support for using Spartan 6 in the SymbiFlow.

Detailed Explanation

SymbiFlow will be a FOSS Verilog-to-Bitstream FPGA synthesis flow for Xilinx 7-Series FPGAs and iCE40. SymbiFlow currently only supports Xilinx Series 7 parts and the Lattice iCE40 parts.

While the Spartan 6 has mostly been superseded by the Artix 7 and Spartan 7, there are still a huge number of boards out there with Spartan 6 parts. Due to its huge popularity it will be a long time until the part is no longer in use (people still start new designs with the Spartan 3!).

To make it even more important, for Spartan 6 designs you still have to use ISE, which is significantly worse than Vivado in many ways. Having a non-ISE toolchain for the part would be awesome.

Spartan 6 is used heavily by a number of open source projects;

Further reading

Knowledge Prerequisites

  • List of
  • what knowledge is required
  • to complete this project.

[flumotion] Productionize Flumotion ported to Gstreamer-1.0 API

Brief explanation

Flumotion is an essential part of the TimVideos' streaming-system. It was ported to the newer GStreamer 1.x API in a previous GSoC project. Some work needs to be done to update it and make it ready for use in production.

More technical details at timvideos/flumotion#6

Expected results

Updated Flumotion works in streaming-system on Ubuntu 14.04 Trusty and is reliable.

Detailed Explanation

Flumotion is used in streaming-system to encode and stream the media.

The following tasks need to be finished in order of priority:

  • Merge porting work from previous year - pull request
  • Port any remaining components
  • Fix Gstreamer warnings
  • Add a web UI for admin
  • Write unit/integration tests
  • Package for Debian

Further reading

Knowledge Prerequisites

  • Python
  • Twisted and Gstreamer experience useful
  • Debian packaging experience useful

Contacts

[HDMI2USB] Port Linux to the lm32 CPU and support HDMI2USB firmware functionality

Brief explanation

The HDMI2USB gateware currently includes an lm32 soft core. The misoc version has an MMU and should support running a full Linux kernel.

Expected results

Linux booting on the HDMI2USB gateware.

Detailed Explanation

There is a bunch of extra information in the LiteX Linux Support Random Notes Google Doc.

The HDMI2USB-misoc-firmware embeds a LM32 soft-core for controlling and configuring the hardware. See the diagram below;

MiSoC firmware structure

This soft-core should be able to run the Linux kernel, which means we would get access to a lot of good things;

  • Access to well tested TCP/IP stack (useful for IP based streaming).
  • Access to well tested USB stack (useful for the USB-OTG connector).
  • Access to kernel mode setting for EDID processing and DisplayPort support.

Work on this was started by the MilkyMist / M-Labs people.

There is a port of the lm32 to qemu which will help, see timvideos/HDMI2USB-litex-firmware#86

Further reading

Knowledge Prerequisites

  • Linux Kernel knowledge
  • Strong C knowledge.

[LiteX] QEmu simulation of a LiteX generated SoC

[LiteX] QEmu simulation of a LiteX generated SoC

More technical details at Issue #86: Get lm32 firmware running under qemu to enable testing without hardware on HDMI2USB-misoc-firmware repo

Brief explanation

Currently, to do firmware development you need access to FPGA hardware. Using QEmu it should be possible to let developers develop and improve the firmware without hardware.

Expected results

People are able to write new features for the HDMI2USB firmware and develop MicroPython for FPGAs using QEmu.

Detailed Explanation

LiteX QEmu Simulation support Random Notes Document

Further reading

Knowledge Prerequisites

  • List of
  • what knowledge is required
  • to complete this project.

Contacts

[LiteX] Add support to pep8 / pylint for the "MiSoC / LiteX" formatting style & new specific checks

Brief explanation

Migen / LiteX doesn't quite follow pep8 / pylint, for good reasons. Extend pep8 and/or pylint to understand when they should allow violations. Additional checks for good Migen / LiteX style should also be added.

Expected results

Running pep8-migen on Migen / LiteX HDL code does the following;

  • Checks that all Python code that isn't Migen / LiteX complies with the pep8 standard
  • Checks that Migen / LiteX code complies with the standard formatting style for Migen / LiteX
  • Checks for common Migen / LiteX code smells.

Detailed Explanation

Due to the way the MiSoC / LiteX code works, this is the preferred way to format the following statement;

    self.sync += [
        If(signal == 10,
            output1.eq(1),
        ).Elif(signal == 5,
            output2.eq(0),
        ).Else(
            output3.eq(1),
        ),
    ]

See the migen documentation for more examples.

When the code is formatted this way it will fail pep8 with the following output;

$ pep8 misoc.py
misoc.py:6:9: E124 closing bracket does not match visual indentation
misoc.py:7:13: E128 continuation line under-indented for visual indent
misoc.py:8:9: E124 closing bracket does not match visual indentation

We need a pep8 / pylint extension which understands MiSoC / LiteX HDL formatting.

There are also a number of LiteX / Migen "code smells" that pep8 / pylint should also detect. Some are listed in the following section.
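
One possible home for such checks (flake8 is not mentioned above, so this is purely an illustration) is an AST-based flake8 plugin; the plugin name and the "MGN001" check code are made up.

    import ast

    class MigenStyleChecker:
        # Registration via a flake8 entry point is omitted here.
        name = "flake8-migen"    # assumed plugin name
        version = "0.0.1"

        def __init__(self, tree):
            self.tree = tree

        def run(self):
            for node in ast.walk(self.tree):
                # Example check: flag bare "except:" clauses, one of the
                # pep8 code smells listed below that definitely applies.
                if isinstance(node, ast.ExceptHandler) and node.type is None:
                    yield (node.lineno, node.col_offset,
                           "MGN001 bare except clause", type(self))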

Migen / LiteX code smells

pep8 code smells which definitely apply

  • Using a bare try/except: clause
  • Pretty much most things
  • FIXME: Put more things here

pep8 code smells which do not apply

  • Wrapping of statements in pep8 style inside comb / sync blocks.
  • FIXME: Put more things here.

Incorrect wrapping of final brace

The closing bracket of a list / dictionary should be aligned with the line that opened it, not with the elements;


No;

        self.comb += [
            counter0.eq(counter[0]),
            counter1.eq(counter[1]),
            counter2.eq(counter[2]),
            counter3.eq(counter[3]),
            ]

Yes;

        self.comb += [
            counter0.eq(counter[0]),
            counter1.eq(counter[1]),
            counter2.eq(counter[2]),
            counter3.eq(counter[3]),
        ]

Missing Comma on final element in list / dictionary

All lists / dictionaries should have commas on the last element;


No;

        self.comb += [
            counter0.eq(counter[0]),
            counter1.eq(counter[1]),
            counter2.eq(counter[2]),
            counter3.eq(counter[3])
        ]

Yes;

        self.comb += [
            counter0.eq(counter[0]),
            counter1.eq(counter[1]),
            counter2.eq(counter[2]),
            counter3.eq(counter[3]),
        ]

No;

        analyzer_signals = {
            0 : vcd_group,
            1 : sigrok_group
        }

Yes;

        analyzer_signals = {
            0 : vcd_group,
            1 : sigrok_group,
        }

Using line continuation rather than list

FIXME: Check this one makes sense...

No;

        self.sync.clk200 += \
            If(reset_counter != 0,
                reset_counter.eq(reset_counter - 1)
            ).Else(
                ic_reset.eq(0)
            )

Yes;

        self.sync.clk200 += [
            If(reset_counter != 0,
                reset_counter.eq(reset_counter - 1)
            ).Else(
                ic_reset.eq(0)
            ),
        ]

Using the wrong docstring style

Migen / LiteX use the "Numpy DocString style" for docstring comments.

See examples;

Code should not use the "Google DocString Style" or the "PEP 287 DocString Style".

This is supported by the napoleon module in Sphinx.

Not using yield from in test benches

https://m-labs.hk/migen/manual/simulation.html#pitfalls

When calling other testbenches, it is important to not forget the yield from. If it is omitted, the call would silently do nothing.
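
A minimal illustration of the pitfall (signal and generator names are made up):

    def sub_test(dut):
        yield dut.ce.eq(1)
        yield                      # advance one cycle

    def main_test(dut):
        sub_test(dut)              # wrong: just creates a generator, silently does nothing
        yield from sub_test(dut)   # right: actually runs the sub-testbench
        yield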

Further reading

Knowledge Prerequisites

  • List of
  • what knowledge is required
  • to complete this project.

Contacts

[HDMI2USB] firmware auto-builder

Brief explanation

On every commit, fire up an EC2 instance and build the HDMI2USB firmware. (This will require work on the HDMI2USB build system too.) It should auto-commit the firmware to the HDMI2USB prebuilt repository.

Expected results

Detailed Explanation

The HDMI2USB build system will need to be verified to produce the firmware in a reliable / automatic manner.

Some type of webhook is needed.

Every GitHub repository has the option to communicate with a web server whenever
the repository is pushed to. These "webhooks" can be used to update an external
issue tracker, trigger CI builds, update a backup mirror, or even deploy to your
production server.
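
A rough sketch of such a receiver using Flask (Flask, the endpoint path and the build script name are assumptions for illustration; the real system might instead use an existing CI service):

    import subprocess
    from flask import Flask, request

    app = Flask(__name__)

    @app.route("/webhook", methods=["POST"])
    def on_push():
        if request.headers.get("X-GitHub-Event", "") == "push":
            payload = request.get_json(force=True)
            # Kick off a build of the pushed commit; in practice this would
            # fire up an EC2 instance and run the HDMI2USB build system there.
            subprocess.Popen(["./build-firmware.sh", payload["after"]])
        return "", 204

    if __name__ == "__main__":
        app.run(port=8080)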

Further reading

Knowledge Prerequisites

  • Make / build systems experience
  • (Optional) VHDL / Verilog experience
  • (Optional) Xilinx ISE experience

Contacts

[flumotion] Port Flumotion to the gstreamer-1.0 API

Brief explanation

TimVideos.us makes heavy use of flumotion for the backend of the streaming system. Flumotion currently uses the gstreamer0.10 API, which is now end-of-life. We need to port it to the gstreamer-1.0 API.

Expected results

  • Flumotion which runs on gstreamer-1.0
  • Comprehensive flumotion test suite.

Detailed Explanation

Flumotion is an award winning streaming software created in 2006 by a group of open source developers and multimedia experts. Flumotion Streaming Software allows broadcasters and companies to stream content live and on demand in all the leading formats from a single server.

Flumotion is written in Python and uses GStreamer and Twisted.

GStreamer has recently moved to 1.0 (see the following announcement); we need to port to the new API.

GStreamer 0.10 no longer maintained

2013-03-15 10:00
Since this has never been announced officially outside of the conference circuit, the GStreamer team would like to clarify that GStreamer 0.10 is no longer maintained. Not by us, nor by anyone else. There will be no more GStreamer 0.10 releases, not even bug-fix releases. Bugs will not be investigated unless they also apply to GStreamer 1.x. Patches will only be reviewed and pushed if they apply to one of the current 1.x branches or fix an issue that still pertains to 1.x. Bug fixes have not been backported systematically to 0.10 for a very long time now, and there are many hundreds of bugs that have only been fixed in 1.x, and many more are fixed every week. Any commits to the 0.10 branch are made purely on an ad-hoc basis now, and we may lock down the branch completely. No more features will be added to 0.10. However, GStreamer 0.10 support is of course still available on a commercial basis from the usual suspects.

We realise and regret that this may cause some inconvenience, and we also understand that there are circumstances where there are valid reasons not to migrate to 1.0 at this time, or in some cases ever, but we would also like to dispel any doubt whatsoever that there is any other way. We have neither the resources, nor the will or the energy to maintain more than one major version. We must use the little resources we have on code that has a future.

We are happy to provide help and advice to anyone migrating to 1.0. If there are plugins that haven't been ported yet that you need, let us know, maybe we can help get them ported.

Thanks for your understanding.

Happy hacking, see you on the other side!

-> Your GStreamer maintainers and release managers

Further reading

Knowledge Prerequisites

  • Strong python coding ability.
  • Twisted experience very useful.
  • Some gstreamer experience useful.

Contacts

[streaming-system] TimVideos.us website (viewing interface) improvements

Some more details at Streaming system Issue #42.

Brief explanation

Make the timvideos.us website (the viewing interface for the streams) dynamically generated from a database rather than the config file. This will also include improving the frontend website.

Expected results

Detailed Explanation

This will also include improving the frontend website to support things like the following (a rough model sketch appears after this list);

  • Proper theming of each channel. (Themes separate from channels are needed so you can define a "Linux.conf.au 2015 theme" which is then used by multiple channels.)
  • Adding conference level pages (currently a version of the front page which groups together channels for each conference and supports theming for that conference).
  • Reworking the front timvideos.us page.
  • Accounts for control over each channel (admin level only, no users level accounts).
  • Admin interface for configuring channels.
  • Web-based control of schedule downloads.
  • Support for proper backing up of the website.
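
A rough sketch of the kind of Django models this might need (the model and field names are assumptions based on the feature list above, not an agreed design):

    from django.db import models

    class Theme(models.Model):
        name = models.CharField(max_length=100)    # e.g. "Linux.conf.au 2015 theme"

    class Conference(models.Model):
        name = models.CharField(max_length=100)
        theme = models.ForeignKey(Theme, on_delete=models.SET_NULL, null=True)

    class Channel(models.Model):
        slug = models.SlugField(unique=True)
        conference = models.ForeignKey(Conference, on_delete=models.CASCADE)
        # A channel can override its conference's theme.
        theme = models.ForeignKey(Theme, on_delete=models.SET_NULL, null=True, blank=True)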

Further reading

Knowledge Prerequisites

  • Django and Python web application development.
  • Graphic design experience very useful.

Contacts

[flumotion] Create a RTMP flumotion component

Brief explanation

Create a new flumotion component which lets you stream to existing RTMP services (such as YouTube or JustinTV).

Should support the following targets:

Other helpful links:

Contacts

  • Potential Mentors: TimVideos Streaming System Hackers team

[HDMI2USB] Add USB 3.0 support

More technical details at;

Brief explanation

Add USB3.0 support to the HDMI2USB-misoc-firmware to allow high speed capture.

Expected results

Able to capture video from an HDMI2USB device at USB3.0 speeds.

Detailed Explanation

The USB2.0 interface on current HDMI2USB devices is only capable of capturing at 720p30. USB3.0 has significantly more bandwidth, which would allow much higher capture resolutions.

Adding USB3.0 support would be a multiple step process;

Further reading

Contacts

[LiteX] Improve support for project IceStorm (Lattice ICE40 + OpenFPGA toolchain)

[LiteX] Improve support for project IceStorm (Lattice ICE40 + OpenFPGA toolchain)

Brief explanation

The first fully FOSS toolchain for FPGA development targets the Lattice ICE40 FPGA. LiteX has some support for this toolchain but it hasn't been tested much.

Expected results

LiteX has good support for the IceStorm work flow and ICE40 based development boards. At least the LM32 and RISC-V soft CPUs should work and many of the peripherals should work.

Detailed Explanation

  • TODO.

Further reading

Knowledge Prerequisites

  • List of
  • what knowledge is required
  • to complete this project.

Contacts

[HDMI2USB] HDMI Audio to USB Sound

More technical details at

Brief explanation

Push audio from HDMI to USB sound in the HDMI2USB project.

Expected results

HDMI audio is captured on a computer connected to the HDMI2USB device.

Detailed Explanation

HDMI supports sending audio over the interface (in the blanking areas called data island). This audio should be captured and sent up the USB interface.

Further Reading

Contacts

Testing

[{{reference.repo}} #{{reference.number}}] {{title}}

More technical details at Link to bug in the

Brief explanation

A short description of what you want to do.

Expected results

What the expected outcome from the project should be.

Detailed Explanation

A much longer description of what you want to do.

Further reading

Knowledge Prerequisites

  • List of
  • what knowledge is required
  • to complete this project.

Contacts

[LiteX+HDMI2USB] Improve NuttX on LiteX generated SoC & replace HDMI2USB bare metal firmware

[LiteX+HDMI2USB] Improve NuttX on LiteX generated SoC & replace HDMI2USB bare metal firmware

Brief explanation

The HDMI2USB project uses "bare metal" firmware running on a soft CPU inside the FPGA. We would like to replace this code with something running on a proper operating system to make it easier to add new features. NuttX is one such option for a lightweight OS.

Expected results

All current HDMI2USB functionality works with firmware running on NuttX rather than bare metal C code.

Detailed Explanation

The HDMI2USB-misoc-firmware embeds a LM32 soft-core for controlling and configuring the hardware. See the diagram below;

MiSoC firmware structure

This soft-core should be able to run NuttX. NuttX is a real-time operating system (RTOS) with an emphasis on standards compliance and small footprint.

A bunch of work has been done on making NuttX work on LiteX.

Further reading

Knowledge Prerequisites

  • List of
  • what knowledge is required
  • to complete this project.

Contacts

[HDMI2USB] Intelligent Auto Audio gain

Brief Explanation

HDMI2USB Intelligent Auto Audio gain

No sudden loud noises, max through power, min through power

Contacts

  • Potential Mentors: TimVideos Hardware Hackers team

[streaming-system] Make the code.timvideos.us developer website awesome!

Brief explanation

code.timvideos.us is the first point of contact for potential developers and users. It could use a lot of work to make it easier to use and more informative.

Expected results

Detailed Explanation

code.timvideos.us is generated using jekyll and pushed to GitHub Pages. We don't use the github automatic generation as we have a couple of custom plugins.

Things which could be done;

  • Blog / News items for the website. jekyll has strong support but we are not using it at all.
    • Some type of Planet / RSS aggregator to collect developers information.
  • Code status integration -- pull some type of stats from GitHub and publish it as part of the website.
  • Social web integration;
    • Some type of twitter display, follow us on twitter, etc. http://twitter.com/timvideosus. New news items get published to the twitter account.
    • Some type of G+ integration. New news items get published to the G+ page.
  • Better Ideas / Issues page (javascript-based system).
    • Allow filtering using tags.
    • Would be good for jekyll plugin to preload the issues information in some way.
    • Split out into jekyll plugin other people can use.
  • Auto-generate the People page from GitHub somehow.
  • Better edit interface (rather than using the wiki edit stuff).
  • Auto-download of Google Drive/Docs images.
  • Make sure everything works under IE, Firefox and Chrome.
    • Automated testing (Selenium? Sauce Labs?) that simple things work.

Further reading

  • FIXME: Add more here.

Knowledge Prerequisites

  • Ruby experience -- jekyll is written in ruby and you'll be modifying / developing the plugins.
  • Javascript / HTML / CSS -- Web development requires Javascript, HTML and CSS experience. Some parts like the Issue tracker are reasonably dynamic so good Javascript skills would be useful.
  • (Highly recommended) Graphics design experience.
  • (Optional) jQuery experience.

Contacts

[LiteX] Add support for ZPU / ZPUino in MiSoC

Brief explanation

MiSoC currently supports the lm32 and mor1k architectures. It would be nice if it also supported the ZPU soft core (and the ZPUino peripherals).

Expected results

MiSoC is able to use a CPU core which supports the ZPUino architecture as an option when building SoC components.

Detailed Explanation

ZPU - The Zylin ZPU

ZPUino is a SoC (System-on-a-Chip) based on Zylin's ZPU 32-bit processor core.

The world's smallest 32-bit CPU with a GCC toolchain.

The ZPU is a small CPU in two ways: it takes up very little resources and the architecture itself is small. The latter can be important when learning about CPU architectures and implementing variations of the ZPU where aspects of CPU design are examined. In academia students can learn VHDL, CPU architecture in general and complete exercises in the course of a year.

Further reading

[HDMI2USB #32] MJPEG core optimisation

More technical details at HDMI2USB #32

Brief explanation

Optimisation of the MJPEG core in the HDMI2USB firmware to support full frame rate recording.

Current performance is roughly 720p@15fps; the minimum requirement would be 720p@30fps. We'd really like it to reach somewhere between 1080p@30fps and 1080p@120fps.

Ideally these changes would make it back into the OpenCores JPEG encoder (mkjpeg) to allow everyone else to benefit from the optimisations.

Expected results

  • HDMI2USB device can produce full frame rate output at 720p.
  • Faster mkjpeg for OpenCores.

Detailed Explanation

Further reading

Knowledge Prerequisites

Contacts

[HDMI2USB] Add "hardware mixing" support to HDMI2USB firmware

[HDMI2USB-misoc-firmware 27] Add "hardware mixing" support to HDMI2USB firmware

More technical details at following github issue

Brief explanation

Add support for mixing multiple input sources (either HDMI or pattern) together and output via any of the outputs.

Expected results

The firmware command line has the ability to specify that an output is the combination of two inputs. These combinations should include dynamic changes like fading and wipes between two inputs.

Detailed Explanation

The "Hardware Fader Design Doc" includes lots of information about how this stuff could be implemented.

You should read up about how to properly combine pixels in linear gamma space. All mixing should be done in linear gamma space. (http://www.poynton.com/PDFs/GammaFAQ.pdf || http://www.poynton.com/notes/colour_and_gamma/GammaFAQ.html)
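
A tiny worked example of why this matters (a plain gamma of 2.2 is assumed here for simplicity; real video uses the Rec.709 transfer function described in the Gamma FAQ):

    def to_linear(v, gamma=2.2):
        return v ** gamma

    def to_gamma(v, gamma=2.2):
        return v ** (1.0 / gamma)

    a, b, alpha = 0.2, 0.9, 0.5          # two gamma-encoded pixel values, fade position
    naive = alpha * a + (1 - alpha) * b  # wrong: mixes gamma-encoded values directly
    correct = to_gamma(alpha * to_linear(a) + (1 - alpha) * to_linear(b))
    print(naive, correct)                # ~0.55 vs ~0.67 -- visibly different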

It might be useful to read up about the original Milkymist One firmware and the TMU (Texture Mapping Unit) used in that.

Further reading

Contacts

[streaming-system] Automated build and configuration of full streaming system

Brief explanation

System to build and configure all of the machines needed to record and stream a multi track conference.

Full automated deployment of both the collecting infrastructure (parts inside a conference venue / user group venue) and the off site infrastructure (parts in the cloud).

Expected results

Convert existing scripts and manual processes into a consistent and modular set of deployment system templates.
Demonstrate it working in VMs.

Detailed Explanation

Problem

A 25 track conference needs about 60 physical and 30 virtual machines. Each needs an OS installed, a host name assigned, networking that plays well with the conference infrastructure, and various applications installed and configured.

Setup

For production there will be a physical box with Debian stable, access to a standard Debian repository, a sudo user, a master config file, a file of secrets and a script. The script will install and configure the needed services to provide automated PXE install for the remaining machines and to provision and configure the VMs.

For testing, a single physical box running lots of VMs replicates the production setup. This adds a layer of complexity, but it is the only sensible way to test, maintain and enhance such a system.

Configuration

Everything possible will be contained in the master config file. There should be no editing of additional config files by hand. Part of this project will be to determine everything that needs to be included in the master config file. The file will likely be created by the veyepar system, which will be responsible for transforming the conference schedule into a consistent format and providing a UI to manage such things as:
Talks in the room that the attendees know as "UB2.252A (Lameere)" will have a URL of www.timvideos.com/fosdem2014/lameere and an IRC channel of #fosdem-lameere; the video mixer machine has a MAC address of 00:11:22..., content source machines: [22:33:44..., 44:55:66...], host names: lameere-mixer, lameere-source-1 and -2; and somehow specifying whether the encoders will be EC2, RackSpace or local VMs.
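
A hypothetical sketch of what one room's entry in that master config file might look like, expressed here as Python data (all keys are illustrative; the MAC addresses are left truncated as in the description above):

    room = {
        "display_name": "UB2.252A (Lameere)",
        "url": "www.timvideos.com/fosdem2014/lameere",
        "irc_channel": "#fosdem-lameere",
        "mixer_mac": "00:11:22:...",
        "source_macs": ["22:33:44:...", "44:55:66:..."],
        "hostnames": ["lameere-mixer", "lameere-source-1", "lameere-source-2"],
        "encoders": "EC2",   # or "RackSpace", or local VMs
    }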

Pieces

  • collecting infrastructure
    • dvswitch or gst-switch
    • flumotion collector (watchdog, register, etc)
  • streaming infrastructure (EC2)

Further reading

Existing build systems that do quite a bit of what is needed:
https://github.com/yoe/fosdemvideo/blob/master/doc/README
https://github.com/CarlFK/veyepar/blob/master/setup/nodes/pxe/README.txt
https://github.com/timvideos/streaming-system/blob/master/tools/setup/runall.sh

Knowledge Prerequisites

  • Good understanding of deploying machines and systems such as PXE boot.

Contacts

[streaming-system] Upgrade website to latest Django version

Brief explanation

The website was initially built on Django 1.4, with some updates to 1.5. It needs to be upgraded to the latest Django version, which includes built-in migrations and many security and other updates.

Expected results

The website runs on the latest Django version.

Detailed Explanation

FIXME: A much longer description of what you want to do.

Further reading

Knowledge Prerequisites

  • Python
  • Django
  • HTML/CSS

Contacts

[Veyepar] JSON schedule output into website

Brief explanation

Zookeepr / Symposium (PyCon Website) JSON schedule output into website + veyepar

Feed validator!!!!

Problem

The streaming UI and video processing rely on the talk schedule. This data is typically in the conference website, and there is sometimes an API available.
These APIs are not always accurate; for example, they may be missing the keynote entries, or the start time may be on a date that is not part of the conference.

Things that will help:

  • A spec defining what data is needed.
  • A validator to inspect provided data (a command line tool to validate a local file, and a public facing web page that lets conference web site developers test their API); see the sketch after this list.
  • Work with the open source conference site projects to add the API to their codebase.
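
A minimal sketch of what the command-line validator might look like (the required field names here are assumptions, not the actual veyepar spec described in addeps.py):

    import json
    import sys

    REQUIRED_FIELDS = {"title", "start", "end", "room"}   # assumed field names

    def validate(path):
        with open(path) as f:
            talks = json.load(f)
        errors = []
        for i, talk in enumerate(talks):
            missing = REQUIRED_FIELDS - set(talk)
            if missing:
                errors.append("talk %d: missing %s" % (i, sorted(missing)))
        return errors

    if __name__ == "__main__":
        for error in validate(sys.argv[1]):
            print(error)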

There are 3 popular conference systems:

http://zookeepr.org - https://github.com/zookeepr/zookeepr
http://pentabarf.org - https://github.com/nevs/pentabarf
http://eldarion.com/symposion/ - https://github.com/pinax/symposion

All 3 have schedule exports - examples:
http://www.pytennessee.org/api/schedule_json/
http://lca2013.linux.org.au/programme/schedule/json
https://fosdem.org/2014/schedule/xml

These two systems consume it:
https://github.com/timvideos/streaming-system
https://github.com/CarlFK/veyepar/blob/master/dj/scripts/addeps.py

A description of the data is at the top of addeps.py, and a little more is at http://nextdayvideo.com/page/metadata.html

https://github.com/CarlFK/veyepar/blob/master/dj/scripts/addeps.py#L288 stores the data to the model at https://github.com/CarlFK/veyepar/blob/master/dj/main/models.py#L228

The reason addeps.py is 2200 lines long is that the conference system coders keep changing things. Mostly it is because the Symposion system doesn't ship the export code, so each time someone installs it they write their own. Things evolve too, like adding "what kind of license is it released under?".

The Symposion team has not accepted the pull request that would help stabilize the API:
pinax/symposion#45

Contacts

  • Potential Mentors: TimVideos/Veyepar team

[gstreamer] Productionize/Update gstreamer dvswitch plugin

Brief explanation

The dvswitch gstreamer plugin is an important part of a migration scheme from dvswitch to newer technologies. This plugin could use a bunch of work to make it better.

Full list of outstanding issues with the plugin can be found at https://github.com/timvideos/gst-plugins-dvswitch/issues

Expected results

  • Ideal -- gst-plugins-dvswitch becomes a standard gstreamer plugin.
  • Realistic -- gst-plugins-dvswitch is more reliable and the code is of a higher standard.

Detailed Explanation

The following tasks are the minimum things that need to be finished;

  • Porting the plugin to support both gstreamer0.10 and gstreamer1.x
  • Add testing infrastructure for making sure the plugin works reliably.
  • Add auto-reconnect on disconnection support.

The dvswitch plugin is used by https://github.com/timvideos/dvsource-v4l2-other to send data from a v4l2src source to DVSwitch.

Read the README files for both repositories;

Further reading

Knowledge Prerequisites

  • C programming knowledge.

Contacts

[HDMI2USB #14] Supporting Marvell Ethernet chip on Digilent board

Brief explanation

The Digilent Atlys board has a Marvell Gigabit Ethernet chip on it. It would be awesome if we could ship the video via this port rather than USB.

Expected results

  • Full functionality of the HDMI2USB is accessible via Ethernet interface, including;
    • Full video stream (equivalent to the UVC interface)
    • Control Interface (equivalent to the CDC serial port interface)

Detailed Explanation

Xilinx supplies a triple-speed Ethernet MAC core, TEMAC, but it requires a license which is not compatible with Open Source.

Further reading

Knowledge Prerequisites

  • Strong VHDL / Verilog skills.
  • Knowledge of Ethernet protocol highly desired.

Contacts
