
sevenz-rust's Introduction


This project is a 7z compressor/decompressor written in pure Rust, heavily inspired by the Apache commons-compress project.

The LZMA/LZMA2 decoder and all filter code were ported from the tukaani XZ for Java project.

Decompression

Supported codecs:

  • BZIP2 (requires the 'bzip2' feature)
  • COPY
  • LZMA
  • LZMA2
  • ZSTD (requires the 'zstd' feature)

Supported filters:

  • BCJ X86
  • BCJ PPC
  • BCJ IA64
  • BCJ ARM
  • BCJ ARM_THUMB
  • BCJ SPARC
  • DELTA
  • BCJ2

Usage

[dependencies]
sevenz-rust = { version = "0.2" }

Decompress the source file "data/sample.7z" to the destination path "data/sample":

sevenz_rust::decompress_file("data/sample.7z", "data/sample").expect("complete");

Decompress an encrypted 7z file

Enable the 'aes256' feature:

[dependencies]
sevenz-rust = { version = "0.2", features = ["aes256"] }

sevenz_rust::decompress_file_with_password("path/to/encrypted.7z", "path/to/output", "password".into()).expect("complete");

Multi-threaded decompression

See examples/mt_decompress.

Compression

Currently only the LZMA2 method is supported.

[dependencies]
sevenz-rust = { version = "0.5.0", features = ["compress"] }

Use the helper function to create a 7z file from a source path:

sevenz_rust::compress_to_path("examples/data/sample", "examples/data/sample.7z").expect("compress ok");

With AES encryption

Requires version >= 0.3.0.

[dependencies]
sevenz-rust = { version = "0.5", features = ["compress", "aes256"] }

Use the helper function to create a 7z file from a source path with a password:

sevenz_rust::compress_to_path_encrypted("examples/data/sample", "examples/data/sample.7z", "password".into()).expect("compress ok");

Advanced

[dependencies]
sevenz-rust = { version = "0.5.0", features = ["compress", "aes256"] }

Solid compression

use sevenz_rust::*;

let mut sz = SevenZWriter::create("dest.7z").expect("create writer ok");

sz.push_source_path("path/to/compress", |_| true).expect("pack ok");

sz.finish().expect("compress ok");

Compression methods

With encryption and LZMA2 options:

use sevenz_rust::*;

let mut sz = SevenZWriter::create("dest.7z").expect("create writer ok");
sz.set_content_methods(vec![
    sevenz_rust::AesEncoderOptions::new("sevenz-rust".into()).into(),
    lzma::LZMA2Options::with_preset(9).into(),
]);
sz.push_source_path("path/to/compress", |_| true).expect("pack ok");

sz.finish().expect("compress ok");

sevenz-rust's People

Contributors

andrewreisdorph, bfrazho, dependabot[bot], dylan-dpc, dyz1990, marcospb19, nlfiedler, tomicyo, xmakro, yujincheng08


sevenz-rust's Issues

unresolved import of `BlockDecoder`

Rust is giving an error about:

unresolved import `sevenz_rust::BlockDecoder`
no `BlockDecoder` in the root

Where can BlockDecoder be accessed?

Problem handling encrypted files

Hello, I've noticed a problem with encrypted files when the header data is also encrypted (with the -mhe=on option of 7z). When a wrong password is passed for such files, depending on the compression method used, sevenz-rust reports errors such as:

An IO error occurred: Io(Custom { kind: InvalidInput, error: "range decoder first byte is 0" }, "")

An IO error occurred: Io(Custom { kind: Other, error: ChecksumVerificationFailed }, "")

and therefore it's not possible to reliably tell that the password is bad and that a different one should be tried. Would it be possible to return a consistent error when the password is invalid? Many thanks!

Multi input/output stream coders are not yet supported

I got the error "Multi input/output stream coders are not yet supported" when calling sevenz_rust::decompress_file on a 7z file. It seems to be a limitation of the current implementation.

Is there any plan to support it?

Thank you very much.

How to decompress part of 7z to memory?

I need to decompress a single file in a huge archive (&Path) into memory (&mut Vec<u8>); how should I do it?

In fact, I also want to modify this file and write it back; can that be done?

Compressed folder missing directory structure

This is my directory: [screenshot]

This is output.7z: [screenshot]

This is my code:

fn main() {
    sevenz_rust::compress_to_path_encrypted(
        r"E:\work\cocos\output",
        r"E:\work\cocos\output.7z",
        "a19970411".into(),
    )
    .expect("compress ok");
    println!("Hello, world!");
}

It runs on Windows.

Corrupted data on write, results in `dist overflow` error on read.

Encountered an issue when writing binary files to a compressed archive. I didn't notice it with the text files I was using during testing. I'm not sure whether it's certain kinds of binary files or something else, but sevenz-rust produces corrupted archives that 7-Zip cannot read. Attempting to decompress these files with sevenz-rust results in a dist overflow error. Since the reproduction involves a 1 MB file, I've created a repository with the example code and PDF file.

https://github.com/nlfiedler/sevenz-overflow

I'm using rustc 1.67 and the latest sevenz-rust crate. I'm hoping I'm just doing it wrong and you will point out my error. Thank you.

No effect on compression level: LZMA2Options::with_preset()

Hi, I was running the following example:

let outputArchive = format!("{}.aes.7z", inputFolder);
let mut sz: SevenZWriter<File> = SevenZWriter::create(outputArchive).expect("create writer ok");
sz.set_content_methods(vec![
    sevenz_rust::AesEncoderOptions::new("sevenz-rust".into()).into(),
    lzma::LZMA2Options::with_preset(0).into(),
]);
sz.push_source_path(inputFolder, |_| true).expect("pack ok");

sz.finish().expect("compress ok");

and tested several different values in with_preset(). My assumption was that 0 would mean no compression and 9 maximum compression. But no matter the setting, I got the same amount of compression for every value: the compressed folder always ends up at 23% of the original size. My expectation was that with_preset(0) would not perform any compression at all, but rather just bundle the files together.

Am I misunderstanding something about the implementation on this?

Feature request BCJ LZMA support

Scenario: extract 7z file with Windows binaries

Example archive: https://github.com/niXman/mingw-builds-binaries/releases/download/12.1.0-rt_v10-rev3/x86_64-12.1.0-release-posix-seh-rt_v10-rev3.7z

Error message:

thread 'main' panicked at 'complete: UnsupportedCompressionMethod("[3, 3, 1, 3]")'

The problem is caused by binary files compressed with BCJ + LZMA2:26. Regular files compressed with plain LZMA2:26 extract without problems. Note: the filter/compression applied to a particular file can be inspected in the Method column of 7-Zip File Manager.

According to my research BCJ is a Branch-Call-Jump (BCJ) filter. It's mentioned here: https://docs.python.org/3/library/lzma.html#specifying-custom-filter-chains

ChecksumVerificationFailed on read of many files in solid archive

I have solid archives with a block size of 16 MB, and many of the files fail to read because of ChecksumVerificationFailed.

Example archive: https://up.revertron.com/Memes.7z

Example code:

pub fn test_blocks() {
    let mut buf = Vec::new();

    let mut archive = SevenZReader::open("Memes.7z", Password::empty()).expect("Error opening 7z archive");
    let _ = archive.for_each_entries(|entry, reader| {
        println!("Reading file {}", &entry.name);
        if "FcGD7nuX0AgQNS_.jpg" == entry.name {
            println!("*** Found file {}", &entry.name);
            match reader.read_to_end(&mut buf) {
                Ok(_size) => {
                    println!("Have read file {}", &entry.name);
                    return Ok(false);
                }
                Err(e) => {
                    println!("Error reading file {}: {}", &entry.name, &e);
                    return Err(sevenz_rust::Error::from(e));
                }
            }
        }
        Ok(true)
    });
    assert!(!buf.is_empty())
}

Performance

I have been testing performance against the stock 7zFM, and it is much faster than this lib. I haven't looked into it much, but it seems 7zFM is multithreaded while this lib reads file by file. I'm going to make my best effort to make it multithreaded, but if there is a known way of doing it, please let me know!

Unable to compile sevenz-rust 0.5.2 for x86_64-unknown-linux-musl

I started to use sevenz-rust 0.5.2 in Selenium Manager (see source code). To compile it, we use GitHub Actions, and for compatibility on Linux we use x86_64-unknown-linux-musl as the target (see workflow).

When this workflow is launched, it fails when compiling sevenz-rust in Linux as follows (see execution):

Compiling sevenz-rust v0.5.2
error: linking with `rust-lld` failed: exit status: 1
  = note: rust-lld: error: unable to find library -lgcc_s
          rust-lld: error: unable to find library -lc

I looked for information about this error and changed different things, but no luck yet.

Any idea of the cause of this problem? Thanks a lot.

Incorrect handling of 7z time

In this crate's implementation, the mtime, ctime, and atime of a 7z entry are handled as i64 values, as if they were Unix timestamps, but if you look at the (public domain) C source for 7-Zip, the time is always treated as the following struct, with no special handling for Linux:

typedef struct
{
  UInt32 Low;
  UInt32 High;
} CNtfsFileTime;

Effectively an equivalent to a Windows FILETIME, documentation for which can be found on MSDN or likely many other places.

In my own project I used (more or less) the following translation of the C conversion function:

#[repr(C)]
struct SzTime {
    low: u32,
    high: u32,
}

fn sz_to_unix_time_64(time: SzTime) -> Option<(i64, u32)> {
    const NANOS_IN_SECOND: u64 = 1_000_000_000;
    const WIN_TICKS_IN_SECOND: u64 = NANOS_IN_SECOND / 100;
    
    const WIN_TIME_START_YEAR: u64 = 1601;
    const UNIX_TIME_START_YEAR: u64 = 1970;
    
    const UNIX_WIN_YEAR_OFFSET: u64 = UNIX_TIME_START_YEAR - WIN_TIME_START_YEAR;
    const UNIX_WIN_DAY_OFFSET: u64 = 89 + (365 * UNIX_WIN_YEAR_OFFSET);
    const UNIX_WIN_SEC_OFFSET: u64 = 60 * 60 * 24 * UNIX_WIN_DAY_OFFSET;

    let time: u64 = time.low as u64 | (time.high as u64) << 32;

    let secs = (time / WIN_TICKS_IN_SECOND) - UNIX_WIN_SEC_OFFSET;
    let nanos = (time % WIN_TICKS_IN_SECOND) * 100;
    
    let secs: i64 = secs.try_into().ok()?;
    Some((secs, nanos as u32))
}
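As a sanity check of the conversion above: the Unix epoch (1970-01-01) is known to fall at FILETIME tick 116444736000000000 (100 ns ticks since 1601-01-01), so the offset works out to 11 644 473 600 seconds. A condensed, self-contained sketch of the same conversion (filetime_to_unix is my own name, not a crate API):

```rust
/// Convert a 7z/NTFS FILETIME (100 ns ticks since 1601-01-01, split into
/// two u32 halves) to Unix seconds + nanoseconds. Returns None for times
/// before the Unix epoch or outside the i64 range.
fn filetime_to_unix(low: u32, high: u32) -> Option<(i64, u32)> {
    const WIN_TICKS_IN_SECOND: u64 = 10_000_000; // one tick = 100 ns
    const UNIX_WIN_SEC_OFFSET: u64 = 11_644_473_600; // 1601-01-01 .. 1970-01-01

    let ticks = low as u64 | (high as u64) << 32;
    let secs = (ticks / WIN_TICKS_IN_SECOND).checked_sub(UNIX_WIN_SEC_OFFSET)?;
    let nanos = (ticks % WIN_TICKS_IN_SECOND) * 100;
    Some((secs.try_into().ok()?, nanos as u32))
}

fn main() {
    // The Unix epoch expressed as a FILETIME tick count.
    let epoch: u64 = 116_444_736_000_000_000;
    let (low, high) = (epoch as u32, (epoch >> 32) as u32);
    assert_eq!(filetime_to_unix(low, high), Some((0, 0)));
}
```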

compression performance issue compared to native 7z

I ran a benchmark with the following config:

It takes 12x as long. I noticed that the number of blocks was really different.

The test data was JSON files.

I used the info from 7z for Windows to build this table.

| Name | Timing | Size | Packed Size | Folders | Files | CRC | Path | Type | Physical Size | Headers Size | Method | Solid | Blocks |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| 7z manager windows (normal) | 5s | 275 460 238 | 4 041 321 | 548 | 28 062 | 5FA214C8 | logs.7z | 7z | 4 690 530 | 649 209 | LZMA2:24 | + | 1 |
| 7z manager windows (ultra) | 21s | 275 460 238 | 2 995 118 | 548 | 28 062 | 5FA214C8 | by_files.7z | 7z | 3 644 327 | 649 209 | LZMA2:26 | + | 1 |
| examples\sevenz-rust-compression.rs | 73s | 275 460 238 | 13 107 812 | 0 | 28 062 | 5FA214C8 | sample.7z | 7z | 16 818 429 | 3 710 617 | LZMA2:23 | - | 28 062 |

Since Rust 1.78 - unsafe precondition(s) violated: ptr::copy_nonoverlapping requires that both pointer arguments are aligned and non-null

Hey there, first of all thanks for the cool crate!

Since the upgrade to Rust 1.78 i get the following error:

unsafe precondition(s) violated: ptr::copy_nonoverlapping requires that both pointer arguments are aligned and non-null and the specified memory ranges do not overlap

Here is the call stack. It seems to be an issue within the lzma_rust crate?

[screenshot of call stack]

Downgrading to Rust 1.77 fixes the problem again. I found a similar issue here:
sequenceplanner/r2r#96

Thanks
Regards, Christian

Maybe better API

Have you considered adding some public APIs to make the crate more ergonomic?

For example:

SevenZReader::open("path/file.7z")
    .with_password("password".into()) 
    .decompress();

or

is_7z("xxx")

Support for SFX archives

Hello, thanks for your great work and constant improvements! Do you have any plans to support self-extracting archives, as 7z can create them for pretty much every platform? Cheers

entry's compressed_size is always 0

Hey,

thanks for your great work! One issue I'm facing is that compressed_size is 0 for all entries (this is also the case with the test file in your repo). Is that a known issue?

Change push_archive_entry to return archive entry.

When writing archive entries with sevenz, I would very much like to get the compressed size of each entry after it is written. However, the SevenZWriter.push_archive_entry() takes the entry from the caller and never returns it. I believe that the function should either take a mutable reference or return the modified entry back to the caller. As it stands now, the function is updating the entry fields but then the entry is dropped. If making this change is agreeable, I would be happy to submit a PR.

Compression speed too low

Hi,

I am using sevenz-rust version 0.4.3, with Rust 1.71 on Windows. I am using the code below:

use std::fs::File;
use std::path::PathBuf;
use std::process::exit;
use std::time::Instant;
use sevenz_rust::{SevenZArchiveEntry, SevenZWriter};

fn main() {
    let verb_duration = Instant::now();
    let file_name = r"testdir\folder1\file3.txt";
    let file_path = PathBuf::from(file_name);

    let archive_path = PathBuf::from("test.7z");
    let file_handle = File::create(archive_path).unwrap();
    let mut zip: SevenZWriter<File> = match SevenZWriter::new(file_handle) {
        Ok(z) => z,
        Err(_e) => exit(1),
    };
    let entry_name = file_name.replace('\\', "/");
    let zip_entry = SevenZArchiveEntry::from_path(file_path.clone(), entry_name);
    let file_handle = File::open(file_path).unwrap();
    zip.push_archive_entry(zip_entry, Some(file_handle)).unwrap();
    zip.finish().unwrap();
    println!("{:?}", verb_duration.elapsed());
}
file3.txt is a 50 MB text file, and the code above takes 54 seconds to generate a 9 KB archive.
When I use the 7-Zip archiver, it takes less than a second to generate the same 9 KB file.
I'm hoping there is a setting I'm missing to speed things up?

Something is wrong with bigger archives

I try to decompress just a few bytes from two different files; one file works, the other doesn't. Both files work correctly with 7-Zip.

enwiki-20230501-pages-meta-history23.xml-p50555787p50564553.7z - works
enwiki-20230501-pages-meta-history5.xml-p956483p958045.7z - doesn't work

I used the following code to test it:

#[tokio::main]
async fn main() -> Result<(), Box<dyn Error>> {
    let filename = "/wikipedia/enwiki-20230501-pages-meta-history5.xml-p956483p958045.7z";
    let mut stream = SevenZReader::open(filename, Password::empty()).unwrap();

    stream.for_each_entries(|entry, mut reader| {
        let mut buffer = [0; 16];
        println!("Reading entry: {:?}", entry);
        println!("{:?}", reader.read(&mut buffer[..]));
        println!("{:?}", std::str::from_utf8(&buffer));
        Ok(true)
    })?;

    Ok(())
}

output for enwiki-20230501-pages-meta-history23.xml-p50555787p50564553.7z:

Reading entry: SevenZArchiveEntry { name: "", has_stream: true, is_directory: false, is_anti_item: false, has_creation_date: false, has_last_modified_date: true, has_access_date: false, creation_date: FileTime(0), last_modified_date: FileTime(133279036098162380), access_date: FileTime(0), has_windows_attributes: true, windows_attributes: 0, has_crc: true, crc: 2292542832, compressed_crc: 0, size: 1585055636, compressed_size: 0, content_methods: [] }
Ok(16)
Ok("<mediawiki xmlns")

output for enwiki-20230501-pages-meta-history5.xml-p956483p958045.7z:

Reading entry: SevenZArchiveEntry { name: "", has_stream: true, is_directory: false, is_anti_item: false, has_creation_date: false, has_last_modified_date: true, has_access_date: false, creation_date: FileTime(0), last_modified_date: FileTime(133278559111423540), access_date: FileTime(0), has_windows_attributes: true, windows_attributes: 0, has_crc: true, crc: 401901466, compressed_crc: 0, size: 3575097180, compressed_size: 0, content_methods: [] }
Ok(0)
Ok("\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0")

Is it possible to track decompression progress?

Hello!

First of all, this crate is awesome and you made my work so much easier.

That said, I'm building an installer that needs to display a progress bar with the download and decompression progress.

As the example shows, I now have this function:

    let res = decompress_with_extract_fn_and_password(
        file,
        dst,
        "mypass".into(),
        |entry, reader, dest| {
            println!("start extract {}", entry.name());
            let r = default_entry_extract_fn(entry, reader, dest);
            println!("complete extract {}", entry.name());
            r
        },
    );

Is there a way to get the progress?
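One generic way to get progress, assuming the reader handed to the extract callback implements std::io::Read (ProgressReader and the shared counter below are my own sketch, not part of the crate): wrap the reader, count bytes as they stream through, and let the UI poll the counter against the entry's known size.

```rust
use std::io::Read;
use std::sync::atomic::{AtomicU64, Ordering};
use std::sync::Arc;

/// Wraps any `Read` and records how many bytes have passed through,
/// so another thread (e.g. a progress bar) can poll the shared counter.
struct ProgressReader<R> {
    inner: R,
    bytes_read: Arc<AtomicU64>,
}

impl<R: Read> Read for ProgressReader<R> {
    fn read(&mut self, buf: &mut [u8]) -> std::io::Result<usize> {
        let n = self.inner.read(buf)?;
        self.bytes_read.fetch_add(n as u64, Ordering::Relaxed);
        Ok(n)
    }
}

fn main() {
    // Demo with an in-memory reader standing in for the archive entry.
    let counter = Arc::new(AtomicU64::new(0));
    let mut reader = ProgressReader {
        inner: &b"hello world"[..],
        bytes_read: counter.clone(),
    };
    let mut out = Vec::new();
    reader.read_to_end(&mut out).unwrap();
    assert_eq!(counter.load(Ordering::Relaxed), 11);
}
```

Inside the extract callback you would wrap `reader` the same way before passing it on to default_entry_extract_fn.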

Decompression failure

system: windows 10 x64 ltsc 21H2 19044.3208
rust-version:

PS E:\.self\learn-rust> rustc -V
rustc 1.71.0 (8ede3aae2 2023-07-12)

code:

use std::fs;
use sevenz_rust;

fn main() {
    fs::remove_dir_all("node").unwrap();
    sevenz_rust::decompress_file("node.7z", "node").expect("complete1111");
}

log:

PS E:\.self\learn-rust> cargo run
   Compiling learn-rust v0.1.0 (E:\.self\learn-rust)
    Finished dev [unoptimized + debuginfo] target(s) in 0.40s
     Running `target\debug\learn-rust.exe`
thread 'main' panicked at 'complete: Other("Multi input/output stream coders are not yet supported")', src\main.rs:5:53
note: run with `RUST_BACKTRACE=1` environment variable to display a backtrace
error: process didn't exit successfully: `target\debug\learn-rust.exe` (exit code: 101)
PS E:\.self\learn-rust>

Expose API similar to for_each_entries but without callback

For my use-case, I need to get a stream to a file within the archive, without deferring to a callback. I don't mind this action blocking all other use of the archive until that file handle is closed. The main issue is just that I cannot work with a callback.

[request] Consider supporting async/await

I am in a situation where I download multiple 7z files and unzip them. My workflow would benefit from concurrency, but it seems sevenz_rust does not support it.

Would you consider changing the SevenZReader API so that entries and readers are provided through an iterator, and providing an async version of this iterator whose readers implement the AsyncBufRead trait?

Unable to build without default features

Hi! Since 0.4.0, sevenz-rust fails to build if you do not enable the default feature "compress".
If default-features = false is used, the sevenz-rust build fails with this error:

error[E0432]: unresolved import `lzma_rust::LZMA2Options`
 --> C:\Users\runneradmin\.cargo\registry\src\github.com-1ecc6299db9ec823\sevenz-rust-0.4.0\src\method_options.rs:1:5
  |
1 | use lzma_rust::LZMA2Options;
  |     ^^^^^^^^^^^^^^^^^^^^^^^ no `LZMA2Options` in the root

error[E0599]: no variant or associated item named `LZMA2` found for enum `MethodOptions` in the current scope
  --> C:\Users\runneradmin\.cargo\registry\src\github.com-1ecc6299db9ec823\sevenz-rust-0.4.0\src\method_options.rs:32:68
   |
9  | pub enum MethodOptions {
   | ---------------------- variant or associated item `LZMA2` not found for this enum
...
32 |         Self::new(SevenZMethod::LZMA2).with_options(MethodOptions::LZMA2(value))
   |                                                                    ^^^^^ variant or associated item not found in `MethodOptions`

Bug: Unable to decompress due to bad signature error.

Hi there, I was trying to decompress a VS Code archive. Using this library it fails with a bad-signature error, but through the CLI I'm able to decompress it successfully; below are screenshots of both.

Using Rust code:

pub fn install(app: &str) {
    let query = app.trim().to_lowercase();

    let (app_name, manifest) = match query.split_once('/') {
        Some((bucket, app_name)) => (
            app_name,
            Buckets::query_app(app_name)
                .unwrap()
                .get_app_from(app_name, bucket),
        ),
        None => (app, Buckets::query_app(app).unwrap().get_app(app)),
    };

    let manifest = manifest.unwrap();
    let file_name = Downloader::download(app_name, true).unwrap();
    let cache_dir = Config::cache_dir().unwrap();

    let srcs = file_name
        .iter()
        .map(|f| match f {
            DownloadStatus::Downloaded(s) => s,
            DownloadStatus::DownloadedAndVerified(s) => s,
            DownloadStatus::AlreadyInCache(s) => s,
        })
        .map(|s| cache_dir.join(s))
        .collect::<Vec<_>>();

    let version = &manifest.version;

    let app_dir = Config::app_dir().unwrap();
    let app_dir = app_dir.join(app_name);
    let app_dir = app_dir.join(version);

    if !app_dir.exists() {
        std::fs::create_dir_all(&app_dir).unwrap();
        use sevenz_rust::decompress_file;
        for src in srcs {
            if src
                .extension()
                .unwrap_or_default()
                .to_string_lossy()
                .to_string()
                == "7z"
            {
                println!("decompressing {:?}", src);
                decompress_file(src, &app_dir).unwrap();
            }
        }
    }

    // println!("{app_name}\n{manifest:#?}\n{file_name:#?} at {app_dir:?}");
}

I get: [screenshot]

However, using the 7z CLI: [screenshot]

Empty files are ignored, improvements

Thanks for this library. Really a life saver!
I have three things I want to bring to your attention (v0.5.3):


1) sevenz_rust::default_entry_extract_fn seems to ignore files that are empty, via if entry.size() > 0 { ...

While it may seem logical to do so, it isn't exactly lossless, as shown below.

[screenshot]

This is just a sample; I see a diff of about 250 files in my actual archive.
Such assumptions should be avoided, since empty files aren't uncommon (e.g. __init__.py in Python).
This is just FYI; I solved it by iterating over the entries directly.


2) The name FolderDecoder is a bit misleading.

It processes the solid blocks in the archive and has nothing to do with the number of folders.


3) Some speed comparisons for decompression!

2.1GB, 35682 Files, 2013 Folders compressed with 64MB solid block size, 4MB dictionary.
Yields a 499 MB 7z archive with 39 solid blocks.

7zFM - 8m 03s
sevenz_rust - 5m 40s
sevenz_rust w/ rayon - 1m 17s Impressive!

All tests on i7-1165G7 (4C/8T)
SSD: KBG40ZNS512G
Seq. R/W (MB/s): 2200/1400
Random R/W (IOPS): 330K/190K

Decompressing 7z+ZSTD is missing entries

Context:

  • I compress a folder using py7zr >= 0.21.0:
    ZSTD_FILTER = [{"id": FILTER_ZSTD, "level": ZSTD_COMPRESSION_LEVEL}]
    with SevenZipFile(<path-to-7z>, mode="w", filters=ZSTD_FILTER) as zst_handle:
        for root, dirs, files in os.walk(<input-path>):
            for node in files + dirs:
                zst_handle.write(os.path.join(root, node), os.path.relpath(os.path.join(root, node), <input-path>))
  • Using sevenz-rust = { version = "0.6.1", features = ["zstd"] }, I then try to decompress this file with sevenz_rust::decompress_file(7z_file, &args.dest)
  • This results in a 'successful' extraction, but it is actually missing a series of files in the directories
  • This archive can be extracted without issue using the 7z utility so it appears to be well formatted.

Debugging:

  • Debugging this with a local version of sevenz-rust shows that this loop in reader.rs is not iterating all of the files
  • Changing this line to for file_index in start..(archive.files.len() + start) fixed this in my case
  • In this archive self.archive.folders.len() = 1 so we only poke folder_dec.for_each_entries once
  • However, this does not iterate all the files because the file_count (computed by archive.folders[folder_index].num_unpack_sub_streams) appears to be too low:
file_count=204
archive.files.len() = 233
  • It looks like file_count is read from the archive numStreams header on this line and is indeed 204 in this case not 233

Questions / thoughts:

  • Is it reliable to rely on the numStreams header for determining the loop iterations?
  • Is there a better way to determine the loop iterations?

Any help / advice would be much appreciated
