maxmind-db-writer-perl's Issues

automated test - buffer is not being set correctly

The test fails with:

===(     129;5  0/?  10/?  2/2  1/?  0/?  0/?  0/?  0/?  0/?  0/? )=====isa check for "data_source" failed: GLOB(0xf094e8) is not a file handle at /usr/local/share/perl5/MaxMind/DB/Types.pm line 239
    MaxMind::DB::Types::_confess('%s is not a file handle', 'GLOB(0xf094e8)') called at (eval 554) line 90
    MaxMind::DB::Reader::Decoder::new(undef, 'data_source', 'GLOB(0xf094e8)', '_data_source_size', 4) called at t/MaxMind/DB/Writer/Serializer-utf8-round-trip.t line 36
# Tests were run but no plan was declared and done_testing() was not seen.
[01:24:54] t/MaxMind/DB/Writer/Serializer-utf8-round-trip.t .... Dubious, test returned 2 (wstat 512, 0x200)
All 1 subtests passed

In the following code from t/MaxMind/DB/Writer/Serializer-utf8-round-trip.t, the open is actually failing with "Couldn't open file SCALAR(0x2a05188)". This was identified by adding the line: or die "Couldn't open file $buffer"

my $serializer = MaxMind::DB::Writer::Serializer->new();
$serializer->store_data( utf8_string => $input );

my $buffer = $serializer->buffer();
open my $fh, '<:raw', $buffer;
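
For reference, a minimal sketch of the check described above (buffer() returns a reference to a scalar holding the serialized output, and open() on a scalar reference creates an in-memory filehandle; adding $! to the message is my own tweak to surface the OS error):

my $buffer = $serializer->buffer();
open my $fh, '<:raw', $buffer
    or die "Couldn't open in-memory buffer: $!";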

Second instance is:

[01:25:00] t/MaxMind/DB/Writer/Tree-output/verifies.t .......... ok      558 ms
===(     593;11  0/?  295/? )===========================================isa check for "data_source" failed: GLOB(0x227a9e0) is not a file handle at /usr/local/share/perl5/MaxMind/DB/Types.pm line 239
    MaxMind::DB::Types::_confess('%s is not a file handle', 'GLOB(0x227a9e0)') called at (eval 553) line 90
    MaxMind::DB::Reader::Decoder::new(undef, 'data_source', 'GLOB(0x227a9e0)', '_data_source_size', 134748227) called at t/MaxMind/DB/Writer/Serializer-large-pointer.t line 36
[01:25:00] t/MaxMind/DB/Writer/Serializer-large-pointer.t ...... Dubious, test returned 2 (wstat 512, 0x200)

Could not determine the type for map key "color"

When I try to run the example from your front page, I get the following error message:

Could not determine the type for map key "color" lib/perl5/x86_64-linux-gnu-thread-multi/MaxMind/DB/Writer/Serializer.pm line 372.
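
This error is thrown when map_key_type_callback does not return a type for a key in the inserted data ("color" here). A minimal sketch, assuming the front-page example's color/dogs/size record (also quoted further down this page); the uint16 for size is an assumption:

use MaxMind::DB::Writer::Tree;

my %types = (
    color => 'utf8_string',
    dogs  => [ 'array', 'utf8_string' ],
    size  => 'uint16',
);

my $tree = MaxMind::DB::Writer::Tree->new(
    ip_version            => 6,
    record_size           => 24,
    database_type         => 'My-IP-Data',
    languages             => ['en'],
    description           => { en => 'My database of IP data' },
    # The callback must return a type for every key that appears in the
    # inserted data, including "color"; returning undef triggers the error.
    map_key_type_callback => sub { $types{ $_[0] } },
);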

Can't find appropriate typing for hashes with mixed data

I'm having some difficulty with the strict type mapping. The MaxMind databases ship with data that contains multiple sub-types under a hash. For example:

subdivisions => [
    geoname_id => 5037779,
    iso_code   => "MN",
    names      => {
        en => "Minnesota",
    },
],

The documentation at https://metacpan.org/pod/MaxMind::DB::Writer::Tree#DATA-TYPES only shows a flat structure, so if you set something to the type ['array', 'map'] and then have any data in that array that is not a hash, it throws:
Can't use string ("5037779") as a HASH ref while "strict refs" in use at /usr/local/lib64/perl5/MaxMind/DB/Writer/Serializer.pm line 343.

In this case, this is how the data is structured coming from MaxMind directly. Is there a syntax for dealing with subtypes that is not clear to me from the documentation, or is this a limitation of the writer? Thank you to whoever might have an idea here.
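
The "Can't use string ... as a HASH ref" error above appears to come from passing scalars where hashrefs are expected: with a type of ['array', 'map'], each array element must itself be a hashref. A hedged sketch of type declarations and data shaped that way (the specific types chosen for geoname_id and the other keys are assumptions):

my %types = (
    subdivisions => [ 'array', 'map' ],
    geoname_id   => 'uint32',
    iso_code     => 'utf8_string',
    names        => 'map',
    en           => 'utf8_string',
);

# Each element of the subdivisions array is itself a hashref (a "map"):
my %record = (
    subdivisions => [
        {
            geoname_id => 5037779,
            iso_code   => 'MN',
            names      => { en => 'Minnesota' },
        },
    ],
);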

How to convert large database?

Hello,
I need to convert a really large database (17M records) to the mmdb2 format. At the moment I run out of memory, as the whole tree cannot fit in memory. Is there a way to build smaller trees separately and then merge them into one?

Error message when data section exceeds the maximum size is improved (was: Corrupt data when DB is large)

I'm running into an issue where the Perl reader tells me the data is corrupt: when trying to read from a 30 GB database file I receive decode errors. Reading from a smaller database, say 1 GB, works perfectly fine. Are there limitations on the size of the database I can create, or am I doing something wrong? I am using the same method to create both databases, which is just iterating over IPs and adding a single large string value. Test code is attached.

read.pl.txt
write.pl.txt

MaxMind::DB::Writer Merge data in Ip ranges

I have a Perl script that generates MMDB files from IP ranges without any issue for simple data.
I am using MySQL to fetch a set of data from the database.
What I now want is, for example, something like this:

data -> ip start -> ip end
123 -> 0.0.0.0 -> 5.5.5.5
321 -> 1.1.1.1 -> 5.5.5.5 

Current Code:

my %types = (
    data => 'utf8_string',
);

# Runs in a loop over the rows fetched from the database;
# $row is the current row, containing data as shown above.
$tree->insert_range(
    $row[1], $row[2],
    {
        data => $row[0],    # data gets overridden here
    },
);

So the created MMDB should, let's say, read data 123 for IP 0.0.0.0 and 123,321 for IP 5.5.5.5.
Is it possible to add/append data to an existing key? I tried using recurse as the merge strategy, but it didn't work; I still get only data 321 for IP 5.5.5.5.

Can anyone help me with how to include/add/append data to existing keys?
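
One hedged workaround, assuming the recurse merge strategy merges nested maps key by key as its documentation suggests: store the IDs as keys of a nested map rather than as a single scalar value, so overlapping ranges accumulate keys instead of overwriting. The boolean fallback type for the ID keys is an assumption.

use MaxMind::DB::Writer::Tree;

my %types = ( data => 'map' );

my $tree = MaxMind::DB::Writer::Tree->new(
    ip_version            => 4,
    record_size           => 24,
    database_type         => 'my-data',
    languages             => ['en'],
    description           => { en => 'merged ranges' },
    merge_strategy        => 'recurse',
    # Keys not listed in %types are the numeric IDs stored under "data".
    map_key_type_callback => sub { $types{ $_[0] } // 'boolean' },
);

$tree->insert_range( '0.0.0.0', '5.5.5.5', { data => { 123 => 1 } } );
$tree->insert_range( '1.1.1.1', '5.5.5.5', { data => { 321 => 1 } } );

# A lookup of 5.5.5.5 should now see { data => { 123 => 1, 321 => 1 } },
# while 0.0.0.0 sees only { data => { 123 => 1 } }.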

Merge strategy that does not create new 'map' data structures

Say we want to record what color sheds are painted in the region an IP is located in:

123.123.123.123:
{
    "sheds" : {
        "number" : 6,
        "color" : "green"
    }
}

These records are inserted in two parts. The first insert_network call populates the number. The second populates the color.

I would like a flag on the insert_network (and insert_range) calls that lets me avoid ending up with data structures like the following when inserting a color for a region that is not covered by a number:

128.128.128.128:
{
    "sheds" : {
        "color" : "green" 
    }
}

i.e. the second call to insert_network (or insert_range) will not create a map for sheds if a map does not already exist.

Module does not work on Windows (was: Not able to install the module via metacpan.org)

When I try to install the module MaxMind::DB::Writer on my local system with the command:

cpanm MaxMind::DB::Writer

it fails.

I am pasting the build.log file which is generated while installing. It seems the file tree.c has some errors.

---------------build.log---------------------
cpanm (App::cpanminus) 1.7004 on perl 5.018002 built for MSWin32-x64-multi-thread
Work directory is C:\Users\manish.m/.cpanm/work/1406179481.5480
You have make C:\strawberry\c\bin\dmake.exe
You have LWP 6.05
Falling back to Archive::Tar 1.96
Searching MaxMind::DB::Writer on cpanmetadb ...
--> Working on MaxMind::DB::Writer
Fetching http://www.cpan.org/authors/id/D/DR/DROLSKY/MaxMind-DB-Writer-0.050005.tar.gz
-> OK
Unpacking MaxMind-DB-Writer-0.050005.tar.gz
Entering MaxMind-DB-Writer-0.050005
Checking configure dependencies from META.json
Checking if you have Module::Build 0.3601 ... Yes (0.4205)
Checking if you have Module::Build 0.36 ... Yes (0.4205)
Configuring MaxMind-DB-Writer-0.050005
Running Build.PL
Checking for unsigned __int128... yes
Created MYMETA.yml and MYMETA.json
Creating new 'Build' script for 'MaxMind-DB-Writer' version '0.050005'
-> OK
Checking dependencies from MYMETA.json ...
Checking if you have Devel::Dwarn 0 ... Yes (undef)
Checking if you have Data::Printer 0 ... Yes (0.35)
Checking if you have List::AllUtils 0 ... Yes (0.08)
Checking if you have constant 0 ... Yes (1.27)
Checking if you have Encode 0 ... Yes (2.59)
Checking if you have bytes 0 ... Yes (1.04)
Checking if you have Test::Deep::NoTest 0 ... Yes (undef)
Checking if you have Digest::MD5 0 ... Yes (2.53)
Checking if you have warnings 0 ... Yes (1.18)
Checking if you have Sereal::Encoder 0 ... Yes (3.001)
Checking if you have Digest::SHA1 0 ... Yes (2.13)
Checking if you have MaxMind::DB::Metadata 0 ... Yes (0.031003)
Checking if you have Test::Bits 0 ... Yes (0.02)
Checking if you have Test::More 0.88 ... Yes (1.001003)
Checking if you have Scalar::Util 0 ... Yes (1.38)
Checking if you have ExtUtils::CBuilder 0 ... Yes (0.280216)
Checking if you have Net::Works::Network 0.16 ... Yes (0.18)
Checking if you have Math::Int128 0.06 ... Yes (0.13)
Checking if you have Net::Works::Address 0 ... Yes (0.18)
Checking if you have Test::Requires 0 ... Yes (0.07)
Checking if you have XSLoader 0 ... Yes (0.16)
Checking if you have Module::Build 0.3601 ... Yes (0.4205)
Checking if you have namespace::autoclean 0 ... Yes (0.19)
Checking if you have File::Temp 0 ... Yes (0.2304)
Checking if you have utf8 0 ... Yes (1.10)
Checking if you have MooseX::StrictConstructor 0 ... Yes (0.19)
Checking if you have Test::Fatal 0 ... Yes (0.013)
Checking if you have Carp 0 ... Yes (1.3301)
Checking if you have lib 0 ... Yes (0.63)
Checking if you have IO::Handle 0 ... Yes (1.34)
Checking if you have Exporter 0 ... Yes (5.68)
Checking if you have Moose 0 ... Yes (2.1204)
Checking if you have Test::MaxMind::DB::Common::Data 0 ... Yes (0.031003)
Checking if you have Moose::Util::TypeConstraints 0 ... Yes (2.1204)
Checking if you have Net::Works 0.16 ... Yes (0.18)
Checking if you have MaxMind::DB::Role::Debugs 0 ... Yes (0.031003)
Checking if you have List::Util 0 ... Yes (1.38)
Checking if you have MaxMind::DB::Reader::Decoder 0 ... Yes (0.050005)
Checking if you have Data::Dumper::Concise 0 ... Yes (2.022)
Checking if you have strict 0 ... Yes (1.07)
Checking if you have autodie 0 ... Yes (2.25)
Checking if you have MaxMind::DB::Common 0.031003 ... Yes (0.031003)
Checking if you have Data::IEEE754 0 ... Yes (0.01)
Building and testing MaxMind-DB-Writer-0.050005
Building MaxMind-DB-Writer
gcc -c -I"c" -s -O2 -DWIN32 -DWIN64 -DCONSERVATIVE -DPERL_TEXTMODE_SCRIPTS -DPERL_IMPLICIT_CONTEXT -DPERL_IMPLICIT_SYS -DUSE_PERLIO -fno-strict-aliasing -mms-bitfields -std=c99 -fms-extensions -Wall -g -s -O2 -I"C:\strawberry\perl\lib\CORE" -I"C:\strawberry\c\include" -o "c\perl_math_int128.o" "c\perl_math_int128.c"
In file included from C:\strawberry\perl\lib\CORE/sys/socket.h:180:0,
from C:\strawberry\perl\lib\CORE/win32.h:381,
from C:\strawberry\perl\lib\CORE/win32thread.h:4,
from C:\strawberry\perl\lib\CORE/perl.h:2869,
from c\perl_math_int128.c:11:
C:\strawberry\perl\lib\CORE/win32.h:386:26: warning: "/" within comment [-Wcomment]
C:\strawberry\perl\lib\CORE/win32.h:387:33: warning: "/
" within comment [-Wcomment]
In file included from C:\strawberry\perl\lib\CORE/win32thread.h:4:0,
from C:\strawberry\perl\lib\CORE/perl.h:2869,
from c\perl_math_int128.c:11:
C:\strawberry\perl\lib\CORE/win32.h:386:26: warning: "/" within comment [-Wcomment]
C:\strawberry\perl\lib\CORE/win32.h:387:33: warning: "/
" within comment [-Wcomment]
gcc -c -I"c" -s -O2 -DWIN32 -DWIN64 -DCONSERVATIVE -DPERL_TEXTMODE_SCRIPTS -DPERL_IMPLICIT_CONTEXT -DPERL_IMPLICIT_SYS -DUSE_PERLIO -fno-strict-aliasing -mms-bitfields -std=c99 -fms-extensions -Wall -g -s -O2 -I"C:\strawberry\perl\lib\CORE" -I"C:\strawberry\c\include" -o "c\perl_math_int64.o" "c\perl_math_int64.c"
In file included from C:\strawberry\perl\lib\CORE/sys/socket.h:180:0,
from C:\strawberry\perl\lib\CORE/win32.h:381,
from C:\strawberry\perl\lib\CORE/win32thread.h:4,
from C:\strawberry\perl\lib\CORE/perl.h:2869,
from c\perl_math_int64.c:11:
C:\strawberry\perl\lib\CORE/win32.h:386:26: warning: "/" within comment [-Wcomment]
C:\strawberry\perl\lib\CORE/win32.h:387:33: warning: "/
" within comment [-Wcomment]
In file included from C:\strawberry\perl\lib\CORE/win32thread.h:4:0,
from C:\strawberry\perl\lib\CORE/perl.h:2869,
from c\perl_math_int64.c:11:
C:\strawberry\perl\lib\CORE/win32.h:386:26: warning: "/" within comment [-Wcomment]
C:\strawberry\perl\lib\CORE/win32.h:387:33: warning: "/
" within comment [-Wcomment]
gcc -c -I"c" -s -O2 -DWIN32 -DWIN64 -DCONSERVATIVE -DPERL_TEXTMODE_SCRIPTS -DPERL_IMPLICIT_CONTEXT -DPERL_IMPLICIT_SYS -DUSE_PERLIO -fno-strict-aliasing -mms-bitfields -std=c99 -fms-extensions -Wall -g -s -O2 -I"C:\strawberry\perl\lib\CORE" -I"C:\strawberry\c\include" -o "c\tree.o" "c\tree.c"
In file included from C:\strawberry\perl\lib\CORE/sys/socket.h:180:0,
from C:\strawberry\perl\lib\CORE/win32.h:381,
from C:\strawberry\perl\lib\CORE/win32thread.h:4,
from C:\strawberry\perl\lib\CORE/perl.h:2869,
from c\tree.h:2,
from c\tree.c:1:
C:\strawberry\perl\lib\CORE/win32.h:386:26: warning: "/" within comment [-Wcomment]
C:\strawberry\perl\lib\CORE/win32.h:387:33: warning: "/
" within comment [-Wcomment]
In file included from C:\strawberry\perl\lib\CORE/win32thread.h:4:0,
from C:\strawberry\perl\lib\CORE/perl.h:2869,
from c\tree.h:2,
from c\tree.c:1:
C:\strawberry\perl\lib\CORE/win32.h:386:26: warning: "/" within comment [-Wcomment]
C:\strawberry\perl\lib\CORE/win32.h:387:33: warning: "/
" within comment [-Wcomment]
c\tree.c: In function 'resolve_network':
c\tree.c:173:46: error: 'AI_V4MAPPED' undeclared (first use in this function)
c\tree.c:173:46: note: each undeclared identifier is reported only once for each function it appears in
c\tree.c: In function 'merge_records':
c\tree.c:557:9: warning: implicit declaration of function 'inet_ntop' [-Wimplicit-function-declaration]
error building dll file from 'c/tree.c' at C:/strawberry/perl/lib/ExtUtils/CBuilder/Platform/Windows.pm line 130.
-> FAIL Installing MaxMind::DB::Writer failed. See C:\Users\manish.m.cpanm\work\1406179481.5480\build.log for details. Retry with --force to force install it.

Output depends on input order

I expected to get the same DB generated regardless of the order of my input, but that's not the case.

For example, if I create a tree with the following 2 IP ranges and input them in the order shown below then the tree contains what I'd expect:

66.17.128.0/17
66.17.185.0/24

But if I add them in the opposite order (66.17.185.0/24 first), then 66.17.185.0/24 doesn't appear in the tree at all.

Here's a simple writer script that I'm using to test this:

#!/usr/bin/perl
use strict;
use warnings;

use MaxMind::DB::Writer::Tree;
use Net::Works::Network;

my $tree = MaxMind::DB::Writer::Tree->new(
    ip_version            => 4,
    record_size           => 24,
    database_type         => 'test',
    languages             => ['en'],
    description           => { en => 'ipinfo data' },
    map_key_type_callback => sub { 'utf8_string' },
);

foreach my $line (<STDIN>) {
    chomp $line;
    $tree->insert_network(
        Net::Works::Network->new_from_string( string => $line ),
        { prefix => $line },
    );
}

open my $fh, '>:bytes', 'written.mmdb' or die "Cannot open written.mmdb: $!";
$tree->write_tree($fh);

I also have a simple read script, that looks up the given IP in the output database. Here's what happens with the ranges in the first order:

$ echo -e "66.17.128.0/17\n66.17.185.0/24" | ./write
$ ./read 66.17.185.5
{ prefix: '66.17.185.0/24' }
$ ./read 66.17.128.1
{ prefix: '66.17.128.0/17' }

$ strings written.mmdb | grep -o 66.*
66.17.128.0/17
66.17.185.0/24

And here's the second:

$ echo -e "66.17.185.0/24\n66.17.128.0/17" | ./write
$ ./read 66.17.185.5
{ prefix: '66.17.128.0/17' }
$ ./read 66.17.128.1
{ prefix: '66.17.128.0/17' }

$ strings written.mmdb | grep -o 66.*
66.17.128.0/17

I had expected the same database to be produced regardless of the order in which networks are added to the tree.
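
As a hedged workaround, sorting the input so that larger networks (shorter prefixes) are inserted before more specific ones reproduces the first, working ordering; a sketch against the writer script above:

# Assumes one CIDR string per line on STDIN, as in the writer script above.
my @lines = <STDIN>;
chomp @lines;

# Shorter prefix lengths (larger networks) first, e.g. /17 before /24.
for my $line ( sort { ( split '/', $a )[1] <=> ( split '/', $b )[1] } @lines ) {
    $tree->insert_network(
        Net::Works::Network->new_from_string( string => $line ),
        { prefix => $line },
    );
}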

Can't locate object method "_encode_utf8-string" via package "MaxMind::DB::Writer::Serializer"

The following Perl script yields this error:

Can't locate object method "_encode_utf8-string" via package "MaxMind::DB::Writer::Serializer" at /usr/local/lib/perl5/site_perl/5.24.0/x86_64-linux/MaxMind/DB/Writer/Serializer.pm line 209

And this is the perl file content:

#!/usr/bin/perl

use MaxMind::DB::Writer::Tree;

my %types = (
    names     => 'map',
    city      => 'map',
    continent => 'map',
    code      => 'utf8-string',    # note: hyphen instead of 'utf8_string'
    country   => 'map',
    iso_code  => 'utf8-string',    # note: hyphen instead of 'utf8_string'
    location  => 'map',
    latitude  => 'double',
    longitude => 'double',
    province  => [ 'array', 'utf8_string' ],
    en        => 'utf8_string',
    "zh-CN"   => 'utf8_string',
);

my $tree = MaxMind::DB::Writer::Tree->new(
    ip_version            => 4,
    record_size           => 24,
    database_type         => 'My-IP-Data',
    languages             => ['en'],
    description           => { en => 'My database of IP data' },
    map_key_type_callback => sub { $types{ $_[0] } },
);

$tree->insert_range(
    '122.226.116.1', '122.226.116.255',
    {
        city      => { names => { "zh-CN" => "金华", en => 'Jinhua' } },
        continent => { names => { "zh-CN" => "亚洲", en => 'AS' }, code => 'AS' },
        country   => { names => { "zh-CN" => "**", en => 'China' } },
        location  => { latitude => 42.362600, longitude => -71.084300 },
        province  => [ 'Fido', 'Ms. Pretty Paws' ],
    },
);

open my $fh, '>:raw', '/root/MaxMind-DB-Writer-0.201000/test.mmdb';
$tree->write_tree($fh);

Subnets are randomly dropped on merge

When adding a subnet which is part of a larger one, the former is dropped randomly.

I have attached two files, both created with the exact same code (also attached). In about half of the cases the smaller subnet is missing from the file:


example_code.zip
mmdbs.zip

Perl Version: perl 5, version 26, subversion 1 (v5.26.1) built for x86_64-linux-gnu-thread-multi

IPv4 parent record and alias records are immutable

Right now it is possible to remove the IPv4 parent record or alias records by removing or replacing the subtree that they are in. This can lead to the unintentional removal of an IPv4 alias or, worse, leave the aliases with a dangling pointer to the IPv4 range. This is unlikely to happen when merging is enabled unless the subtree is explicitly removed or force_overwrite is used, but it would be very easy to accidentally do this when merging is disabled.

I'd propose that some records be marked as immutable, including all alias records and the IPv4 range. When removing or replacing a subtree, these records would remain unchanged (although the records in nodes that they point to may be updated).

Can't use string as a HASH ref in DB/Writer/Tree.pm

How do I deal with the following?

Can't use string ("101.215.51.0/24") as a HASH ref while "strict refs" in use at /Library/Perl/5.18/darwin-thread-multi-2level/MaxMind/DB/Writer/Tree.pm line 203.

A simple IP like "1.2.3.4/24" works well.

__int128 is not supported on this target

Building and testing MaxMind-DB-Writer-0.100002 ... Building MaxMind-DB-Writer
cc -Ic -I/System/Library/Perl/5.18/darwin-thread-multi-2level/CORE -Wall -D__INT128 -DINT64_T -g -std=c99 -fms-extensions -c -arch x86_64 -arch i386 -g -pipe -fno-common -DPERL_DARWIN -fno-strict-aliasing -fstack-protector -Os -o c/perl_math_int128.o c/perl_math_int128.c
cc -Ic -I/System/Library/Perl/5.18/darwin-thread-multi-2level/CORE -Wall -D__INT128 -DINT64_T -g -std=c99 -fms-extensions -c -arch x86_64 -arch i386 -g -pipe -fno-common -DPERL_DARWIN -fno-strict-aliasing -fstack-protector -Os -o c/perl_math_int64.o c/perl_math_int64.c
cc -Ic -I/System/Library/Perl/5.18/darwin-thread-multi-2level/CORE -Wall -D__INT128 -DINT64_T -g -std=c99 -fms-extensions -c -arch x86_64 -arch i386 -g -pipe -fno-common -DPERL_DARWIN -fno-strict-aliasing -fstack-protector -Os -o c/tree.o c/tree.c
In file included from c/tree.c:1:
c/tree.h:46:9: error: __int128 is not supported on this target
typedef __int128 int128_t;
        ^
c/tree.h:47:18: error: __int128 is not supported on this target
typedef unsigned __int128 uint128_t;
                 ^
2 errors generated.

This error occurs on Mac OS X 10.10.3. Sorry, maybe this is my error.

MaxMind::DB::Writer 0.300002 fails to compile on Ubuntu 18.04

Hello,

We found an issue with the compilation of the module, as described below.
Note: the previous version compiles successfully.

Building and testing MaxMind-DB-Writer-0.300002 ... FAIL
! Installing MaxMind::DB::Writer failed. See /root/.cpanm/work/1530095179.266502/build.log for details. Retry with --force to force install it.

less /root/.cpanm/work/1530095179.266502/build.log

In file included from c/tree.h:3:0,
                 from c/tree.c:1:
c/tree.c: In function 'insert_network':
/usr/lib/x86_64-linux-gnu/perl/5.26/CORE/perl.h:176:16: error: 'my_perl' undeclared (first use in this function); did you mean 'my_fork'?
 #  define aTHX my_perl
                ^

Regards,

Test does not cause segfault

The test at https://gist.github.com/oschwald/62d5304c0abf688c2f65e2ba6a03a2d4 causes a segfault in every version of the writer I have tested (0.100006 and greater). We seem to end up in an infinite recursion until the stack overflows. This is unrelated to the other recent bugs, from what I can tell.

Abridged backtrace:

#0  0x00007f6a477ab88f in _int_malloc (av=av@entry=0x7f6a47aeec00 <main_arena>, bytes=bytes@entry=16) at malloc.c:3326
#1  0x00007f6a477ae4ae in __GI___libc_malloc (bytes=bytes@entry=16) at malloc.c:2895
#2  0x00007f6a42f392f9 in checked_malloc (size=size@entry=16) at c/tree.c:1758
#3  0x00007f6a42f39ddd in insert_record_for_network (tree=tree@entry=0x46e68d0, network=network@entry=0x7ffd4effe160, 
    new_record=new_record@entry=0x7ffd4f7f38f0, merge_record_collisions=merge_record_collisions@entry=false) at c/tree.c:632
#4  0x00007f6a42f39e34 in insert_record_for_network (tree=tree@entry=0x46e68d0, network=network@entry=0x7ffd4effe1d0, 
    new_record=new_record@entry=0x7ffd4f7f38f0, merge_record_collisions=merge_record_collisions@entry=false) at c/tree.c:646
#5  0x00007f6a42f39e34 in insert_record_for_network (tree=tree@entry=0x46e68d0, network=network@entry=0x7ffd4effe240, 
    new_record=new_record@entry=0x7ffd4f7f38f0, merge_record_collisions=merge_record_collisions@entry=false) at c/tree.c:646
#6  0x00007f6a42f39e34 in insert_record_for_network (tree=tree@entry=0x46e68d0, network=network@entry=0x7ffd4effe2b0, 
    new_record=new_record@entry=0x7ffd4f7f38f0, merge_record_collisions=merge_record_collisions@entry=false) at c/tree.c:646
#7  0x00007f6a42f39e34 in insert_record_for_network (tree=tree@entry=0x46e68d0, network=network@entry=0x7ffd4effe320, 
    new_record=new_record@entry=0x7ffd4f7f38f0, merge_record_collisions=merge_record_collisions@entry=false) at c/tree.c:646
#8  0x00007f6a42f39e34 in insert_record_for_network (tree=tree@entry=0x46e68d0, network=network@entry=0x7ffd4effe390, 
    new_record=new_record@entry=0x7ffd4f7f38f0, merge_record_collisions=merge_record_collisions@entry=false) at c/tree.c:646
#9  0x00007f6a42f39e34 in insert_record_for_network (tree=tree@entry=0x46e68d0, network=network@entry=0x7ffd4effe400, 
    new_record=new_record@entry=0x7ffd4f7f38f0, merge_record_collisions=merge_record_collisions@entry=false) at c/tree.c:646
#10 0x00007f6a42f39e34 in insert_record_for_network (tree=tree@entry=0x46e68d0, network=network@entry=0x7ffd4effe470, 
    new_record=new_record@entry=0x7ffd4f7f38f0, merge_record_collisions=merge_record_collisions@entry=false) at c/tree.c:646
#11 0x00007f6a42f39e34 in insert_record_for_network (tree=tree@entry=0x46e68d0, network=network@entry=0x7ffd4effe4e0, 
    new_record=new_record@entry=0x7ffd4f7f38f0, merge_record_collisions=merge_record_collisions@entry=false) at c/tree.c:646
#12 0x00007f6a42f39e34 in insert_record_for_network (tree=tree@entry=0x46e68d0, network=network@entry=0x7ffd4effe550, 
    new_record=new_record@entry=0x7ffd4f7f38f0, merge_record_collisions=merge_record_collisions@entry=false) at c/tree.c:646
#13 0x00007f6a42f39e34 in insert_record_for_network (tree=tree@entry=0x46e68d0, network=network@entry=0x7ffd4effe5c0, 
    new_record=new_record@entry=0x7ffd4f7f38f0, merge_record_collisions=merge_record_collisions@entry=false) at c/tree.c:646
#14 0x00007f6a42f39e34 in insert_record_for_network (tree=tree@entry=0x46e68d0, network=network@entry=0x7ffd4effe630, 
    new_record=new_record@entry=0x7ffd4f7f38f0, merge_record_collisions=merge_record_collisions@entry=false) at c/tree.c:646
#15 0x00007f6a42f39e34 in insert_record_for_network (tree=tree@entry=0x46e68d0, network=network@entry=0x7ffd4effe6a0, 
    new_record=new_record@entry=0x7ffd4f7f38f0, merge_record_collisions=merge_record_collisions@entry=false) at c/tree.c:646
#16 0x00007f6a42f39e34 in insert_record_for_network (tree=tree@entry=0x46e68d0, network=network@entry=0x7ffd4effe710, 
    new_record=new_record@entry=0x7ffd4f7f38f0, merge_record_collisions=merge_record_collisions@entry=false) at c/tree.c:646
#17 0x00007f6a42f39e34 in insert_record_for_network (tree=tree@entry=0x46e68d0, network=network@entry=0x7ffd4effe780, 
    new_record=new_record@entry=0x7ffd4f7f38f0, merge_record_collisions=merge_record_collisions@entry=false) at c/tree.c:646
#18 0x00007f6a42f39e34 in insert_record_for_network (tree=tree@entry=0x46e68d0, network=network@entry=0x7ffd4effe7f0, 
    new_record=new_record@entry=0x7ffd4f7f38f0, merge_record_collisions=merge_record_collisions@entry=false) at c/tree.c:646
#19 0x00007f6a42f39e34 in insert_record_for_network (tree=tree@entry=0x46e68d0, network=network@entry=0x7ffd4effe860, 
    new_record=new_record@entry=0x7ffd4f7f38f0, merge_record_collisions=merge_record_collisions@entry=false) at c/tree.c:646
#20 0x00007f6a42f39e34 in insert_record_for_network (tree=tree@entry=0x46e68d0, network=network@entry=0x7ffd4effe8d0, 
    new_record=new_record@entry=0x7ffd4f7f38f0, merge_record_collisions=merge_record_collisions@entry=false) at c/tree.c:646
#21 0x00007f6a42f39e34 in insert_record_for_network (tree=tree@entry=0x46e68d0, network=network@entry=0x7ffd4effe940, 
    new_record=new_record@entry=0x7ffd4f7f38f0, merge_record_collisions=merge_record_collisions@entry=false) at c/tree.c:646
#22 0x00007f6a42f39e34 in insert_record_for_network (tree=tree@entry=0x46e68d0, network=network@entry=0x7ffd4effe9b0, 
    new_record=new_record@entry=0x7ffd4f7f38f0, merge_record_collisions=merge_record_collisions@entry=false) at c/tree.c:646
#23 0x00007f6a42f39e34 in insert_record_for_network (tree=tree@entry=0x46e68d0, network=network@entry=0x7ffd4effea20, 
    new_record=new_record@entry=0x7ffd4f7f38f0, merge_record_collisions=merge_record_collisions@entry=false) at c/tree.c:646
#24 0x00007f6a42f39e34 in insert_record_for_network (tree=tree@entry=0x46e68d0, network=network@entry=0x7ffd4effea90, 
    new_record=new_record@entry=0x7ffd4f7f38f0, merge_record_collisions=merge_record_collisions@entry=false) at c/tree.c:646
#25 0x00007f6a42f39e34 in insert_record_for_network (tree=tree@entry=0x46e68d0, network=network@entry=0x7ffd4effeb00, 
    new_record=new_record@entry=0x7ffd4f7f38f0, merge_record_collisions=merge_record_collisions@entry=false) at c/tree.c:646
#26 0x00007f6a42f39e34 in insert_record_for_network (tree=tree@entry=0x46e68d0, network=network@entry=0x7ffd4effeb70, 
    new_record=new_record@entry=0x7ffd4f7f38f0, merge_record_collisions=merge_record_collisions@entry=false) at c/tree.c:646
#27 0x00007f6a42f39e34 in insert_record_for_network (tree=tree@entry=0x46e68d0, network=network@entry=0x7ffd4effebe0, 
    new_record=new_record@entry=0x7ffd4f7f38f0, merge_record_collisions=merge_record_collisions@entry=false) at c/tree.c:646
#28 0x00007f6a42f39e34 in insert_record_for_network (tree=tree@entry=0x46e68d0, network=network@entry=0x7ffd4effec50, 
    new_record=new_record@entry=0x7ffd4f7f38f0, merge_record_collisions=merge_record_collisions@entry=false) at c/tree.c:646

How to support both IPv4 and IPv6 inside the same MMDB file?

Hi,

The current example contains the following:

my $tree = MaxMind::DB::Writer::Tree->new(
    ip_version            => 6,
    record_size           => 24,
    database_type         => 'My-IP-Data',
    languages             => ['en'],
    description           => { en => 'My database of IP data' },
    map_key_type_callback => sub { $types{ $_[0] } },
);

my $network
    = Net::Works::Network->new_from_string( string => '8.23.0.0/16' );

$tree->insert_network(
    $network,
    {
        color => 'blue',
        dogs  => [ 'Fido', 'Ms. Pretty Paws' ],
        size  => 42,
    },
);

This doesn't seem to work, as it tries to add an IPv4 network into an ip_version => 6 tree.

What is the proper, working way to create a single MMDB file that serves both IPv4 and IPv6 lookups (like the official GeoIP2 files from MaxMind)? Should the example also be fixed? Writing two different trees, one IPv4 and one IPv6, to the same file doesn't seem to work.
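
A hedged sketch of one approach: build an ip_version => 6 tree with alias_ipv6_to_ipv4 enabled and insert the IPv4 data at the bottom of the IPv6 space. The '::8.23.0.0/112' spelling of 8.23.0.0/16 is an assumption about how Net::Works parses IPv4-embedded IPv6 strings.

use MaxMind::DB::Writer::Tree;
use Net::Works::Network;

my %types = (
    color => 'utf8_string',
    dogs  => [ 'array', 'utf8_string' ],
    size  => 'uint16',
);

my $tree = MaxMind::DB::Writer::Tree->new(
    ip_version            => 6,
    record_size           => 24,
    database_type         => 'My-IP-Data',
    languages             => ['en'],
    description           => { en => 'My database of IP data' },
    map_key_type_callback => sub { $types{ $_[0] } },
    # Alias the IPv4-mapped ranges (::ffff:0:0/96 etc.) to the IPv4 subtree
    # so IPv4-style lookups resolve to the data inserted below.
    alias_ipv6_to_ipv4    => 1,
);

# 8.23.0.0/16 embedded at the bottom of the IPv6 space (96 + 16 = 112 bits).
$tree->insert_network(
    Net::Works::Network->new_from_string( string => '::8.23.0.0/112' ),
    { color => 'blue', dogs => [ 'Fido', 'Ms. Pretty Paws' ], size => 42 },
);

# A pure IPv6 network can be inserted into the same tree alongside it.
$tree->insert_network(
    Net::Works::Network->new_from_string( string => '2001:db8::/48' ),
    { color => 'green', dogs => ['Rex'], size => 7 },
);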

Bad IP interpreting in insert_network()

OS: FreeBSD 9.3
Perl: revision 5 version 16 subversion 3
MaxMind::DB::Writer is successfully built from ports.
MaxMind::DB::Writer version 0.100004 (also tried 0.100005)
Net::Works::Network version 0.21

During test script evaluation I'm getting an error:

Bad IP address: 2001:db8:: - Invalid value for hints

It happens somewhere inside the insert_network method.

IPv4 and IPv6 data conflict

Hi,

I found a problem when using both IPv4 and IPv6 to generate the ip2region.mmdb file: when the IPv4 segment is converted to IPv6, and the IPv6 data also covers this segment, the result is that the region information for IPv4 is overwritten by the IPv6 data.

Thank you.

Issue in insert_network where a raw string works but the same string in a variable does not

use MaxMind::DB::Writer::Tree;
use strict;

my %types = (
    autonomous_system_number       => 'int32',
    autonomous_system_organization => 'utf8_string',
);

my $tree = MaxMind::DB::Writer::Tree->new(
    ip_version            => 4,
    record_size           => 32,
    database_type         => 'GeoLite2-ASN',
    languages             => ['en'],
    description           => { en => 'My database of IP data' },
    map_key_type_callback => sub { $types{ $_[0] } },
);

open( my $data, '<', "output.csv" ) or die;

while ( my $line = <$data> ) {
    chomp $line;

    # Split the line and store it inside the words array.
    # Note: the fields in output.csv are quoted, so $words[0] comes out as
    # "1.0.0.0/24" with literal double quotes (see the output below).
    my @words = split ",", $line;

    my $ip = $words[0];
    print "$words[0], $words[1], $words[2] \n";
    $tree->insert_network(
        $ip,
        {
            autonomous_system_number       => 0 + $words[1],
            autonomous_system_organization => $words[2],
        },
    );
}

open my $fh, '>:raw', 'my-ip-data.mmdb';
$tree->write_tree($fh);

~/git/Asn_test $ perl mmdb_asn_creater.pl
perl: warning: Setting locale failed.
perl: warning: Please check that your locale settings:
    LC_ALL = (unset),
    LC_CTYPE = "en_IN.UTF-8",
    LANG = (unset)
    are supported and installed on your system.
perl: warning: Falling back to the standard locale ("C").
"1.0.0.0/24", "13335", "cloudflare"
Argument "24"" isn't numeric in int at /Users/ajain/perl5/lib/perl5/darwin-thread-multi-2level/MaxMind/DB/Writer/Tree.pm line 197, <$data> line 1.
Invalid IP address: "1.0.0.0 at /Users/ajain/perl5/lib/perl5/darwin-thread-multi-2level/MaxMind/DB/Writer/Tree.pm line 206, <$data> line 1.
~/git/Asn_test $

As you can see, $words[0] is "1.0.0.0/24" and it gives an error, but when I put "1.0.0.0/24" directly into insert_network it works fine.
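
Both error messages show a literal double quote inside the value ('24"' and '"1.0.0.0'), which suggests the quoted CSV fields are passed through unstripped by the plain split. A hedged sketch of stripping the quotes (a real CSV parser such as Text::CSV would also work):

my @words = split ",", $line;

# Strip surrounding whitespace and double quotes from each field, so
# $words[0] becomes 1.0.0.0/24 rather than "1.0.0.0/24".
s/^\s*"|"\s*$//g for @words;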

Parse errors: No plan found in TAP output

I am unable to install on Mac. Earlier I was able to install. Then I installed MongoDB and I started getting errors like this in my script:


Invalid version format (version required) at /Users/mandeep/perl5/lib/perl5/Module/Runtime.pm line 386.
BEGIN failed--compilation aborted at /Users/mandeep/perl5/lib/perl5/darwin-thread-multi-2level/MaxMind/DB/Writer/Serializer.pm line 17.
Compilation failed in require at /Users/mandeep/perl5/lib/perl5/darwin-thread-multi-2level/MaxMind/DB/Writer/Tree.pm line 18.
BEGIN failed--compilation aborted at /Users/mandeep/perl5/lib/perl5/darwin-thread-multi-2level/MaxMind/DB/Writer/Tree.pm line 18.
Compilation failed in require at my_custom_db.pl line 8.
BEGIN failed--compilation aborted at my_custom_db.pl line 8.

I tried reinstalling but now it does not install properly. Here is the test summary


Test Summary Report
-------------------
t/MaxMind/DB/Writer/Serializer-deduplication.t           (Wstat: 65280 Tests: 0 Failed: 0)
  Non-zero exit status: 255
  Parse errors: No plan found in TAP output
t/MaxMind/DB/Writer/Serializer-large-pointer.t           (Wstat: 65280 Tests: 0 Failed: 0)
  Non-zero exit status: 255
  Parse errors: No plan found in TAP output
t/MaxMind/DB/Writer/Serializer-types/array.t             (Wstat: 65280 Tests: 0 Failed: 0)
  Non-zero exit status: 255
  Parse errors: No plan found in TAP output
t/MaxMind/DB/Writer/Serializer-types/boolean.t           (Wstat: 65280 Tests: 0 Failed: 0)
  Non-zero exit status: 255
  Parse errors: No plan found in TAP output
t/MaxMind/DB/Writer/Serializer-types/bytes.t             (Wstat: 65280 Tests: 0 Failed: 0)
  Non-zero exit status: 255
  Parse errors: No plan found in TAP output
t/MaxMind/DB/Writer/Serializer-types/double.t            (Wstat: 65280 Tests: 0 Failed: 0)
  Non-zero exit status: 255
  Parse errors: No plan found in TAP output
t/MaxMind/DB/Writer/Serializer-types/end_marker.t        (Wstat: 65280 Tests: 0 Failed: 0)
  Non-zero exit status: 255
  Parse errors: No plan found in TAP output
t/MaxMind/DB/Writer/Serializer-types/float.t             (Wstat: 65280 Tests: 0 Failed: 0)
  Non-zero exit status: 255
  Parse errors: No plan found in TAP output
t/MaxMind/DB/Writer/Serializer-types/int32.t             (Wstat: 65280 Tests: 0 Failed: 0)
  Non-zero exit status: 255
  Parse errors: No plan found in TAP output
t/MaxMind/DB/Writer/Serializer-types/map.t               (Wstat: 65280 Tests: 0 Failed: 0)
  Non-zero exit status: 255
  Parse errors: No plan found in TAP output
t/MaxMind/DB/Writer/Serializer-types/pointer.t           (Wstat: 65280 Tests: 0 Failed: 0)
  Non-zero exit status: 255
  Parse errors: No plan found in TAP output
t/MaxMind/DB/Writer/Serializer-types/uint128.t           (Wstat: 65280 Tests: 0 Failed: 0)
  Non-zero exit status: 255
  Parse errors: No plan found in TAP output
t/MaxMind/DB/Writer/Serializer-types/uint16.t            (Wstat: 65280 Tests: 0 Failed: 0)
  Non-zero exit status: 255
  Parse errors: No plan found in TAP output
t/MaxMind/DB/Writer/Serializer-types/uint32.t            (Wstat: 65280 Tests: 0 Failed: 0)
  Non-zero exit status: 255
  Parse errors: No plan found in TAP output
t/MaxMind/DB/Writer/Serializer-types/uint64.t            (Wstat: 65280 Tests: 0 Failed: 0)
  Non-zero exit status: 255
  Parse errors: No plan found in TAP output
t/MaxMind/DB/Writer/Serializer-types/utf8_string.t       (Wstat: 65280 Tests: 0 Failed: 0)
  Non-zero exit status: 255
  Parse errors: No plan found in TAP output
t/MaxMind/DB/Writer/Serializer-utf8-as-bytes.t           (Wstat: 65280 Tests: 0 Failed: 0)
  Non-zero exit status: 255
  Parse errors: No plan found in TAP output
t/MaxMind/DB/Writer/Serializer-utf8-round-trip.t         (Wstat: 65280 Tests: 0 Failed: 0)
  Non-zero exit status: 255
  Parse errors: No plan found in TAP output
t/MaxMind/DB/Writer/Serializer.t                         (Wstat: 65280 Tests: 0 Failed: 0)
  Non-zero exit status: 255
  Parse errors: No plan found in TAP output
t/MaxMind/DB/Writer/Tree-bigint.t                        (Wstat: 65280 Tests: 0 Failed: 0)
  Non-zero exit status: 255
  Parse errors: No plan found in TAP output
t/MaxMind/DB/Writer/Tree-data-references.t               (Wstat: 65280 Tests: 0 Failed: 0)
  Non-zero exit status: 255
  Parse errors: No plan found in TAP output
t/MaxMind/DB/Writer/Tree-insert-range.t                  (Wstat: 65280 Tests: 0 Failed: 0)
  Non-zero exit status: 255
  Parse errors: No plan found in TAP output
t/MaxMind/DB/Writer/Tree-ipv4-and-6.t                    (Wstat: 65280 Tests: 0 Failed: 0)
  Non-zero exit status: 255
  Parse errors: No plan found in TAP output
t/MaxMind/DB/Writer/Tree-ipv6-aliases.t                  (Wstat: 65280 Tests: 0 Failed: 0)
  Non-zero exit status: 255
  Parse errors: No plan found in TAP output
t/MaxMind/DB/Writer/Tree-iterator.t                      (Wstat: 65280 Tests: 0 Failed: 0)
  Non-zero exit status: 255
  Parse errors: No plan found in TAP output
t/MaxMind/DB/Writer/Tree-output/0-0-0-0.t                (Wstat: 65280 Tests: 0 Failed: 0)
  Non-zero exit status: 255
  Parse errors: No plan found in TAP output
t/MaxMind/DB/Writer/Tree-output/basic.t                  (Wstat: 65280 Tests: 0 Failed: 0)
  Non-zero exit status: 255
  Parse errors: No plan found in TAP output
t/MaxMind/DB/Writer/Tree-output/freeze-thaw-record-size.t (Wstat: 65280 Tests: 0 Failed: 0)
  Non-zero exit status: 255
  Parse errors: No plan found in TAP output
t/MaxMind/DB/Writer/Tree-output/freeze-then-write-bug.t  (Wstat: 65280 Tests: 0 Failed: 0)
  Non-zero exit status: 255
  Parse errors: No plan found in TAP output
t/MaxMind/DB/Writer/Tree-output/ipv6-aliases.t           (Wstat: 65280 Tests: 0 Failed: 0)
  Non-zero exit status: 255
  Parse errors: No plan found in TAP output
t/MaxMind/DB/Writer/Tree-output/record-deduplication.t   (Wstat: 65280 Tests: 0 Failed: 0)
  Non-zero exit status: 255
  Parse errors: No plan found in TAP output
t/MaxMind/DB/Writer/Tree-output/utf8-data.t              (Wstat: 65280 Tests: 0 Failed: 0)
  Non-zero exit status: 255
  Parse errors: No plan found in TAP output
t/MaxMind/DB/Writer/Tree-record-collisions.t             (Wstat: 65280 Tests: 0 Failed: 0)
  Non-zero exit status: 255
  Parse errors: No plan found in TAP output
t/MaxMind/DB/Writer/Tree.t                               (Wstat: 65280 Tests: 0 Failed: 0)
  Non-zero exit status: 255
  Parse errors: No plan found in TAP output
Files=48, Tests=2, 12 wallclock secs ( 0.11 usr  0.09 sys + 10.75 cusr  1.09 csys = 12.04 CPU)
Result: FAIL
Failed 34/48 test programs. 0/2 subtests failed.
-> FAIL Installing MaxMind::DB::Writer failed. See /Users/mandeep/.cpanm/work/1465898384.44897/build.log for details. Retry with --force to force install it.

Converting CSV into MMDB

I am trying to convert the CSV files from the MaxMind website (http://dev.maxmind.com/geoip/geoip2/geolite2/) into an MMDB format.

However, whenever my file is larger than ~10,000 lines, I always get an error like this:

Argument "" isn't numeric in pack at /usr/local/lib64/perl5/MaxMind/DB/Writer/Serializer.pm line 389.
Cannot store an undef as data at /usr/local/lib64/perl5/MaxMind/DB/Writer/Serializer.pm line 93.
	MaxMind::DB::Writer::Serializer::store_data('MaxMind::DB::Writer::Serializer=HASH(0x3afe060)', 'utf8_string', undef, undef) called at /usr/local/lib64/perl5/MaxMind/DB/Writer/Serializer.pm line 355
	MaxMind::DB::Writer::Serializer::_encode_map('MaxMind::DB::Writer::Serializer=HASH(0x3afe060)', 'HASH(0xb2a2e60)', undef) called at /usr/local/lib64/perl5/MaxMind/DB/Writer/Serializer.pm line 211
	MaxMind::DB::Writer::Serializer::_store_data('MaxMind::DB::Writer::Serializer=HASH(0x3afe060)', 'map', 'HASH(0xb2a2e60)', undef) called at /usr/local/lib64/perl5/MaxMind/DB/Writer/Serializer.pm line 120
	MaxMind::DB::Writer::Serializer::store_data('MaxMind::DB::Writer::Serializer=HASH(0x3afe060)', 'map', 'HASH(0xb2a2e60)', undef) called at /usr/local/lib64/perl5/MaxMind/DB/Writer/Serializer.pm line 355
	MaxMind::DB::Writer::Serializer::_encode_map('MaxMind::DB::Writer::Serializer=HASH(0x3afe060)', 'HASH(0xb2b2478)', undef) called at /usr/local/lib64/perl5/MaxMind/DB/Writer/Serializer.pm line 211
	MaxMind::DB::Writer::Serializer::_store_data('MaxMind::DB::Writer::Serializer=HASH(0x3afe060)', 'map', 'HASH(0xb2b2478)', undef) called at /usr/local/lib64/perl5/MaxMind/DB/Writer/Serializer.pm line 120
	MaxMind::DB::Writer::Serializer::store_data('MaxMind::DB::Writer::Serializer=HASH(0x3afe060)', 'map', 'HASH(0xb2b2478)', undef, 'MfnaF9bHUODrHFghs4kNT8R8Aqo') called at /usr/local/lib64/perl5/MaxMind/DB/Writer/Tree.pm line 292
	MaxMind::DB::Writer::Tree::write_tree('MaxMind::DB::Writer::Tree=HASH(0x3006fe8)', 'GLOB(0x28aa338)') called at csvReaderWriter.pl line 148

where "csvReaderWriter.pl" is my file on my local machine.

I have tried running this on a server with 192 GB of RAM, so I don't think it is a memory issue.
Is there anything you think might be causing this?

Thank you
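
The "Cannot store an undef as data" frame in the trace suggests that some record handed to the writer contains an undef value, most likely from an empty CSV field. A hedged sketch of pruning undef values out of a record before inserting it; prune_undef is a hypothetical helper, not part of the writer API:

sub prune_undef {
    my ($data) = @_;
    if ( ref $data eq 'HASH' ) {
        for my $key ( keys %{$data} ) {
            if ( !defined $data->{$key} ) {
                delete $data->{$key};    # drop undef values entirely
            }
            else {
                prune_undef( $data->{$key} );
            }
        }
    }
    elsif ( ref $data eq 'ARRAY' ) {
        @{$data} = map { prune_undef($_); $_ } grep { defined } @{$data};
    }
    return $data;
}

$tree->insert_network( $network, prune_undef($record) );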

How to create large database

It seems the Perl writer expects to hold the entire tree in memory prior to writing it out? How would one go about writing a database larger than the RAM they have available?

Allow deep merging of records when building a tree

Currently the merging of records only works for top-level keys. For example, this:

  {
      families => [ {
          husband => 'Fred',
          wife    => 'Wimla',
      }, ],
      year => 1960,
  }

Merged with this:

    {
        families => [ {
            wife    => 'Wilma',
            child   => 'Pebbles',
        }, {
            husband => 'Barney',
            wife    => 'Betty',
            child   => 'Bamm-Bamm',
        }, ],  
        company => 'Hanna-Barbera Productions',
    }

Gives this:

    {
        families => [ {
            wife    => 'Wilma',
            child   => 'Pebbles',
        }, {
            husband => 'Barney',
            wife    => 'Betty',
            child   => 'Bamm-Bamm',
        }, ],
        year => 1960,  
        company => 'Hanna-Barbera Productions',
    }

Not

    {
        families => [ {
            husband => 'Fred',
            wife    => 'Wilma',    # note replaced value
            child   => 'Pebbles',
        }, {
            husband => 'Barney',
            wife    => 'Betty',
            child   => 'Bamm-Bamm',
        }, ],
        year => 1960,
        company => 'Hanna-Barbera Productions',
    }

This should probably be an optional feature.

Adding net ranges without CIDR notation

All of the examples use CIDR notation when adding network ranges, eg:

my $network = Net::Works::Network->new_from_string( string => '8.23.0.0/16' );

Is it possible to provide a start and end IP instead? For example, I'd like to be able to do something like:

my $network = Net::Works::Network->new_from_string( start => '205.156.219.1', end => '205.156.219.255');

Which would be preferable to having to add the following 8 CIDR subnets that encapsulate that same range (see the sketch after this list):

205.156.219.1/32
205.156.219.2/31
205.156.219.4/30
205.156.219.8/29
205.156.219.16/28
205.156.219.32/27
205.156.219.64/26
205.156.219.128/25
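
For what it's worth, two hedged alternatives, assuming a $tree and a %data hash as in the other examples on this page: the tree's insert_range method (used in other issues above) takes a first and last IP directly, and Net::Works::Network->range_as_subnets splits a start/end pair into the covering CIDR blocks.

use Net::Works::Network;

# Insert the range directly; insert_range takes a first and last IP.
$tree->insert_range( '205.156.219.1', '205.156.219.255', \%data );

# Or expand the range into the covering CIDR subnets first.
my @subnets
    = Net::Works::Network->range_as_subnets( '205.156.219.1', '205.156.219.255' );
$tree->insert_network( $_, \%data ) for @subnets;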

Freezing/thawing works as expected with an aliased tree

Right now it is causing segfaults:

#0  merge_hashes_for_keys (tree=0x12b9edc0, 
    key_from=0x12ba32e0 "hEdcaXFlSH5RIQ7E6717Di32b6k", 
    key_into=0x12ba32e0 "hEdcaXFlSH5RIQ7E6717Di32b6k", network=0x12b9fcd0)
    at c/tree.c:754
754 c/tree.c: No such file or directory.
(gdb) bt
#0  merge_hashes_for_keys (tree=0x12b9edc0, 
    key_from=0x12ba32e0 "hEdcaXFlSH5RIQ7E6717Di32b6k", 
    key_into=0x12ba32e0 "hEdcaXFlSH5RIQ7E6717Di32b6k", network=0x12b9fcd0)
    at c/tree.c:754
#1  0x00007fecfe4526b1 in merge_records (tree=tree@entry=0x12b9edc0, 
    network=network@entry=0x12b9fcd0, new_record=new_record@entry=0x109e95d0, 
    record_to_set=record_to_set@entry=0x12ba6eb0) at c/tree.c:728
#2  0x00007fecfe451d87 in insert_record_for_network (
    tree=tree@entry=0x12b9edc0, network=0x12b9fcd0, new_record=0x109e95d0, 
    merge_record_collisions=<optimized out>) at c/tree.c:611
#3  0x00007fecfe452cb5 in thaw_tree (
    filename=filename@entry=0x12b9c100 "/tmp/ldkq7IAnRO/data/GeoIP2-Anonymous-IP.frozen", initial_offset=initial_offset@entry=244, 
    ip_version=ip_version@entry=6 '\006', 
    record_size=record_size@entry=24 '\030', 
    merge_strategy=MMDBW_MERGE_STRATEGY_TOPLEVEL, alias_ipv6=<optimized out>)
    at c/tree.c:1236
#4  0x00007fecfe44c611 in XS_MaxMind__DB__Writer__Tree__thaw_tree (
    cv=<optimized out>) at lib/MaxMind/DB/Writer/Tree.xs:297

Before the recent bug fix, aliasing happened when the tree was written, and generally there was no reason to freeze a tree at that point. However, this could cause a corrupt database if the user inserted IPv4 addresses in the aliased locations before aliasing.

Error with example code

I've installed the writer from CPAN, and when running the example code from the readme I got the following error:

Iteration is not currently allowed in trees with no nodes. Record type: empty at /usr/local/lib/x86_64-linux-gnu/perl/5.24.1/MaxMind/DB/Writer/Tree.pm line 292.

I'm mostly new to Perl and totally new to CPAN. Any suggestions about what I can do?

How to minimise on-disk database size?

I've been attempting to produce a cut-down version of the GeoLite2 database in order to reduce the on-disk size. My approach has been to iterate over all nodes using MaxMind::DB::Reader and then selectively insert each one into a new tree if it contains all three of the fields I'm interested in. This works insofar as I do create a new database that seems to contain much terser records. However, the new database is some 70 MB, compared to the 50 MB of the original file. I'd be grateful for some advice. Here's my code:

#!/usr/bin/perl -w

use strict;
use warnings;
use Getopt::Std;
use MaxMind::DB::Reader;
use MaxMind::DB::Writer::Tree;
use Net::Works::Address;
use Net::Works::Network;

my %types = (
    city =>     'map',
    country =>  'map',
    names =>    'map',
    en =>       'utf8_string',
    iso_code => 'utf8_string'
);

my %options=();
getopts("i:o:", \%options);

if (!defined($options{i}) ||
    !defined($options{o})) {
	print "usage: $0 -i input_file -o output_file\n";
	exit 1;
}

my $reader = MaxMind::DB::Reader->new(file => $options{i});
open my $fh, '>:raw', $options{o};

my $writer = MaxMind::DB::Writer::Tree->new(
    {
        ip_version =>       $reader->metadata()->ip_version(),
        record_size =>      $reader->metadata()->record_size(),
        database_type =>    "Cut-down " . $reader->metadata()->database_type(),
        languages =>        ['en'],
        description =>      { en => $reader->metadata()->description()->{en} },
        map_key_type_callback => sub { $types{ $_[0] } },
        merge_strategy =>   'recurse',
    }
);

my @records;

$reader->iterate_search_tree(
	sub {
		my $ip_as_integer = shift;
		my $mask_length   = shift;
		my $data          = shift;
		my %record;
		my $add = 0;

		if ($data->{city}{names}{en}) {
			$record{data}{city}{names}{en} = $data->{city}{names}{en};
			$add++;
		}
		if ($data->{country}{names}{en}) {
			$record{data}{country}{names}{en} = $data->{country}{names}{en};
			$add++;
		}
		if ($data->{country}{iso_code}) {
			$add++;
			$record{data}{country}{iso_code} = $data->{country}{iso_code};
		}

		if ($add == 3) {
			$record{network} = {ip => $ip_as_integer, size => $mask_length};
			push @records, \%record;
		 }
	}
);

# Sorting here is a stab in the dark based on reading other issues.
foreach my $record (sort { $b->{network}{size} <=> $a->{network}{size} ||
                           $a->{network}{ip} <=> $b->{network}{ip} } @records) {
    my $address = Net::Works::Address->new_from_integer(
        integer => $record->{network}{ip});
    my $network = Net::Works::Network->new_from_string(
            string => $address->as_string . "/" . $record->{network}{size}
    );

    $writer->insert_network($network, $record->{data});
}

$writer->write_tree($fh);
close $fh;

Iteration is not currently allowed in trees with no nodes

This returns "Iteration is not currently allowed in trees with no nodes. Record type: empty at /usr/lib/perl5/site_perl/MaxMind/DB/Writer/Tree.pm line 292."

use MaxMind::DB::Writer::Tree;

my %types = (
    area_code => 'utf8_string'
);

my $tree = MaxMind::DB::Writer::Tree->new(
    ip_version            => 4,
    record_size           => 24,
    database_type         => 'my-db',
    languages             => ['en'],
    description           => { en => "my mmdb" },
    map_key_type_callback => sub { $types{ $_[0] } },
);

$tree->insert_network('192.168.1.4/32', { area_code => 'US' });

open my $fh, '>:raw', './test.mmdb';
$tree->write_tree($fh);

Character encoding issue

I'm having issues with encoding...

Here's my test input file:

$ file input
input: UTF-8 Unicode text
$ cat input
0.0.0.0/0|Cablevisión Reykjavík རེཀ་ཇ་བིཀ།Ρέικιαβικ

Here's my writer script:

use MaxMind::DB::Writer::Tree;
use Net::Works::Network;

my %types = (
    name => 'utf8_string',
);

my $tree = MaxMind::DB::Writer::Tree->new(
   ip_version            => 4,
   record_size           => 28,
   database_type         => 'encoding test',
   languages             => ['en'],
   description           => { en => 'encoding test' },
   map_key_type_callback => sub { $types{ $_[0] } },
);

foreach $line ( <STDIN> ) {
    chomp( $line );
    @parts = split(/\|/, $line);
    $tree->insert_network(
       Net::Works::Network->new_from_string( string => $parts[0]),
       {name => ($parts[1] or '')},
    );
}

open my $fh, '>:bytes', 'test.mmdb';
$tree->write_tree($fh);

Here's a perl reader script:

use MaxMind::DB::Reader;
use Data::Dumper;
my $reader = MaxMind::DB::Reader->new( file => 'test.mmdb' );
my $res = $reader->record_for_address('0.0.0.0');
print Dumper($res);

That outputs this:

$VAR1 = {
          'name' => "Cablevisi\x{c3}\x{b3}n Reykjav\x{c3}\x{ad}k \x{e0}\x{bd}\x{a2}\x{e0}\x{bd}\x{ba}\x{e0}\x{bd}\x{80}\x{e0}\x{bc}\x{8b}\x{e0}\x{bd}\x{87}\x{e0}\x{bc}\x{8b}\x{e0}\x{bd}\x{96}\x{e0}\x{bd}\x{b2}\x{e0}\x{bd}\x{80}\x{e0}\x{bc}\x{8d}\x{ce}\x{a1}\x{ce}\x{ad}\x{ce}\x{b9}\x{ce}\x{ba}\x{ce}\x{b9}\x{ce}\x{b1}\x{ce}\x{b2}\x{ce}\x{b9}\x{ce}\x{ba}",
        };

I also have a script that uses https://github.com/PaddeK/node-maxmind-db

reader = require('maxmind-db-reader')

data = reader.openSync('test.mmdb')
result = data.getGeoDataSync('0.0.0.0')
console.log result.name

And that outputs:

Cablevisión Reykjavík རེ�����ི��Ρέικιαβικ

Any thoughts on what might be going wrong here?
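
The dumped value contains the raw UTF-8 bytes of the input (e.g. \x{c3}\x{b3} for ó), which suggests STDIN is read as bytes and never decoded to characters before being stored as a utf8_string. A hedged sketch of decoding the input in the writer script above:

# Decode STDIN from UTF-8 so $parts[1] holds characters, not raw bytes.
binmode STDIN, ':encoding(UTF-8)';

foreach my $line (<STDIN>) {
    chomp $line;
    my @parts = split /\|/, $line;
    $tree->insert_network(
        Net::Works::Network->new_from_string( string => $parts[0] ),
        { name => ( $parts[1] or '' ) },
    );
}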

how to merge in private addresses? (ie 10/8)

Hi there

I want to merge our internally gathered 10/8 GeoIP data into an mmdb so that our SIEM can use GeoIP to show the location of not only Internet events but intranet ones too.

I'm no programmer, but I have managed to put a script together to do this; however, it's barfing with:

The IP address you provided (10.3.123.0) is not a public IP address when calling GeoIP2::Database::Reader::city on GeoIP2::Database::Reader

I see there's an "if ( is_private_ipv4($ip) || is_private_ipv6($ip) ) {" check in GeoIP2/Database/Reader.pm, but I cannot figure out how to disable that "is_private_ipv4" check. Is that possible?

Thanks

Search tree duplicated via Writer has much bigger size than an original

Hello.

I have a test script which simply duplicates a database via
iterate_search_tree() => Network->new_from_integer() => insert_network() => write_tree().

The script is run inside Docker (Dockerfile).

This is a test GeoLite2-City.mmdb database.

I can't understand why the file written by MaxMind::DB::Writer::Tree has a search tree 3.5x bigger than the original.

Is something wrong in the script itself, or does the Perl Writer API have a bug?

FTBFS: test failures on some architectures

We have the following bug reported against the Debian package of
MaxMind-DB-Writer, cf. https://bugs.debian.org/968362

It doesn't seem to be a bug in the packaging, so you may want to take
a look. Thanks!

------8<-----------8<-----------8<-----------8<-----------8<-----

Source: libmaxmind-db-writer-perl
Version: 0.300003-1
Severity: serious
Tags: upstream ftbfs
Justification: fails to build from source (but built successfully in the past)

libmaxmind-db-writer-perl never built on all architectures:

https://buildd.debian.org/status/logs.php?pkg=libmaxmind-db-writer-perl

The history of the uploads goes like this:

0.300003-1: testsuite disabled
Result: successful build on all architectures where all build dependencies
are available (esp. libmath-int128-perl is missing on quite a few).

0.300003-2: testsuite enabled but tests needing Test::HexDifferences
skipped (as it was not yet packaged)
Result: additional failures in the tests on ppc64, s390x, sparc64
As (only) s390x is a release architecture, the package never migrated
to testing.

0.300003-3: all tests are run after libtest-hexdifferences entered
the archive.
Result: same as for 0.300003-2


Logs of the failures (for 0.300003-3):

ppc64:
https://buildd.debian.org/status/fetch.php?pkg=libmaxmind-db-writer-perl&arch=ppc64&ver=0.300003-3&stamp=1596597051&raw=0

sparc64:
https://buildd.debian.org/status/fetch.php?pkg=libmaxmind-db-writer-perl&arch=sparc64&ver=0.300003-3&stamp=1596597463&raw=0

s390x:
https://buildd.debian.org/status/fetch.php?pkg=libmaxmind-db-writer-perl&arch=s390x&ver=0.300003-3&stamp=1596578449&raw=0


The failing tests are always the same, quoting from the s390x log:

Sereal: Error: Bad Sereal header: Not a valid Sereal document. at offset 1 of input at srl_decoder.c line 600 at /<<PKGBUILDDIR>>/blib/lib/MaxMind/DB/Writer/Tree.pm line 403.
t/MaxMind/DB/Writer/Tree-freeze-thaw.t ..................... 
    1..0
not ok 1 - No tests run for subtest "Tree with 256 networks - IPv4 only - 24-bit records"
Dubious, test returned 255 (wstat 65280, 0xff00)
Failed 1/1 subtests 
Sereal: Error: Bad Sereal header: Not a valid Sereal document. at offset 1 of input at srl_decoder.c line 600 at /<<PKGBUILDDIR>>/blib/lib/MaxMind/DB/Writer/Tree.pm line 403.
t/MaxMind/DB/Writer/Tree-output/freeze-thaw-record-size.t .. 
Dubious, test returned 255 (wstat 65280, 0xff00)
No subtests run 
Sereal: Error: Bad Sereal header: Not a valid Sereal document. at offset 1 of input at srl_decoder.c line 600 at /<<PKGBUILDDIR>>/blib/lib/MaxMind/DB/Writer/Tree.pm line 403.
t/MaxMind/DB/Writer/Tree-record-collisions.t ............... 
Dubious, test returned 255 (wstat 65280, 0xff00)
All 21 subtests passed 
    #   Failed test 'Run without exceptions'
    #   at t/MaxMind/DB/Writer/Tree-thaw-merge.t line 86.
    # Sereal: Error: Bad Sereal header: Not a valid Sereal document. at offset 1 of input at srl_decoder.c line 600 at /<<PKGBUILDDIR>>/blib/lib/MaxMind/DB/Writer/Tree.pm line 403.
    # Looks like you failed 1 test of 1.


    [... the same failure block is repeated, with identical output, for each of the 14 failing subtests ...]

t/MaxMind/DB/Writer/Tree-thaw-merge.t ...................... 
    not ok 1 - Run without exceptions
    1..1
not ok 1 - check defaults work
    not ok 1 - Run without exceptions
    1..1
not ok 2 - check no merging explictly
    not ok 1 - Run without exceptions
    1..1
not ok 3 - check no merging and none explictly
    not ok 1 - Run without exceptions
    1..1
not ok 4 - set mrc in constructor, toplevel in thaw
    not ok 1 - Run without exceptions
    1..1
not ok 5 - set toplevel in constructor
    not ok 1 - Run without exceptions
    1..1
not ok 6 - set recurse in constructor
    not ok 1 - Run without exceptions
    1..1
not ok 7 - set mrc only in constructor
    not ok 1 - Run without exceptions
    1..1
not ok 8 - set toplevel only in constructor
    not ok 1 - Run without exceptions
    1..1
not ok 9 - set recurse only in constructor
    not ok 1 - Run without exceptions
    1..1
not ok 10 - set toplevel only in thaw
    not ok 1 - Run without exceptions
    1..1
not ok 11 - set mrc off in constructor, toplevel in thaw
    not ok 1 - Run without exceptions
    1..1
not ok 12 - set none in constructor, toplevel only in thaw
    not ok 1 - Run without exceptions
    1..1
not ok 13 - set recurse only in thaw
    not ok 1 - Run without exceptions
    1..1
not ok 14 - set mrc off in constructor, recurse in thaw
ok 15 - no (unexpected) warnings (via done_testing)
1..15
Dubious, test returned 14 (wstat 3584, 0xe00)
Failed 14/15 subtests 
Test Summary Report
-------------------
t/MaxMind/DB/Writer/Tree-freeze-thaw.t                   (Wstat: 65280 Tests: 1 Failed: 1)
  Failed test:  1
  Non-zero exit status: 255
  Parse errors: No plan found in TAP output
t/MaxMind/DB/Writer/Tree-output/freeze-thaw-record-size.t (Wstat: 65280 Tests: 0 Failed: 0)
  Non-zero exit status: 255
  Parse errors: No plan found in TAP output
t/MaxMind/DB/Writer/Tree-record-collisions.t             (Wstat: 65280 Tests: 21 Failed: 0)
  Non-zero exit status: 255
  Parse errors: No plan found in TAP output
t/MaxMind/DB/Writer/Tree-thaw-merge.t                    (Wstat: 3584 Tests: 15 Failed: 14)
  Failed tests:  1-14
  Non-zero exit status: 14
Files=39, Tests=446, 16 wallclock secs ( 0.19 usr  0.02 sys +  9.19 cusr  0.52 csys =  9.92 CPU)
Result: FAIL
Failed 4/39 test programs. 15/446 subtests failed.


So basically it's always the same error:
Sereal: Error: Bad Sereal header: Not a valid Sereal document. at offset 1 of input at srl_decoder.c line 600 at /<<PKGBUILDDIR>>/blib/lib/MaxMind/DB/Writer/Tree.pm line 403.


The tests were run with libsereal-{de,}encoder-perl 4.018+ds-1; srl_decoder.c
is part of libsereal-decoder-perl.


Cheers,
gregor

------8<-----------8<-----------8<-----------8<-----------8<-----

Thanks for considering,
gregor herrmann,
Debian Perl Group
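
For context, a rough sketch of what the failing tests exercise, assuming the freeze_tree() / new_from_frozen_tree() pair documented in the module's POD; the metadata values, network and file path below are illustrative only:

use MaxMind::DB::Writer::Tree;

my $cb = sub {'utf8_string'};

my $tree = MaxMind::DB::Writer::Tree->new(
    ip_version            => 4,
    record_size           => 24,
    database_type         => 'Test',
    languages             => ['en'],
    description           => { en => 'Test DB' },
    map_key_type_callback => $cb,
);
$tree->insert_network( '1.2.3.0/24' => { name => 'example' } );

# freeze_tree() writes the in-memory tree to disk; new_from_frozen_tree()
# reads it back, decoding it with Sereal -- which is the step that dies
# with "Bad Sereal header" in this report (Sereal 4.018).
$tree->freeze_tree('/tmp/frozen-tree');

my $thawed = MaxMind::DB::Writer::Tree->new_from_frozen_tree(
    filename              => '/tmp/frozen-tree',
    map_key_type_callback => $cb,
);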

Merge of array values on ip collision

I'm trying to merge array values when there is an IP collision. I'm trying something like this:

my %types = (
    ids => [ 'array', 'utf8_string' ],
);

while ( my $line = <$info> ) {
    #my $mk = 'merge_strategy';
    $tree->insert_network(
        trim($line) . '/32' => {
            ids => ['5854'],
        },
    );
    $tree->insert_network(
        trim($line) . '/32' => {
            ids => ['5994'],
        },
    );
}

but every time I'm only getting 5994 in ids; the output is something like this:

123.123.123.100/30
\ {
    ids   [
        [0] 5994
    ]
}

How can I get it to write something like the below?

123.123.123.100/30
\ {
    ids   [
        [0] 5854,
        [1] 5994
    ]
}
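
A minimal sketch of one way to approach this, assuming the writer's merge_strategy constructor option ('toplevel' / 'recurse', the same values exercised by the Tree-thaw-merge tests above) is what merges colliding records. Whether array values are appended or replaced under a given strategy should be confirmed against the POD of the installed version; all names and metadata values below are illustrative.

use MaxMind::DB::Writer::Tree;

my %types = ( ids => [ 'array', 'utf8_string' ] );

my $tree = MaxMind::DB::Writer::Tree->new(
    ip_version            => 4,
    record_size           => 24,
    database_type         => 'My-IDs',
    languages             => ['en'],
    description           => { en => 'IDs per IP' },
    map_key_type_callback => sub { $types{ $_[0] } },

    # Without a merge strategy, the second insert for the same network
    # simply replaces the first, which matches the behaviour reported
    # above (only 5994 survives).
    merge_strategy => 'recurse',
);

$tree->insert_network( '123.123.123.100/32' => { ids => ['5854'] } );
$tree->insert_network( '123.123.123.100/32' => { ids => ['5994'] } );

If the installed version does not merge array values the way you want, the reliable fallback is to accumulate all ids per network in a plain hash first and call insert_network() once per network with the complete array.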

The problem of ipv4 and ipv6

Hi, IPv6 addresses that fall in the IPv4-mapped segment are not written into the mmdb file. Is this controlled by the writer itself, or is there a related option in the module that can be set?
When IPv4 and IPv6 are in the same tree, how is IPv4 represented by default? Is it ::ffff:0:0/96?

Thanks
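
As far as I can tell this is controlled by the documented alias_ipv6_to_ipv4 constructor option. A minimal sketch (metadata values are illustrative, and my understanding of the layout is hedged in the comments):

use MaxMind::DB::Writer::Tree;

my $tree = MaxMind::DB::Writer::Tree->new(
    ip_version            => 6,
    record_size           => 24,
    database_type         => 'Test',
    languages             => ['en'],
    description           => { en => 'Test DB' },

    # With this flag on, the IPv6 ranges that embed IPv4 addresses
    # (::ffff:0:0/96 and, as I understand it, the Teredo 2001::/32 and
    # 6to4 2002::/16 ranges) become aliases pointing at the subtree that
    # holds the IPv4 data, instead of storing separate copies.
    alias_ipv6_to_ipv4    => 1,

    map_key_type_callback => sub {'utf8_string'},
);

# In an ip_version => 6 tree an IPv4 network is inserted into the
# subtree whose first 96 bits are zero (::1.2.3.0/120 for this insert),
# and lookups through the aliased ranges resolve to the same records.
$tree->insert_network( '1.2.3.0/24' => { note => 'IPv4 data' } );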

2001::/32 IPv4 alias works again

This broke in 0.20000 because the list of reserved networks to remove was copied from Net::Works::Network, and that list included 2001::/23.

Ideally remove_reserved_networks() should throw an exception if you try to remove an aliased node. This should be easy to do by iterating over the subtree.
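
For reference, a minimal sketch of the affected configuration; I'm assuming remove_reserved_networks is also the name of the boolean constructor flag that triggers the removal, and the metadata values are illustrative:

use MaxMind::DB::Writer::Tree;

# In 0.20000 the list of reserved networks that gets removed included
# 2001::/23 (copied from Net::Works::Network), which overlaps and
# clobbers the 2001::/32 Teredo alias set up by alias_ipv6_to_ipv4.
my $tree = MaxMind::DB::Writer::Tree->new(
    ip_version               => 6,
    record_size              => 28,
    database_type            => 'Test',
    languages                => ['en'],
    description              => { en => 'Test DB' },
    alias_ipv6_to_ipv4       => 1,
    remove_reserved_networks => 1,
    map_key_type_callback    => sub {'utf8_string'},
);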

Should calculate the search tree record size automatically

There's no reason not to do this, and picking too small a size causes the build to fail. The calculation should be based on the number of nodes in the search tree + 16 (the data section separator) + the sum of the lengths of all unique data items, or something like that; see the sketch below.
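
A sketch of what that calculation could look like; the variable names are hypothetical, and the exact pointer arithmetic should be checked against the MaxMind DB format spec:

# $node_count and $data_section_size would come from the in-memory tree
# just before writing.
sub minimum_record_size {
    my ( $node_count, $data_section_size ) = @_;

    # The largest value a record ever needs to hold is a pointer past
    # the node area and the 16-byte data-section separator to the last
    # byte of the data section.
    my $max_record_value = $node_count + 16 + $data_section_size;

    # Pick the smallest record size whose range covers that value.
    for my $bits ( 24, 28, 32 ) {
        return $bits if $max_record_value < 2**$bits;
    }

    die "data section is too large even for a 32-bit record size\n";
}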

automated test - Use of uninitialized value $this in pattern match (m//) at t/MaxMind/DB/Writer/Tree-output/basic.t line 61

Test failure: uninitialized value $this in pattern match (m//)

===(     154;9  0/?  1/?  0/?  1/?  0/?  0/?  0/?  0/? )================Use of uninitialized value $this in pattern match (m//) at t/MaxMind/DB/Writer/Tree-output/basic.t line 61.

#   Failed test 'tree output starts with a search tree of 18 bytes - 24-bit record'
#   at t/MaxMind/DB/Writer/Tree-output/basic.t line 61.
#                   undef
#     doesn't match '(?-xism:^.{18}\\\\\\\\\\\\\\\\)'
Use of uninitialized value $this in pattern match (m//) at t/MaxMind/DB/Writer/Tree-output/basic.t line 73.

#   Failed test 'first node in search tree points to nodes 1 (L) and 2 (R) - 24-bit record'
#   at t/MaxMind/DB/Writer/Tree-output/basic.t line 73.
#                   undef
#     doesn't match '(?-xism:\\\\\\)'
Use of uninitialized value $buffer in length at t/MaxMind/DB/Writer/Tree-output/basic.t line 79.
Use of uninitialized value $buffer in pattern match (m//) at t/MaxMind/DB/Writer/Tree-output/basic.t line 84.
Use of uninitialized value $got in numeric gt (>) at cmp_ok [from t/MaxMind/DB/Writer/Tree-output/basic.t line 104] line 1.

#   Failed test 'node 1 left record points to a value outside the search tree - 24-bit record'
#   at t/MaxMind/DB/Writer/Tree-output/basic.t line 104.
Use of uninitialized value $val in addition (+) at /usr/share/perl5/Test/Builder.pm line 631.
#     undef
#         >
#     '2'
Use of uninitialized value $left_record{"1"} in subtraction (-) at t/MaxMind/DB/Writer/Tree-output/basic.t line 109.

#   Failed test 'node 1 left record points to a value in the data section - 24-bit record'
#   at t/MaxMind/DB/Writer/Tree-output/basic.t line 109.
#     '-3'
#         <
#     '-18'
Use of uninitialized value $got in numeric gt (>) at cmp_ok [from t/MaxMind/DB/Writer/Tree-output/basic.t line 114] line 1.

#   Failed test 'node 2 left record points to a value outside the search tree - 24-bit record'
#   at t/MaxMind/DB/Writer/Tree-output/basic.t line 114.
Use of uninitialized value $val in addition (+) at /usr/share/perl5/Test/Builder.pm line 631.
#     undef
#         >
#     '2'
Use of uninitialized value $left_record{"2"} in subtraction (-) at t/MaxMind/DB/Writer/Tree-output/basic.t line 119.

#   Failed test 'node 2 left record points to a value in the data section - 24-bit record'
#   at t/MaxMind/DB/Writer/Tree-output/basic.t line 119.
#     '-3'
#         <
#     '-18'

Second instance:

===(     180;9  0/?  0/?  8/?  0/?  0/?  0/?  0/? )=====================Use of uninitialized value $this in pattern match (m//) at t/MaxMind/DB/Writer/Tree-output/basic.t line 61.

#   Failed test 'tree output starts with a search tree of 21 bytes - 28-bit record'
#   at t/MaxMind/DB/Writer/Tree-output/basic.t line 61.
#                   undef
#     doesn't match '(?-xism:^.{21}\\\\\\\\\\\\\\\\)'
Use of uninitialized value $this in pattern match (m//) at t/MaxMind/DB/Writer/Tree-output/basic.t line 73.

#   Failed test 'first node in search tree points to nodes 1 (L) and 2 (R) - 28-bit record'
#   at t/MaxMind/DB/Writer/Tree-output/basic.t line 73.
#                   undef
#     doesn't match '(?-xism:\\\\\\\\)'
Use of uninitialized value $buffer in length at t/MaxMind/DB/Writer/Tree-output/basic.t line 79.
Use of uninitialized value $buffer in pattern match (m//) at t/MaxMind/DB/Writer/Tree-output/basic.t line 88.
Use of uninitialized value $node in unpack at t/MaxMind/DB/Writer/Tree-output/basic.t line 93.
Argument "" isn't numeric in bitwise and (&) at t/MaxMind/DB/Writer/Tree-output/basic.t line 95.
Use of uninitialized value $node in unpack at t/MaxMind/DB/Writer/Tree-output/basic.t line 93.
Argument "" isn't numeric in bitwise and (&) at t/MaxMind/DB/Writer/Tree-output/basic.t line 95.
Use of uninitialized value $got in numeric gt (>) at cmp_ok [from t/MaxMind/DB/Writer/Tree-output/basic.t line 104] line 1.

#   Failed test 'node 1 left record points to a value outside the search tree - 28-bit record'
#   at t/MaxMind/DB/Writer/Tree-output/basic.t line 104.
Use of uninitialized value $val in addition (+) at /usr/share/perl5/Test/Builder.pm line 631.
#     undef
#         >
#     '2'
Use of uninitialized value $left_record{"1"} in subtraction (-) at t/MaxMind/DB/Writer/Tree-output/basic.t line 109.

#   Failed test 'node 1 left record points to a value in the data section - 28-bit record'
#   at t/MaxMind/DB/Writer/Tree-output/basic.t line 109.
#     '-3'
#         <
#     '-21'
Use of uninitialized value $got in numeric gt (>) at cmp_ok [from t/MaxMind/DB/Writer/Tree-output/basic.t line 114] line 1.

#   Failed test 'node 2 left record points to a value outside the search tree - 28-bit record'
#   at t/MaxMind/DB/Writer/Tree-output/basic.t line 114.
Use of uninitialized value $val in addition (+) at /usr/share/perl5/Test/Builder.pm line 631.
#     undef
#         >
#     '2'
Use of uninitialized value $left_record{"2"} in subtraction (-) at t/MaxMind/DB/Writer/Tree-output/basic.t line 119.

#   Failed test 'node 2 left record points to a value in the data section - 28-bit record'
#   at t/MaxMind/DB/Writer/Tree-output/basic.t line 119.
#     '-3'
#         <
#     '-21'
Use of uninitialized value $this in pattern match (m//) at t/MaxMind/DB/Writer/Tree-output/basic.t line 61.

#   Failed test 'tree output starts with a search tree of 24 bytes - 32-bit record'
#   at t/MaxMind/DB/Writer/Tree-output/basic.t line 61.
#                   undef
#     doesn't match '(?-xism:^.{24}\\\\\\\\\\\\\\\\)'
Use of uninitialized value $this in pattern match (m//) at t/MaxMind/DB/Writer/Tree-output/basic.t line 73.

#   Failed test 'first node in search tree points to nodes 1 (L) and 2 (R) - 32-bit record'
#   at t/MaxMind/DB/Writer/Tree-output/basic.t line 73.
#                   undef
#     doesn't match '(?-xism:\\\\\\\\)'
Use of uninitialized value $buffer in length at t/MaxMind/DB/Writer/Tree-output/basic.t line 79.
Use of uninitialized value $buffer in pattern match (m//) at t/MaxMind/DB/Writer/Tree-output/basic.t line 101.
Use of uninitialized value $got in numeric gt (>) at cmp_ok [from t/MaxMind/DB/Writer/Tree-output/basic.t line 104] line 1.

#   Failed test 'node 1 left record points to a value outside the search tree - 32-bit record'
#   at t/MaxMind/DB/Writer/Tree-output/basic.t line 104.
Use of uninitialized value $val in addition (+) at /usr/share/perl5/Test/Builder.pm line 631.
#     undef
#         >
#     '2'
Use of uninitialized value $left_record{"1"} in subtraction (-) at t/MaxMind/DB/Writer/Tree-output/basic.t line 109.

#   Failed test 'node 1 left record points to a value in the data section - 32-bit record'
#   at t/MaxMind/DB/Writer/Tree-output/basic.t line 109.
#     '-3'
#         <
#     '-24'
Use of uninitialized value $got in numeric gt (>) at cmp_ok [from t/MaxMind/DB/Writer/Tree-output/basic.t line 114] line 1.

#   Failed test 'node 2 left record points to a value outside the search tree - 32-bit record'
#   at t/MaxMind/DB/Writer/Tree-output/basic.t line 114.
Use of uninitialized value $val in addition (+) at /usr/share/perl5/Test/Builder.pm line 631.
#     undef
#         >
#     '2'
Use of uninitialized value $left_record{"2"} in subtraction (-) at t/MaxMind/DB/Writer/Tree-output/basic.t line 119.

#   Failed test 'node 2 left record points to a value in the data section - 32-bit record'
#   at t/MaxMind/DB/Writer/Tree-output/basic.t line 119.
#     '-3'
#         <
#     '-24'
# Looks like you failed 18 tests of 90.
[01:24:58] t/MaxMind/DB/Writer/Tree-output/basic.t ............. Dubious, test returned 18 (wstat 4608, 0x1200)
Failed 18/90 subtests 
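
For context, a test like this typically captures the database image in an in-memory scalar before matching against it, roughly as below; the tree contents are illustrative, and write_tree() is the documented output method:

use MaxMind::DB::Writer::Tree;

my $tree = MaxMind::DB::Writer::Tree->new(
    ip_version            => 4,
    record_size           => 24,
    database_type         => 'Test',
    languages             => ['en'],
    description           => { en => 'Test DB' },
    map_key_type_callback => sub {'utf8_string'},
);
$tree->insert_network( '1.1.1.0/24' => { ip => '1.1.1.1' } );

# Write the database image into an in-memory scalar instead of a file.
open my $fh, '>:raw', \my $buffer
    or die "Cannot open in-memory handle: $!";
$tree->write_tree($fh);
close $fh or die $!;

# If write_tree() dies, or the handle is never written to, $buffer stays
# undef, and every later length(), pattern match and unpack() on it
# emits exactly the "uninitialized value" warnings shown above.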
