formationai / hazel
Managing third-party Haskell packages in Bazel
License: Apache License 2.0
From internal issue tracker:
Currently, if we want to make a change to a package we need to make a git clone.
Longer-term, we may want to add more explicit "patching" ability into Hazel somehow.
What is the intended mechanism for adding packages from Hackage that are not in the Stackage snapshot into a Hazel project?

Looking at the docstring on `hazel_custom_package_hackage`, it sounds like the right tool:

> Generate a repo for a Haskell package fetched from Hackage.

However, `hazel_custom_package_hackage` and `hazel_custom_package_github` do not use `_cabal_haskell_repository` internally, but instead refer to a dedicated per-package `BUILD` file that has to be present in the `hazel` repository. So, these functions don't work for arbitrary Hackage packages, only for those that Hazel has built-in support for.

On the other hand, we can manually extend the `packages` parameter to `hazel_repositories`. E.g.
```python
load("//hazel:packages.bzl", "core_packages", "packages")

hazel_repositories(
    core_packages = core_packages,
    packages = packages + {
        "some-package": struct(
            version = "x.y.z.w",
            sha256 = "...",
        ),
    },
    ...
)
```
Unfortunately, this has to be done in a `.bzl` file, because `struct` cannot be used in `WORKSPACE` (AFAIK), which makes this boilerplate-heavy. Maybe `hazel_repositories` should have an additional parameter `extra_packages` that takes a `dict` of `dict`s to make this extensible from a `WORKSPACE` file?
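A hypothetical sketch of what such an `extra_packages` parameter could look like in a `WORKSPACE` file — the parameter name and the inner dict keys are assumptions drawn from the suggestion above, not an existing Hazel API:

```python
# WORKSPACE -- hypothetical extra_packages API, not implemented in Hazel.
# Plain dicts are allowed here, unlike struct, so no .bzl indirection
# would be needed.
hazel_repositories(
    core_packages = core_packages,
    packages = packages,
    extra_packages = {
        "some-package": {
            "version": "x.y.z.w",
            "sha256": "...",
        },
    },
)
```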
From internal issue tracker:
Currently, Hazel requires manually generating a `packages.bzl` file that reifies the Stack resolver:

```python
load("//:packages.bzl", "prebuilt_dependencies", "packages")

hazel_repositories(
    prebuilt_dependencies = prebuilt_dependencies,
    packages = packages,
)
```
We ought to be able to do better: the only thing we should need to pass is the resolver name (and potentially extra-dep versions), and Hazel should do the rest:

```python
hazel_repositories(
    resolver = "lts-11.6",
    extra_deps = {
        "optparse-applicative": "0.14.2.0",
        ...,
    },
)
```
Note that for parsing a Cabal file we could just use GHC plus its built-in packages, but for a YAML file the built-in packages don't give us enough. Instead, I think we can just write a Python script using the `yaml` package. (Bazel already requires Python to be installed anyway.)
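To illustrate the kind of script involved: a real implementation would use the `yaml` package as suggested above, but even a stdlib-only sketch shows how little needs to be extracted from a resolver file. The field names `resolver` and `extra-deps` are standard `stack.yaml` fields; the flat-layout assumption here is exactly what the `yaml` package would remove:

```python
# Minimal sketch: pull the resolver name and extra-deps out of a
# stack.yaml-style file. A real script would use the `yaml` package
# instead of this line-based parsing, which assumes a flat layout.
def parse_stack_config(text):
    resolver = None
    extra_deps = []
    in_extra_deps = False
    for line in text.splitlines():
        stripped = line.strip()
        if stripped.startswith("resolver:"):
            resolver = stripped.split(":", 1)[1].strip()
            in_extra_deps = False
        elif stripped == "extra-deps:":
            in_extra_deps = True
        elif in_extra_deps and stripped.startswith("- "):
            extra_deps.append(stripped[2:].strip())
        elif stripped and not line.startswith(" "):
            in_extra_deps = False
    return resolver, extra_deps

example = """\
resolver: lts-11.6
extra-deps:
- optparse-applicative-0.14.2.0
"""
print(parse_stack_config(example))
# → ('lts-11.6', ['optparse-applicative-0.14.2.0'])
```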
Possible approaches:

- A script that generates the `packages.bzl` file for a given Stackage resolver.
- Changing `hazel_repositories` to take the resolver name directly, as described above, generating `packages.bzl` in `@hazel_base_repositories`.

The following issue was encountered on:
- Ubuntu 16.04.4 LTS (Xenial Xerus)
- GHC v8.2.2
- Nix revision c33c5239f62b4855b14dc5b01dfa3e2a885cf9ca
```
➜ hazel git:(master) bazel build @haskell_math_functions//:all
Starting local Bazel server and connecting to it...
...........
/nix/store/xjgn21ai9mg1ghfb1sb34wgarnpbcrzm-ghc-8.2.2
INFO: Analysed 27 targets (13 packages loaded).
INFO: Found 27 targets...
ERROR: /home/ashrestha/.cache/bazel/_bazel_ashrestha/98700ac0081e3d500e0bf543dde9b2b3/external/haskell_math_functions/BUILD:14:1: Compiling math-functions failed (Exit 1)
<command line>: can't load .so/.DLL for: bazel-out/k8-fastbuild/bin/external/haskell_primitive/libHSexternalZShaskellZUprimitiveZSprimitive-0.6.3.0-ghc8.2.2.so (libstdc++.so.6: cannot open shared object file: No such file or directory)
INFO: Elapsed time: 33.870s, Critical Path: 17.93s
INFO: 81 processes, linux-sandbox.
FAILED: Build did NOT complete successfully
```
Currently, Hazel uses a c2hs built from Nix. This is redundant, since Hazel can build c2hs fine by itself.

However, a `haskell_binary` can't currently be used to supply c2hs, because there would be a circular dependency:
```
...
ERROR: .cache/bazel/_bazel_mark/ba741a813dc79243fef0e9bb685fff36/external/ghc/BUILD:41:1: in _haskell_toolchain rule @ghc//:ghc-impl: cycle in dependency graph:
    @haskell_hsndfile//:hsndfile
.-> @ghc//:ghc-impl (host)
|   @haskell_c2hs//:c2hs_bin (host)
`-- @ghc//:ghc-impl (host)
This cycle occurred because of a configuration option
ERROR: Analysis of target '@haskell_hsndfile//:hsndfile' failed; build aborted
INFO: Elapsed time: 0.972s
FAILED: Build did NOT complete successfully (4 packages loaded)
```
The plumbing between Hazel and cabal2bazel seems to work really well for many packages on Hackage. I'd like to be able to use the same plumbing for custom packages, such as forks hosted on GitHub, or repos not on Hackage. Unfortunately, right now I have to define my own BUILD file, and it is not obvious how to reuse some of that plumbing.

With this infrastructure, it would also be possible to write a preprocessor that runs `hpack` for projects that use Stack and don't check in cabal files, and then hand off to hazel/cabal2bazel.

It seems like the majority of the work is done here, but I'm still getting familiar with the code base, so I'm not sure how easy it would be to make something that can be easily invoked.
Hazel currently hard-codes the assumption that `@ghc//` contains the GHC distribution. This is an issue, because it prohibits using Hazel for a multi-platform build where different GHC distributions are required for different platforms, e.g. `@ghc_platform_unix` and `@ghc_platform_windows`.
A first step towards resolving this issue is to add a parameter, say `ghc_workspace`, to `hazel_repositories` that defines the GHC workspace, and then pass it along within Hazel to all the components that require this information. This is doable.
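A sketch of how such a parameter might look in a `WORKSPACE` file — `ghc_workspace` is the name proposed in this issue, not an existing attribute:

```python
# Hypothetical: select the GHC workspace per platform and pass it through.
hazel_repositories(
    core_packages = core_packages,
    packages = packages,
    # Proposed attribute; every Hazel component that currently assumes
    # "@ghc" would receive this value instead.
    ghc_workspace = "@ghc_platform_unix",
)
```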
However, as I understand it, choosing the value of `ghc_workspace` based on platform information requires calling `hazel_repositories` from within a repository rule. (Only repository rules have access to `repository_ctx` and thereby to platform information. Note that Bazel's `select` does not work for this use case in `WORKSPACE`.) At the time of writing, it seems that `hazel_repositories` cannot be called from within a repository rule. Any attempt causes errors along the lines of:
```
ERROR: Analysis of target '//some/package:some-target' failed; build aborted: no such package '@haskell_some_dependency//': The repository could not be resolved
```
I.e. the external workspaces for Hazel packages are not generated. I am not sure why this happens. I assume it is related to nested workspace restrictions in Bazel.
Given that, I think a better approach would be to not require direct access to the GHC workspace in the first place. To my understanding, it is currently used to access the Unix system headers and the threaded runtime shared libraries. Instead of accessing these through the workspace, maybe `rules_haskell` could be extended to make them available through the GHC toolchain, and Hazel could then pick them up from there.
Am I overlooking something, or do you think going through the GHC toolchain is a feasible solution?
Looking at `Stackage.hs` (lines 54 to 56 at 9252939), it appears that packages are downloaded using the `https://hackage.haskell.org/package/<package-name>-<package-version>.tar.gz` URL, and the `sha256` is then computed and inserted into `packages.bzl`. This `sha256` is then used (lines 24 to 28 at 9252939) in Bazel's `download_and_extract`.
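For reference, the hash that gets baked into `packages.bzl` amounts to a streaming SHA-256 over the downloaded tarball; a minimal sketch (the file path is illustrative):

```python
import hashlib

def sha256_of_file(path, chunk_size=65536):
    """Compute the hex SHA-256 digest of a file, streaming in chunks."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()

# An empty file hashes to the well-known empty-input SHA-256:
# e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855
```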
AFAICT this can lead to the following situation:

1. `Stackage.hs` is run to generate `packages.bzl`, which includes package `pkg` at version `ver`.
2. A new revision of `pkg` is introduced on Hackage, still at version `ver`.
3. `bazel build` is called with the `packages.bzl` containing the `sha256` for the old tarball; it will try to download the tarball of the latest revision, which will fail the build (the `sha256`s do not match anymore).

There may be solutions to this by using the resolver's `cabal-file-info` fields:
```yaml
packages:
  drawille:
    users: []
    constraints:
      ...
    version: '0.1.2.0'
    cabal-file-info:
      size: 2409
      hashes:
        MD5: 1518765abeeb698adbc558de46d45222
        Skein512_512: a99229bbb75b0395649e6e0a571c1fa69617803486377f100b6724ad65693c80be6b81d27b47c4ca64065ba6adfd24b4738f04bd77ffc9e20584efb7863fa2bf
        SHA1: d44b178055fb373978f9052d513e429b929a6dd7
        SHA512: 345426ecaf3fb3db4093cd17a3bf87f44424cd16422d946db8896ec09681fe95ad8dee061744f05002bae09182c133e00d91702f4bcc8da462b672d79f8ded72
        SHA256: 41a1627890649690d76261040c67779fa2e6fa2eb0ff912ee24891ea21b7b2f7
        GitSHA1: dfe6d875faa4af01d4cba1e215e0da5e0885db7c
```
although I can never remember whether there's a URL for downloading a package from Hackage by specifying either the revision or the cabal file's hash.
Hazel builds fail with linker errors if `linkstatic` is set to `True`, which is the default in rules_haskell. Currently, Hazel works around this by setting `linkstatic = False`. However, for various reasons it would be preferable to use static linking, or at least have it available: e.g. startup speed of executables, the macOS Mach-O header size limit, and Haskell profiling (AFAIK currently only available with static linking in rules_haskell).

It is unclear to me whether the issue lies in rules_haskell or Hazel. I was under the impression that it is a known limitation in rules_haskell. However, I was unable to reproduce the same kind of linker errors with rules_haskell alone.
cc @guibou
To reproduce:

1. Check out the `test-static-linking` branch on the following Hazel clone: https://github.com/DACH-NY/hazel (7904fad at the time of writing).
2. Run the tests as follows, assuming a checkout of rules_haskell at commit d5203b8273fa53d68341e8bf0dc7cb8f14ebbf6c:

```
nix-shell --pure path/to/rules_haskell/shell.nix --run 'bazel test //test/vector-test'
```

3. Observe linker errors of the following form:
```
bazel-out/k8-fastbuild/bin/external/haskell_primitive__1834808089/libHSexternalZShaskellZUprimitiveZUZU1834808089ZSprimitive.a(Primitive.o):s2YJ_info: error: undefined reference to 'transformerszm0zi5zi2zi0_ControlziMonadziTransziCont_zdfMonadTransContT_closure'
bazel-out/k8-fastbuild/bin/external/haskell_primitive__1834808089/libHSexternalZShaskellZUprimitiveZUZU1834808089ZSprimitive.a(Primitive.o):externalZZShaskellZZUprimitiveZZUZZU1834808089ZZSprimitive_ControlziMonadziPrimitive_zdfPrimMonadContT_info: error: undefined reference to 'transformerszm0zi5zi2zi0_ControlziMonadziTransziCont_zdfMonadContT_closure'
bazel-out/k8-fastbuild/bin/external/haskell_primitive__1834808089/libHSexternalZShaskellZUprimitiveZUZU1834808089ZSprimitive.a(Primitive.o):s2YP_info: error: undefined reference to 'transformerszm0zi5zi2zi0_ControlziMonadziTransziIdentity_zdfMonadTransIdentityT_closure'
```
Hazel's `cabal2bazel` calls the `./configure` script on packages that require this step. This is done using a call to `repository_ctx.execute`. It is common for configure scripts to look for tools like `gcc`. Since this happens in a repository rule, it has access to the regular environment (e.g. `$PATH`) even if `--experimental_strict_action_env` is set, as a quick test showed. This is a source of non-hermeticity: e.g. this step will fail if a user does not have a global `gcc` installed. It could also introduce cache misses with a remote cache due to differences in setup between machines. Furthermore, if the Bazel build later on configures a different cc toolchain, say using `nixpkgs_cc_configure`, this could potentially cause issues due to incompatibility between the toolchains.
I've been trying to use Hazel for the past day and a bit, and have unfortunately so far not been successful. I would submit a PR to try to fix this, but I really don't understand what the right way of using this repo is.

There is rules_haskell_examples, but it's old and does not work with recent Bazel (it uses `http_archive` without importing it, and depends on an old `rules_haskell` that does the same). Maybe I should just downgrade Bazel and use old versions of everything...

It would be helpful if the `README.md` instructions contained the `load("@bazel_tools//tools/build_defs/repo:http.bzl", "http_archive")` line.
`README.md` has code to add to `WORKSPACE` that has syntax errors in it around `HAZEL_SHA`: it's missing `"`s.

The version of Hazel that the `README.md` instructions point to (via git hash) has a `hazel_repositories` rule(?) that does not accept the `core_packages` attribute that the example code has, so it doesn't work.
Issue #53 makes it, as far as I can tell at least, impossible to use `rules_haskell`'s `haskell_register_toolchains`, since that does not set up a `@ghc` workspace.

Nevertheless, I still managed to make things build by importing `@ghc` with `nixpkgs_package` instead of using `haskell_register_toolchains`. However, I get `error while loading shared libraries: libHSrts_thr-ghc8.2.2.so` runtime errors which I don't know how to fix. That file is present somewhere in `/nix`, but I don't know how to point my binaries to it. I've found that there is a `:threaded-rts` target in `BUILD.ghc` that tries to work around this, but it doesn't seem to work.
What can I do to make this work? Is there some other repo like rules_haskell_examples that works with recent Bazel versions that I can look at?
The third-party build of conduit
currently fails. It was disabled in 1fcdcef to restore some semblance of CI to the project.
Using cabal_paths in a project results in an error when the path is "internal":
```
ERROR: .../BUILD:86:1: in _path_module_gen rule //...:Paths_....hs:
Traceback (most recent call last):
  File "/home/mbauer/.cache/bazel/_bazel_mbauer/bcdf842875ade63816596dd2168fa013/external/hazel/third_party/cabal2bazel/bzl/cabal_paths.bzl", line 34, in _impl_path_module_gen
    ctx.template_action(template = ctx.file._template, out..., ...)})
  File "/home/mbauer/.cache/bazel/_bazel_mbauer/bcdf842875ade63816596dd2168fa013/external/hazel/third_party/cabal2bazel/bzl/cabal_paths.bzl", line 39, in ctx.template_action
    paths.join("..", paths.relativize(ctx.label.w..."), ...)
  File "/home/mbauer/.cache/bazel/_bazel_mbauer/bcdf842875ade63816596dd2168fa013/external/hazel/third_party/cabal2bazel/bzl/cabal_paths.bzl", line 43, in paths.join
    paths.relativize(ctx.label.workspace_root, "externa...")
  File "/home/mbauer/.cache/bazel/_bazel_mbauer/bcdf842875ade63816596dd2168fa013/external/bazel_skylib/lib/paths.bzl", line 188, in paths.relativize
    fail(("Path '%s' is not beneath '%s'"...)))
Path '' is not beneath 'external'
```
I see there is a line mentioning this in cabal_paths.bzl:
https://github.com/FormationAI/hazel/blob/master/third_party/cabal2bazel/bzl/cabal_paths.bzl#L43-L46
It would be nice if this was supported.
We can avoid building these. `Stackage.hs` puts these into the to-be-built packages, but should put them into `core_packages`.
The tests currently involve a shell script defined only in the CircleCI configuration that builds a couple of random packages via direct workspace references, i.e. not how anybody actually uses Hazel.

A `local_repository` with its own `WORKSPACE` or similar would be a nice improvement, especially if it allowed us to test multiple snapshots at once for some characteristic set of packages. Maybe it's best to borrow whatever structure is used in rules_haskell.
I'm encountering a build failure when trying to use `cabal2bazel` on a Haskell library that makes use of `extra-source-files` and CPP `#include` directives.

For example, assume the following source files exist: `srcs/Some/Module.hs` and `includes/Some.Module.hs`, and that `srcs/Some/Module.hs` contains:

```haskell
{-# LANGUAGE CPP #-}
module Some.Module where
...
#include "../../includes/Some.Module.hs"
```
And the cabal file contains:

```
extra-source-files:
  includes/Some.Module.hs
```

Then the `cabal2bazel` build fails with an error of the form:

```
error: fatal error: ../../includes/Some.Module.hs: No such file or directory
```
The only reference to `extra-source-files` I could find is here, where it is used to construct `-I` command-line flags to GHC. It looks like, contrary to other source files, `extra-source-files` are not linked or copied into the working directory for builds. And even if they were, the include path would still be broken, because `cabal2bazel` arranges source files relative to their `hs-source-dirs` entry and not to their position in the original source tree.

What is the reason for arranging source files relative to their `hs-source-dirs` entry instead of their original position in the source tree? Couldn't the same effect be achieved by keeping files in their original source-tree position and passing additional `-i` arguments to GHC?

This would allow fixing the above issue by simply linking or copying the `extra-source-files` into the build tree.
Currently, `hazel_custom_package_github` only uses a single directory. This makes the rules a little redundant when we have multiple of them, and also means we download the same repo multiple times (I think).

For example, we have three separate repositories for opentracing:
https://github.com/FormationAI/platform-core/blob/131037553cfebe6702769703c342fdff495da238/WORKSPACE#L352

Figure out an API and approach so that a single call to that macro can refer to multiple subdirectories (maybe mapping package names to paths).
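A hypothetical shape for such an API — the `packages` attribute mapping package names to subdirectory paths, and all repository names here, are illustrative assumptions sketching the suggestion above, not something `hazel_custom_package_github` currently accepts:

```python
# Hypothetical: one fetch of the repository, several packages out of it.
hazel_custom_package_github(
    github_user = "some-user",
    github_repo = "opentracing",
    repo_sha = "...",
    # Proposed: map package names to subdirectories within the repo.
    packages = {
        "opentracing": "opentracing",
        "opentracing-zipkin": "opentracing-zipkin",
        "opentracing-wai": "opentracing-wai",
    },
)
```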
This PR in rules_haskell changed the way Haskell toolchains are handled: tweag/rules_haskell#610. The toolchains are not referenced by name anymore; rather, the best toolchain is picked. As an example, the rules_haskell `WORKSPACE` does not give the bindist toolchains a name anymore: https://github.com/tweag/rules_haskell/blob/12af32dee83ba654cc01a84a4b0f105e80c243a7/WORKSPACE#L102-L105

This means we can't generally pass the `ghc_workspace` to Hazel anymore. Instead, Hazel should, when needed, figure out the toolchain that's being used.
We should make `hazel_library` more generic so it can support targets besides just libraries.

Suggestion: make a generic `hazel` macro:

```python
def hazel(package, target=None):
    if not target:
        # Default: the package's library target.
        target = package
    elif target == package:
        # A binary named after its package.
        target = package + "_bin"
    ...
```

Socialize this design first to see how intuitive it is in practice.
I'm getting

```
ERROR: /private/var/tmp/_bazel_brandonchinn/0a8e6892997a7541bfd4e2d7913db675/external/haskell_clock_94755854/BUILD:17:1: HaskellHsc2hs external/haskell_clock_94755854/_hsc/clock/System/Clock.hs failed (Exit 1) hsc2hs failed: error executing command external/io_tweag_rules_haskell_ghc_darwin_amd64/bin/hsc2hs bazel-out/darwin-fastbuild/bin/external/haskell_clock_94755854/gen-srcs-clock/System/Clock.hsc -o ... (remaining 52 argument(s) skipped)
Use --sandbox_debug to see verbose messages from the sandbox
clang: warning: no such sysroot directory: '__BAZEL_XCODE_SDKROOT__' [-Wmissing-sysroot]
Clock.hsc:34:10: fatal error: 'hs_clock_darwin.c' file not found
#include "hs_clock_darwin.c"
         ^~~~~~~~~~~~~~~~~~~
1 error generated.
```

when building a package that depends on `clock`. It seems like there's some includes flag that's missing for files listed in cabal under `c-sources` or `include-dirs`?
The `integer-simple` package is part of the GHC distribution. However, it is not listed in the `core_packages` section of the Stackage LTS 12.4 snapshot as generated by the `Stackage.hs` script. It can be manually added to the `core_packages` as follows.

`WORKSPACE`:

```python
load("@ai_formation_hazel//:hazel.bzl", "hazel_repositories")
load("//hazel:packages.bzl", "core_packages", "packages")

hazel_repositories(
    core_packages = core_packages + {
        "integer-simple": "0.1.1.1",
    },
    packages = packages,
    ...
)
```

Since `integer-simple` is part of the GHC distribution, maybe it should already be part of `core_packages`?
From internal issue tracker:

The initial PR of merging cabal2build into Hazel has a workaround for happy and alex w.r.t. data files. Quoting /pull/3:

> Note: this PR manually checks in the data-files that Happy and Alex use
> and which are generated automatically by their custom Setup.hs scripts.
> Figuring out a better solution is TODO; however, those files' interfaces
> have been very stable, and upcoming releases of those packages
> will include them, so I think this is OK for now.

Look for a less janky approach.
I cannot build snap-server:
```
[nix-shell:~/projects/FormationAI/hazel]$ bazel build @haskell_snap_server//...
/nix/store/mmhdqindrpvg91a2vnrr76vxcn5a4n0x-ghc-8.2.2
/nix/store/gqg2vrcq7krqi9rrl6pphvsg81sb8pjw-gcc-wrapper-7.3.0
/nix/store/cmxaqb5cbzy4jk26na842n6hy1s4yn19-binutils-wrapper-2.28.1
INFO: Build options have changed, discarding analysis cache.
/nix/store/vg0s4sl74f5ik64wrrx0q9n6m48vvmgs-glibc-locales-2.26-131
/nix/store/7j4qqfbdx91j77cg79g72jvkxpqff1z5-c2hs-0.26.2-28-g8b79823
INFO: SHA256 (https://hackage.haskell.org/package/zlib-bindings-0.1.1.5/zlib-bindings-0.1.1.5.tar.gz) = c83bb438f9b6c5fe860982731eb8ac7eff993e8b56cbc15ef5b471f229f79109
DEBUG: /home/nicolas/.cache/bazel/_bazel_nicolas/a1508dc904dd03aa664eaed35fad9172/external/bazel_tools/tools/build_defs/repo/http.bzl:43:9: ctx.attr.build_file //third_party/haskell:BUILD.zlib_bindings, path /home/nicolas/projects/FormationAI/hazel/third_party/haskell/BUILD.zlib_bindings
INFO: Analysed 25 targets (66 packages loaded).
INFO: Found 25 targets...
ERROR: /home/nicolas/.cache/bazel/_bazel_nicolas/a1508dc904dd03aa664eaed35fad9172/external/haskell_snap_server/BUILD:17:1: error executing shell command: '${1+"$@"}' failed (Exit 1)
bazel-out/k8-fastbuild/bin/external/haskell_snap_server/gen-srcs-snap-server/System/SendFile.hsc:57:34: error:
    Not in scope: ‘SF.sendFile’
    Perhaps you meant ‘S.readFile’ (imported from Data.ByteString.Char8)
    No module named ‘SF’ is imported.
bazel-out/k8-fastbuild/bin/external/haskell_snap_server/gen-srcs-snap-server/System/SendFile.hsc:65:16: error:
    Not in scope: ‘SF.sendFileMode’
    No module named ‘SF’ is imported.
INFO: Elapsed time: 13.780s, Critical Path: 6.02s
INFO: 0 processes.
FAILED: Build did NOT complete successfully
```
because the following CPP conditional does not trigger (`LINUX` should be defined on my platform):

```haskell
-- System/SendFile.hsc
#if defined(LINUX)
import qualified System.SendFile.Linux   as SF
#elif defined(FREEBSD)
import qualified System.SendFile.FreeBSD as SF
#elif defined(OSX)
import qualified System.SendFile.Darwin  as SF
#endif
```

where `-DLINUX` is defined as follows in the cabal file of `snap-server`:

```
if os(linux) && !flag(portable)
  cpp-options: -DLINUX -DHAS_SENDFILE -DHAS_UNIX_SOCKETS
  other-modules:
    System.SendFile,
    System.SendFile.Linux
```
it looks like the GHC option is not enabled:
```
[nix-shell:~/projects/FormationAI/hazel]$ bazel query --output=build @haskell_snap_server//...
...
# /home/nicolas/.cache/bazel/_bazel_nicolas/a1508dc904dd03aa664eaed35fad9172/external/haskell_snap_server/BUILD:17:1
haskell_library(
  name = "snap-server",
  visibility = ["//visibility:public"],
  generator_name = "snap-server",
  generator_function = "cabal_haskell_package",
  generator_location = "/home/nicolas/.cache/bazel/_bazel_nicolas/a1508dc904dd03aa664eaed35fad9172/external/haskell_snap_server/BUILD:17",
  src_strip_prefix = "gen-srcs-snap-server/",
  srcs = ["@haskell_snap_server//:gen-srcs-snap-server/Snap/Http/Server.hs", "@haskell_snap_server//:gen-srcs-snap-server/Snap/Http/Server/Config.hs", "@haskell_snap_server//:gen-srcs-snap-server/Snap/Http/Server/Types.hs", "@haskell_snap_server//:gen-srcs-snap-server/Snap/Internal/Http/Server/Config.hs", "@haskell_snap_server//:gen-srcs-snap-server/Snap/Internal/Http/Server/Types.hs", "@haskell_snap_server//:gen-srcs-snap-server/System/FastLogger.hs", "@haskell_snap_server//:gen-srcs-snap-server/Control/Concurrent/Extended.hs", "@haskell_snap_server//:gen-srcs-snap-server/Snap/Internal/Http/Server/Address.hs", "@haskell_snap_server//:gen-srcs-snap-server/Snap/Internal/Http/Server/Clock.hs", "@haskell_snap_server//:gen-srcs-snap-server/Snap/Internal/Http/Server/Common.hs", "@haskell_snap_server//:gen-srcs-snap-server/Snap/Internal/Http/Server/Date.hs", "@haskell_snap_server//:gen-srcs-snap-server/Snap/Internal/Http/Server/Parser.hs", "@haskell_snap_server//:gen-srcs-snap-server/Snap/Internal/Http/Server/Session.hs", "@haskell_snap_server//:gen-srcs-snap-server/Snap/Internal/Http/Server/Socket.hs", "@haskell_snap_server//:gen-srcs-snap-server/Snap/Internal/Http/Server/Thread.hs", "@haskell_snap_server//:gen-srcs-snap-server/Snap/Internal/Http/Server/TimeoutManager.hs", "@haskell_snap_server//:gen-srcs-snap-server/Snap/Internal/Http/Server/TLS.hs", "@haskell_snap_server//:gen-srcs-snap-server/System/SendFile.hsc", "@haskell_snap_server//:gen-srcs-snap-server/System/SendFile/Linux.hsc"],
  deps = ["@haskell_snap_server//:Paths_snap_server", "@haskell_attoparsec//:attoparsec", "@haskell_base//:base", "@haskell_blaze_builder//:blaze-builder", "@haskell_bytestring//:bytestring", "@haskell_bytestring_builder//:bytestring-builder", "@haskell_case_insensitive//:case-insensitive", "@haskell_clock//:clock", "@haskell_containers//:containers", "@haskell_filepath//:filepath", "@haskell_io_streams//:io-streams", "@haskell_io_streams_haproxy//:io-streams-haproxy", "@haskell_lifted_base//:lifted-base", "@haskell_mtl//:mtl", "@haskell_network//:network", "@haskell_old_locale//:old-locale", "@haskell_snap_core//:snap-core", "@haskell_text//:text", "@haskell_time//:time", "@haskell_unix_compat//:unix-compat", "@haskell_vector//:vector", "@haskell_unix//:unix", "@ghc//:unix-includes", "@haskell_snap_server//:snap-server-cbits"] + [],
  compiler_flags = ["-XHaskell2010", "-fwarn-tabs", "-funbox-strict-fields", "-fno-warn-unused-do-bind", "-w", "-Wwarn", "-optP-DLINUX", "-optP-DHAS_SENDFILE", "-optP-DHAS_UNIX_SOCKETS"],
  hidden_modules = ["Control.Concurrent.Extended", "Snap.Internal.Http.Server.Address", "Snap.Internal.Http.Server.Clock", "Snap.Internal.Http.Server.Common", "Snap.Internal.Http.Server.Date", "Snap.Internal.Http.Server.Parser", "Snap.Internal.Http.Server.Session", "Snap.Internal.Http.Server.Socket", "Snap.Internal.Http.Server.Thread", "Snap.Internal.Http.Server.TimeoutManager", "Snap.Internal.Http.Server.TLS", "System.SendFile", "System.SendFile.Linux"],
  version = "1.0.3.3",
)
...
```
I'm on top of latest master, with this extra commit (required for zlib-bindings to build):

```
nicolas@nicolas-XPS-13-9370:~/projects/FormationAI/hazel$ git log -n 2
commit f2d2c670fe9ec224530f32bdb69b30c51d101819 (HEAD -> nm-zlib-bindings, da/nm-zlib-bindings)
Author: Nicolas Mattia <[email protected]>
Date:   Thu Dec 13 17:56:09 2018 +0400

    zlib-bindings: add custom build

commit ea415332cfd10cf607c386a5103932f731d29bfb (origin/master, origin/HEAD, da/master, master, da-master)
Author: Andreas Herrmann <[email protected]>
Date:   Thu Dec 6 20:52:25 2018 +0100

    Allow to set extra cabal flags (#49)
```
Revert 38657ad once we resolve this failure
```
$ bazel build --jobs=2 $(for p in $(cat test-packages.txt); do echo @haskell_$p//...; done)
INFO: Invocation ID: 0d834acb-0118-4319-82fa-4d0f8a80f63a
DEBUG: /home/tim/.cache/bazel/_bazel_tim/3dfb6465070eadfbfb3bf40803766bf8/external/bazel_skylib/lib.bzl:30:1: WARNING: lib.bzl is deprecated and will go away in the future, please directly load the bzl file(s) of the module(s) needed as it is more efficient.
INFO: Analysed 778 targets (0 packages loaded, 0 targets configured).
INFO: Found 778 targets...
ERROR: /home/tim/.cache/bazel/_bazel_tim/3dfb6465070eadfbfb3bf40803766bf8/external/haskell_language_c__176479442/BUILD:17:1: error executing shell command: '/bin/bash -c
export PATH=${PATH:-} # otherwise GCC fails on Windows
$(< bazel-out/k8-fastbuild/bin/external/haskell_language_c__176479442/ghc_args_language-c_HaskellBuildLibrary) $...' failed (Exit 1) bash failed: error executing command /bin/bash -c ... (remaining 1 argument(s) skipped)
Use --sandbox_debug to see verbose messages from the sandbox
bazel-out/k8-fastbuild/bin/external/haskell_language_c__176479442/gen-srcs-language-c/src/Language/C/Analysis/DeclAnalysis.hs:32:1: error:
    Could not find module ‘Language.C.Analysis.AstAnalysis’
    Use -v to see a list of the files searched for.
   |
32 | import {-# SOURCE #-} Language.C.Analysis.AstAnalysis (tExpr, ExprSide(..))
   | ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
```
Some packages (I think) require custom build instructions to compile properly in the environment provided by Hazel, `rules_haskell`, and (assuming this is the default mode of usage for `rules_haskell`) Nix/nixpkgs. As an example, `cryptonite-0.25` contains an `extra-lib: pthread` declaration that states it must be linked with the `pthread` library on POSIX. To get this to work, the rough steps are something like:

1. Pull `glibc` from nixpkgs using `nixpkgs_package` with a custom `BUILD` that uses `patchelf` (also pulled in from `nixpkgs`) to patch `libpthread` and expose it.
2. Supply `libpthread` to `extra_libs` when configuring Hazel.
3. Build `@haskell_cryptonite`.

This is complicated by the fact that `libpthread.so` is in fact an `ld` script that combines the dynamic and static portions of the library.

In short, a custom build process for a working copy of `libpthread` needs to be devised in Bazel and then supplied to Hazel using `extra_libs`. Having already encountered this issue for `libstdc++` (and not really considering that it might bite in other forms), tweag/rules_haskell#387 lays out what I thought a better interface to `rules_haskell` might be, to enable users of `rules_haskell` and Hazel to navigate this situation a bit better. However, on encountering this, and also examining bits of Hazel such as the custom `BUILD.*` files for `wai`, `conduit`, etc., I'm wondering: are we destined to implement a similar effort to the one already being undertaken by nixpkgs, whereby there is a large amount of custom packaging for finicky cases like this? If so, do we want to undertake this work? If not, what is the alternative (e.g. should we just use Haskell packages from nixpkgs and only offer things like `*_custom_package_github` for non-Hackage-package cases)?

CC @mboes, since perhaps this affects `rules_haskell` / there has been discussion around this already.
Hazel generates an external workspace for each of the packages provided in the `core_packages` and `packages` attributes to `hazel_repositories`, e.g. `haskell_vector` for `vector`. Hackage allows package names that differ only in case, e.g. `HSet` and `hset`. These will generate the workspace names `haskell_HSet` and `haskell_hset`. On a case-insensitive system, in particular on Windows, Bazel will not distinguish these two workspace names, leading to errors like the following:

```
ERROR: C:/users/user/_bazel_user/glk6p2o5/external/ai_formation_hazel/hazel.bzl:172:3: Label '@haskell_HSet//:files' is duplicated in the 'files' attribute of rule 'all_hazel_packages'
```

Commenting out instances of such names in the generated `hazel/packages.bzl` file resolves that issue. (Of course, that's just a workaround.)
I've done some preliminary tests on Windows with the following approach, and they show that it resolves this issue.

I propose to append case-sensitive information to the generated workspace names. To that end, we can use the `hash` function that Bazel offers, and append the hash of the package name to the generated workspace name. We can also lower-case workspace names to ensure that other issues related to case insensitivity are also caught on Linux.

This would generate workspace names like the following:

```
HSet      -> haskell_hset_2227962
hset      -> haskell_hset_3212026
SCalendar -> haskell_scalendar_191454385
scalendar -> haskell_scalendar__1877157711
```

(The double `_` occurs when the result of `hash` is a negative number.)
Disadvantages:

- The generated workspace names are harder to predict, e.g. when referring to a package manually, with `bazel query`, or in a package group.

Let me know what you think of this approach and I'll be happy to create a PR.