
Inferno

Infer, no?



This is the parser, type inference engine, and version control for a new functional scripting language.

Specifically, the project comprises:

  • Parser for the new functional language
  • Type checker/inference
  • Evaluator
  • Basic prelude
  • A version control server to manage script histories and versions

Nix prerequisites

We currently only offer a Nix-based build system for building and developing inferno packages. You can build the project components directly with Nix or enter a Nix-based development environment to build or work on the project with Cabal.

Install Nix v2.8 or greater

If you don't have Nix installed, follow the directions here. This repository uses flakes, a new Nix feature, and we recommend installing v2.8 or greater for the best compatibility.

Enable required flakes settings

Certain features that flakes require are still marked as experimental and must be explicitly enabled. These features are required to build or develop this project.

On non-NixOS systems, edit ~/.config/nix/nix.conf or /etc/nix/nix.conf and add the following lines:

experimental-features = nix-command flakes

On NixOS, you can add the same line to nix.extraOptions in your system configuration.
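
As a sketch (assuming a standard NixOS configuration.nix; the surrounding attribute set is illustrative), this corresponds to:

```nix
{
  # Enable the experimental features required by this project's flake
  nix.extraOptions = ''
    experimental-features = nix-command flakes
  '';
}
```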

Configure the binary caches

It is highly recommended to configure two extra Nix binary caches to download artifacts when building this project. We offer our own public Cachix cache (inferno) that is populated on CI. Since this project uses IOG's haskell.nix, you should also add IOG's binary caches. Although our cache contains some of the same artifacts as IOG's, you should still configure the latter in case a critical dependency (e.g. GHC) has not yet been cached by us, Cachix is experiencing an outage, or you make local changes that would require rebuilding a large dependency (e.g. upgrading to a new GHC version).

Important: If you do not enable at least IOG's binary cache, you will build GHC from source several times! This will take at least several hours in most cases.

There are two methods for enabling the caches. The flake will attempt to set the relevant values for you automatically; caveats apply to this process, however, so you may need to enable the caches manually.

Automatic configuration

When you first run a nix command in this repository, you will be prompted to allow certain configuration values to be set:

$ nix develop
do you want to allow configuration setting 'extra-substituters' to be set to 'https://cache.iog.io https://inferno.cachix.org' (y/N)? y
do you want to permanently mark this value as trusted (y/N)? y
do you want to allow configuration setting 'extra-trusted-public-keys' to be set to 'hydra.iohk.io:f/Ea+s+dFdN+3Y/G+FDgSq+a5NEWhJGzdjvKNGv0/EQ= inferno.cachix.org-1:48GsOmEfRPXpTZsQSmnD2P42lpbUeHrjlzyhasL5rug=' (y/N)? y
do you want to permanently mark this value as trusted (y/N)? y

Accepting these prompts will set the required configuration values for you. Marking them as trusted will ensure that they are used for future nix invocations in this repository. No further configuration is required.

Important: If you are on NixOS or otherwise using a multi-user Nix install, you must be a trusted user to set substituters. If you are not a trusted user, enabling the options prompted by the flake will have no effect (non-trusted users are disallowed from doing this) and you must configure the caches manually.

If you see output similar to

warning: ignoring untrusted substituter 'https://cache.iog.io'

when running a nix command, you are not a trusted user and the settings from the flake will not be applied even if you have selected y for each prompt.

On non-NixOS systems, add the following to the system-wide configuration (/etc/nix/nix.conf):

trusted-users = <username> root

You can also use a group name by prefixing it with @, e.g. to add all members of the wheel group:

trusted-users = @wheel root

On NixOS, add the user/group name to the list under nix.settings.trusted-users.
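
For example (an illustrative snippet for configuration.nix, using the wheel group as above):

```nix
{
  # Allow root and all members of wheel to configure substituters
  nix.settings.trusted-users = [ "root" "@wheel" ];
}
```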

If you do not wish to add yourself as a trusted user, you will need to configure the binary caches manually as explained below.

Manual configuration

IOG's cache

You can configure IOG's cache manually by following the instructions here. Again, not enabling this cache will require you to build GHC from source several times.

The inferno cache

The inferno cache can be manually enabled in two ways:

  • By installing the cachix CLI tool and then running cachix use inferno. This method is preferable if you are already using Cachix for other binary caches (e.g. https://nix-community.cachix.org/)
  • By manually copying the cache URL (https://inferno.cachix.org) and key (inferno.cachix.org-1:48GsOmEfRPXpTZsQSmnD2P42lpbUeHrjlzyhasL5rug=) to substituters and trusted-public-keys, respectively, in /etc/nix/nix.conf
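
Written out, the second method amounts to adding lines like the following to /etc/nix/nix.conf (values taken from above; using the extra- variants preserves the default cache.nixos.org settings):

```text
extra-substituters = https://inferno.cachix.org
extra-trusted-public-keys = inferno.cachix.org-1:48GsOmEfRPXpTZsQSmnD2P42lpbUeHrjlzyhasL5rug=
```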

Troubleshooting

If you find yourself building GHC despite having set the required configuration values (or allowed the flake to do so for you), something has gone wrong:

  • If you set the cache values manually, make sure that you restarted the Nix daemon on non-NixOS systems
  • If you accepted the prompts from the flake, you may not have permissions to set these values. Either set them manually in your system-wide configuration or continue reading below

Note: There is currently a known issue that causes Nix cache misses even when the caches are configured properly, both locally and on GitHub Actions. If you find your PR's CI is building GHC, try cancelling and re-running it.

Building or working on the project

Once you have completed the steps above, you can use the nix command to build or work on the project.

Building/running components

Executables

Use nix build .#<PACKAGE> to build project components or executables. For example, to build vscode-inferno-syntax-highlighting, run the following:

$ nix build .#vscode-inferno-syntax-highlighting

This will create a symlink in $PWD/result:

$ ls -H ./result
inferno.monarch.json  vscode-inferno-syntax-highlighting-0.0.1.vsix

nix build by itself will build all components and run all tests.

Tests

To run tests for a particular project component, you can use nix build -L .#checks.<SYSTEM>.<TEST> where <SYSTEM> corresponds to your platform and OS (e.g. x86_64-linux). All project tests are part of the checks flake output.

$ nix build -L .#checks.x86_64-linux.inferno-core:test:inferno-tests
inferno-core-test-inferno-tests> unpacking sources
inferno-core-test-inferno-tests> source root is inferno-src-root-inferno-core-test-inferno-tests-root
inferno-core-test-inferno-tests> patching sources
inferno-core-test-inferno-tests> configuring
inferno-core-test-inferno-tests> Configure flags:
# etc...

(Do note that nix flake check, a command which runs all checks among other things, will not work; see here for more information.)

To run all tests and build all packages, build packages.<SYSTEM>.default:

$ nix build -L

Running apps

To run an application directly via Nix, use nix run .#<APP>, e.g.

$ nix run .#inferno-lsp-server

This is equivalent to the following:

$ nix build .#inferno-lsp-server
$ ./result/bin/inferno-lsp-server

Entering a development environment

To enter a development environment suitable for working on the inferno project itself, use nix develop. cabal, haskell-language-server, and other development tools will be available in the shell.

$ nix develop
$ cabal repl inferno-core

Do note that building with Cabal directly outside of this Nix environment (that is, by installing the package set directly with a version of Cabal installed on your system) will not work.

Developing frontend packages

There are two flake packages that build VS Code extensions for Inferno, vscode-inferno-syntax-highlighting and vscode-inferno-lsp-server. Two identically named devShells correspond to these packages and can be entered to work on them. After entering the development environment, cd to the directory containing the sources for the extension; for example:

$ nix develop .#vscode-inferno-syntax-highlighting
$ cd ./vscode-inferno-syntax-highlighting

npm and all of the JS dependencies are available in the shell. The NODE_PATH points to generated node_modules in the Nix store and the environment variable NPM_CONFIG_PACKAGE_LOCK_ONLY is enabled. This is to ensure that the same dependencies are used everywhere (i.e. in the flake's packages as well as the corresponding devShells). Accordingly, npm install will only update the package-lock.json. After modifying dependencies listed in the package.json, update the lockfile, exit the shell, and then re-enter it using nix develop.

Building/running Inferno-ML on GPU

To build or run Inferno on GPU machines, use the ghc925-cuda dev shell:

nix develop .#ghc925-cuda

You can run a test on the GPU with:

cabal run inferno-ml-exe inferno-ml/test/test-gpu.inferno

The expected output is Tensor Float [] 8.5899e9.

Development with pytorch

When working with inferno-ml, there may be cases where you need to use pytorch directly. The devShells.<SYSTEM>.pytorch shell provides a development environment with the torch-bin package, its dependencies, and a Python interpreter:

nix develop .#pytorch

The torch version and its dependencies are the same as those used in Inferno's Hasktorch integration and should be compatible with inferno-ml.

Formatting all sources

To format all of the Nix and Haskell sources, run nix fmt. Alternately, running nix develop and then the command treefmt in the development shell will perform the same formatting.

To run a formatting check that will fail if any files are not formatted, run nix build -L .#checks.<SYSTEM>.treefmt.

NOTE: Ormolu currently segfaults during compilation on aarch64-darwin (M1 Macs). ormolu is accordingly omitted from the formatter (formatters.aarch64-darwin) and formatting check (checks.aarch64-darwin.treefmt). CI will still fail on unformatted Haskell sources as it runs on x86_64-linux, so it is recommended to install ormolu-0.5.0.1 on your system using an alternate source. See #10 for more.

Profiling

We have a profiling build of the inferno binary, which can be used via:

$ nix run .#inferno-core:exe:inferno-ghc925-prof

Or equivalently:

$ nix build .#inferno-core:exe:inferno-ghc925-prof
$ ./result/bin/inferno

One can also obtain a shell with profiling enabled:

nix develop .#ghc924-prof

You can also build packages and checks for the profiling configuration by appending -ghc924-prof to the attribute name, e.g. nix build .#checks.x86_64-linux.inferno-core:test:inferno-tests-ghc924-prof.

Golden Files

The golden files for the exported frontend types currently live in inferno-core/golden. This will likely change, as all the exported types should ideally be defined in inferno-types.

From raw string to evaluated expression

The way we store and evaluate expressions in Inferno is fairly involved and spans several modules. Below is a high-level overview of how we go from a raw string to a fully evaluated expression:

┌──────────────┐                   ┌──────────────────────┐
│              │                   │  Pinned modules      │
│  Raw string  │     ┌─────────────┤                      │
│              │     │             │  available in scope  │
└───────┬──────┘     │             └──┬───────┬────────┬──┘
        │            │                │       │        │
        ▼            ▼                │       │        │
┌─────────────────────────────┐       │       │        │
│  Parse (using fixity        │       │       │        │
│                             │       │       │        │
│  information from modules)  │       │       │        │
└────────────────────┬────────┘       │       │        │
                     │                │       │        │
                     ▼                ▼       │        │
                ┌──────────────────────────┐  │        │
                │  Pin all free variables  │  │        │
                │                          │  │        │
                │  and enums               │  │        │
                └─────────────────┬────────┘  │        │
                                  │           │        │
                                  ▼           ▼        │
                              ┌────────────────────┐   │
                              │  Typecheck pinned  │   │
                              │                    │   │
                              │  expression        │   │
                              └──────────┬─────────┘   │
                                         │             │
                                         ▼             ▼
                                       ┌───────────────────┐
                                       │  Evaluate pinned  │
                                       │                   │
                                       │  expression       │
                                       └───────────────────┘

Parsing

The first step is parsing a raw string into an Expr () SourcePos, which is the type of an Inferno AST. The first parameter () will later be used to attach a hash to every free variable inside the expression, as well as to any operator or enum. The second parameter SourcePos is used by the UI to display type information/completion hints and to attach parsing/typechecking error messages to the specific location.

Internally, parsing is split into two steps: we first parse the AST and the comments separately, and then use the insertCommentsIntoExpr function to attach the comments to the nearest logical block within the AST (this is not always optimal).

When parsing, we can encounter blocks such as open Foo in .... When this happens, the parser looks up Foo in its environment and uses the OpsTable for Foo to bring any infix operators defined within Foo into scope.

Pinning

To simplify evaluation and certain operations on expressions stored in Inferno's version control, an additional step between parsing and type inference was introduced. The pinExpr function is used to resolve every free variable (i.e. one not bound by a fun, case, or let) to a hash. This hash is either stored in the version control (for expressions kept under version control) or is the hash of one of the internal functions built into the prelude/builtin modules.

Having this explicit step means that inference and evaluation are somewhat simplified: since the hashes are (or at least should be) unique, we don't need to elaborately build the typechecking/evaluation environments. We can simply merge all the environments of the required modules into one without worrying about name shadowing.
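
The point about merging module environments can be illustrated with plain dictionaries (Python, purely as an illustration; the names are invented): because entries are keyed by unique hashes rather than names, two modules that both define a function called sum cannot collide when merged.

```python
# Two modules each define a function named "sum", but their environments
# are keyed by (unique) content hashes, so merging them loses nothing.
module_a = {"hash_a123": ("sum", "<module A's definition>")}
module_b = {"hash_b456": ("sum", "<module B's definition>")}

# Merge the environments into one; no name shadowing can occur
merged = {**module_a, **module_b}
assert len(merged) == 2  # both definitions survive the merge
```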

However, the main advantage of this approach is that we can keep track of all the direct dependencies of any expression directly in its AST. This greatly simplifies the evaluation of an AST already in the store, which simply comprises:

  1. computing the closure of the given expression, by recursively fetching all its direct dependencies from the VC store and in turn fetching their dependencies, until we hit the builtin prelude functions which are built into the evaluator.
  2. putting all the collected expressions into the evaluation env/context
  3. evaluating the expression
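
The three steps above can be sketched as follows (Python, with an illustrative in-memory VC store; the names here are not Inferno's actual API):

```python
# Illustrative model of step 1: each store entry maps a hash to
# (expression, direct dependency hashes). "Builtin" hashes are those
# baked into the evaluator's prelude, where recursion stops.

def closure(store, builtins, root):
    """Recursively collect the transitive dependency closure of `root`."""
    seen = {}

    def go(h):
        if h in seen or h in builtins:
            return
        expr, deps = store[h]
        seen[h] = expr
        for d in deps:
            go(d)

    go(root)
    return seen

# Toy store: script "a" depends on "b", which depends on builtin "sum".
store = {
    "a": ("let x = b in x + 1", ["b", "sum"]),
    "b": ("sum [1, 2]", ["sum"]),
}
env = closure(store, builtins={"sum"}, root="a")
# `env` now holds the expressions for "a" and "b"; together with the
# builtin prelude this forms the evaluation environment (step 2), after
# which the root expression can be evaluated (step 3).
assert set(env) == {"a", "b"}
```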

Typechecking

This step is fairly standard: we simply collect all the hashes and the associated types for the modules in scope and then proceed with typechecking.

Evaluation

As discussed in the pinning section, evaluation is performed on a fully typechecked and pinned expression.

Developing Inferno scripts with VScode

In a shell, go to ~/inferno (unless you cloned inferno into a different location) and run nix build .#vscode-inferno-lsp-server. Then in VS Code, press Ctrl+Shift+P and run Install from VSIX. In the dialog, navigate to ~/inferno/result and select inferno-lsp-server.vsix.

Do the same for the VSIX created by nix build .#vscode-inferno-syntax-highlighting.

In a shell, go to ~/inferno and run nix build .#inferno-lsp-server (nix build .#inferno-ml-lsp-server-ghc925 for inferno-ml). Run ls -al result and copy the /nix/store/... directory to your clipboard. Open VS Code, press Ctrl+Shift+P, and search for Open User Settings. Search for Inferno, find the Inferno LSP extension tab, and open it. Paste the directory you copied into the Path to the inferno-lsp-server executable field.

Be sure to append /bin/inferno-lsp-server (/bin/inferno-ml-lsp-server for inferno-ml) to the end of the path, then restart VS Code.

Create a new file with the .inferno extension. If you begin typing an inferno command such as Array.argmax, the autocomplete box should appear.

Next, add

"tasks": [
        {
            "label": "inferno",
            "type": "shell",
            "command": "cd ~/inferno; nix run .#inferno -- ${file}",
            "problemMatcher": [],
            "group": {
                "kind": "build",
                "isDefault": true
            }
        }
    ]

to your tasks.json file in VS Code.

If you cloned inferno into a different location, change "command": "cd ~/inferno; accordingly.

For inferno-ml, change nix run .#inferno -- ${file} to nix run .#inferno-ml -- ${file}.

You should now be able to build .inferno scripts using Ctrl+Shift+B in VS Code.

Examples

Try saving the following into an .inferno file and compiling it.

let arr = [1, 2, 3, 4] in
let sum = Array.sum arr in
sum

The file should compile and output

10.0

If you're using inferno-ml, you can also try the following:

let arr = ML.ones ML.#int [10] in
let arr2 = ML.ones ML.#int [10] in
let arr3 = ML.stack 0 [arr, arr2] in
arr3

The file should compile and output

Tensor Int64 [2,10] [[ 1,  1,  1,  1,  1,  1,  1,  1,  1,  1],
                     [ 1,  1,  1,  1,  1,  1,  1,  1,  1,  1]]

Importing a torchscript model into inferno

In an Inferno script, you can load a model using the ML.loadModel function. For instance: let model = ML.loadModel "path/to/model/<model_name>.ts.pt" in ...

You can pass arguments of type array of tensor to the model by passing them to ML.forward along with your model:

let outputs = ML.forward model inputs in

where inputs is an array of tensors. This line assigns the return value to outputs, but any other binding (pattern matching on the return value, for instance) would also work.

Guidelines for model compatibility with inferno

For general information on how you should convert your python files to torchscript models, please see the online documentation for torchscript: https://pytorch.org/docs/stable/jit.html

Inferno's only extra requirement is that a torchscript model must take and return only tensors. Standard non-tensor Python types are allowed internally.

Be sure to use the correct version of torchscript. You can get it by building the Python interpreter with nix build .#pytorch and selecting it in VS Code: press Ctrl+Shift+P, search for Python: Select Interpreter, and choose the one with a /nix/store/ path. If you have other Python libraries that you'd like included in your Nix build, you can add them to flake.nix, in this list in the pytorch section:

ps: with ps; [
  pytorch-bin
  torchvision-bin
  torchaudio-bin
  # <-- add library name here
]

If you'd like to install it locally, the correct version of torchscript can also be found here:

https://github.com/plow-technologies/inferno/blob/8d1a5eee65fab73afc28c185199b44ece9b97b30/.github/workflows/build.yml#L50-L51

Contributors

albertov, antonioc76, daniel-diaz, goodlyrottenapple, ltibbetts, mirko-plowtech, ngua, shulhi, siddharth-krishna


inferno's Issues

`vscode-inferno-syntax-highlighting` is broken

#11 removed the vsce input from the packages defined in vsCodeInfernoFor. This needs to be added back for vscode-inferno-syntax-highlighting:

error: builder for '/nix/store/...-vscode-inferno-syntax-highlighting-0.0.1.drv' failed with exit code 127;
       last 10 log lines:
       > patching sources
       > updateAutotoolsGnuConfigScriptsPhase
       > configuring
       > patching script interpreter paths in .
       > building
       >
       > > [email protected] build-tm
       > > node build-tm
       >
       > /nix/store/ncm43bbk3nfwpf7pycgz1cnasnihcmkg-stdenv-darwin/setup: line 1408: vsce: command not found

Set up binary cache and use Nix on CI

  • Set up Cachix cache
  • Rewrite the GH Actions config to use Nix and Cachix actions and build everything in the flake
  • Update the docs to instruct users on how to use Cachix
  • Possibly set the substituters and trusted-public-keys for the Cachix cache in the flake directly so users don't need to install the Cachix CLI

Torch version mismatch

As far as I can tell, Nix is not grabbing the version of pytorch specified here:

# Need specific torch version for the serialized torchscript models (e.g. Bert) to be compatible with hasktorch:
pip install torch==1.11.0+cpu torchvision==0.12.0+cpu torchaudio==0.11.0 --extra-index-url https://download.pytorch.org/whl/cpu

Add `devShells` for NPM packages

Currently it's kind of a pain to work on or fix the vsCodeInfernoFor packages as there are no devShells with npm, etc... in them

WIP: A Concurrency Tester for Inferno-VC (and other servant servers)

This issue documents partial progress towards testing inferno-vc-server for concurrency bugs (#19 , #36).

The basic idea is to use the servant-quickcheck package to test the VC server on arbitrary requests. In order for this to unearth concurrency bugs, the following steps are needed:

  1. Add a concurrent harness to servant-quickcheck so that it makes requests from multiple threads simultaneously. (Partial progress towards this is on my fork here.)
  2. The concurrent harness above is based on the stale package pqc (see my fork for GHC 9), but it needs improvements like early exit when one thread fails the test, logging the concurrent history of requests that caused an error, etc.
  3. Purely random requests are not very good at exercising race conditions and catching bugs. For example, inferno-vc needs to be called with successive stores on the same script in order to exhibit some of the bugs (#36). We should modify the generator to generate requests from a small pool of object hashes so that the tests are more realistic. (Partial progress on the sidk-conc-tests branch)
  4. The default arbitrary generator for Expr generates huge expressions, which causes tests to time out silently. A hacky fix is to add let reqs = resize 2 $ ... in servant-quickcheck's serverSatisfiesMgr. A better way forward would be to also limit the expressions used in test requests.

(CC: @smurphy8 )

Switch to `treefmt-nix`

Currently we're using treefmt to handle formatting. The developers recently split most of the Nix stuff into a separate repo with a flakeModule output that automates some things we're doing manually now. Also, the current formatter output isn't ideal in that it requires several executables to be on the PATH when called (treefmt-nix fixes that)

Pattern match is implemented incorrectly

The whole case-matching code is implemented incorrectly; see this minimal example:

match (latestValue input0, latestValue input1) with {
|  (Some input0, Some input1) -> input1
|  _ -> 0
}

which throws Custom values currently unsupported, clearly indicating that input1 in the body of the case match is incorrectly bound.

The issue arises in the matchTuple function, specifically in the <> combination of envs. The problem is when we combine envs with shadowed vars, the current implementation erases previously shadowed vars and restores the vars in the parent context. The fix is to only collect the env from the matching scope and insert it into the current env at the very top:

      matchAny v ((_, p, _, body) :| []) = case match v p of
        Just nenv -> 
          -- union is left biased so this will correctly override any shadowed vars from nenv onto env
          eval (nenv `union` env) body
        Nothing -> throwError $ RuntimeError $ "non-exhaustive patterns in case detected in " <> (Text.unpack $ renderPretty v)
      matchAny v ((_, p, _, body) :| (r : rs)) = case match v p of
        Just nenv -> eval (nenv `union` env) body
        Nothing -> matchAny v (r :| rs)

      match v p = case (v, p) of
        (_, PVar _ (Just (Ident x))) -> Just $ (Map.singleton (ExtIdent $ Right x) v, mempty)
        (_, PVar _ Nothing) -> Just mempty
        (VEnum h1 _, PEnum _ (Just h2) _ _) ->
          if h1 == h2
            then Just mempty
            else Nothing
        (VInt i1, PLit _ (LInt i2)) ->
          if i1 == i2
            then Just mempty
            else Nothing
        (VDouble d1, PLit _ (LDouble d2)) ->
          if d1 == d2
            then Just mempty
            else Nothing
        (VText t1, PLit _ (LText t2)) ->
          if t1 == t2
            then Just mempty
            else Nothing
        (VWord64 h1, PLit _ (LHex h2)) ->
          if h1 == h2
            then Just mempty
            else Nothing
        (VOne v', POne _ p') -> match v' p'
        (VEmpty, PEmpty _) -> Just mempty
        (VTuple vs, PTuple _ ps _) -> matchTuple vs $ tListToList ps
        _ -> Nothing

      matchTuple [] [] = Just mempty
      matchTuple (v' : vs) ((p', _) : ps) = do
        env1 <- match v' p'
        env2 <- matchTuple vs ps
        -- since variables in patterns must be linear,
        -- env1 and env2 should not overlap, so it doesn't
        -- matter which way around we combine
        return $ env1 `union` env2
      matchTuple _ _ = Nothing
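
The behaviour the fix relies on — Haskell's Data.Map.union being left-biased, so nenv `union` env prefers bindings from the match — can be mimicked with plain Python dicts (purely as an illustration; the variable names echo the example above). With dict unpacking, later entries win, so the outer env goes first:

```python
# env: outer evaluation environment; nenv: bindings produced by matching
# the pattern. In Haskell, `nenv `union` env` is left-biased and prefers
# nenv; in Python, {**env, **nenv} lets nenv override env the same way.
env = {"input1": "outer value", "input0": "outer value"}
nenv = {"input1": "matched value"}  # the pattern rebinds input1

combined = {**env, **nenv}
assert combined["input1"] == "matched value"  # shadowed var correctly overridden
assert combined["input0"] == "outer value"    # untouched bindings survive
```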

Quick succession of saving script leads to broken history state

If you save the script in quick succession, you will encounter a TryingToAppendToNonHead exception.

(screenshot: script history showing HEAD at the second-latest version)

The script history is also broken (see screenshot above). It thinks that the HEAD is the second latest one. If you pick the actual latest version, you won't be able to save the script as it throws TryingToAppendToNonHead exception.

I think we need some sort of lock while saving the script.

`devShells.ghc884` is broken

I haven't had a chance to look into this very thoroughly, but I believe that devShells.ghc884 is broken:

  • ormolu-0.5.0.1 requires base >=4.14 && <5.0
  • base-4.13 was bundled with GHC 8.8.4
  • base is a non-upgradeable package

Since ormolu is obtained via haskell-nix.tool, we get an incompatible build plan.

Possible solutions:

  • drop devShells.ghc884 altogether (not ideal)
  • use an older ormolu version (also not ideal)
  • don't include ormolu in devShells with older GHC versions
  • switch to haskellPackages.ormolu_0_5_0_1 from nixpkgs (maybe the best solution?)

Ormolu segfaults on M1 mac

Trying to enter the dev shell via nix develop on an M1 Mac segfaults with a clang linker segfault building ormolu:

➜  inferno git:(main) nix develop
warning: Using saved setting for 'extra-substituters = https://cache.iog.io' from ~/.local/share/nix/trusted-settings.json.
warning: Using saved setting for 'extra-trusted-public-keys = hydra.iohk.io:f/Ea+s+dFdN+3Y/G+FDgSq+a5NEWhJGzdjvKNGv0/EQ=' from ~/.local/share/nix/trusted-settings.json.
error: builder for '/nix/store/a00q1phmpqq1bd9d9l256lafjqgks69f-ormolu-exe-ormolu-0.5.0.1.drv' failed with exit code 1;
       last 10 log lines:
       > Configuring executable 'ormolu' for ormolu-0.5.0.1..
       > building
       > Preprocessing executable 'ormolu' for ormolu-0.5.0.1..
       > Building executable 'ormolu' for ormolu-0.5.0.1..
       > [1 of 2] Compiling Paths_ormolu     ( dist/build/ormolu/autogen/Paths_ormolu.hs, dist/build/ormolu/ormolu-tmp/Paths_ormolu.o, dist/build/ormolu/ormolu-tmp/Paths_ormolu.dyn_o )
       > [2 of 2] Compiling Main             ( app/Main.hs, dist/build/ormolu/ormolu-tmp/Main.o, dist/build/ormolu/ormolu-tmp/Main.dyn_o )
       > Linking dist/build/ormolu/ormolu ...
       > /nix/store/48py6zrawzim9ghrnkqwm36jl4j1l23x-clang-wrapper-11.1.0/bin/ld: line 256: 56158 Segmentation fault: 11  /nix/store/5wvlj00dr22ivh210b18ccv1i60h6c1q-cctools-binutils-darwin-949.0.1/bin/ld ${extraBefore+"${extraBefore[@]}"} ${params+"${params[@]}"} ${extraAfter+"${extraAfter[@]}"}
       > clang-11: error: linker command failed with exit code 139 (use -v to see invocation)
       > `clang' failed in phase `Linker'. (Exit code: 139)
       For full logs, run 'nix log /nix/store/a00q1phmpqq1bd9d9l256lafjqgks69f-ormolu-exe-ormolu-0.5.0.1.drv'.
error: 1 dependencies of derivation '/nix/store/70cjpq6acb0zgvrjw89g8zyidy6lhwjn-ghc-shell-for-packages-env.drv' failed to build

Fix slow parsing of tuple patterns and types

Example script:

fun x -> match x with {
  | ((((((((((((x)))))))))))) -> 2
}

We should use tupleElems instead of tupleArgs (which is slower) in Parse.hs for tuple patterns and type lists as well.

Type inference crashes on large pattern (mis)match

let foo =
  fun input0 input1 input2 input3 input4 input5 input6 input7 input8 input9 input10 ->
    let latestValue = fun i -> Some 0.0 in
    match (latestValue input0, latestValue input1, latestValue input2, latestValue input3, latestValue input4, latestValue input5, latestValue input6, latestValue input7, latestValue input8, latestValue input9, latestValue input10, latestValue input0) with {
      | (Some x0, Some x1, Some x2, Some x3, Some x4, Some x5, Some x6, Some x7, Some x8, Some x9, Some x10) ->
        Some (truncateTo 2 (x0 + x1 + x2 + x3 + x4 + x5 + x6 + x7 + x8 + x9 + x10))
      | _ -> None }
in foo 0 0 0 0 0 0 0 0 0 0 0

This script results in:

inferno: Prelude.undefined
CallStack (from HasCallStack):
  error, called at libraries/base/GHC/Err.hs:74:14 in base:GHC.Err
  undefined, called at src/Inferno/Infer/Error.hs:75:12 in inferno-core-0.3.1-inplace:Inferno.Infer.Error

This only happens when you try to match a 11-tuple with a 12-tuple. For smaller numbers you get the correct type error.

Here's a script to generate such examples:

script = '''
let foo =
fun {funArgs} ->
let latestValue = fun i -> Some 0.0 in
match ({matchArgs}) with {{
| ({pat}) ->
Some (truncateTo 2 ({sum}))
| _ -> None }}
in foo {callArgs}'''

def mk_script(n):
    args = [f'input{i}' for i in range(n)]
    with open('big-match.inferno', 'w') as f:
        f.write(script.format(
            funArgs=' '.join(args),
            matchArgs=', '.join((f'latestValue {i}' for i in args + args[0:1])),
            pat=', '.join((f'Some x{i}' for i in range(n))),
            sum=' + '.join(f'x{i}' for i in range(n)),
            callArgs=' '.join(('0' for i in range(n)))
        ))
mk_script(11)

Upgrade to GHC 9.2.5

#20 bumped the haskell.nix revision. It seems that 9.2.5 will be the new supported 9.2.x compiler version. I've noticed at least one cache miss with 9.2.4: devShells.<SYSTEM>.default now requires rebuilding GHC itself if shell.tools.haskell-language-server is configured. I've confirmed that bumping the version to 9.2.5 avoids this (although there are other unrelated build issues with HLS plugins). I disabled HLS in #24, but we will probably see more cache misses in the future if we don't upgrade from 9.2.4.

Figure out caching problem

Ever since the GHC 9.2.5 upgrade there have been very frequent cache misses. I don't think this is an issue with Cachix; I've checked open issues for Cachix (and its GH action) and can't find anything related. Cachix is very widely used so if there were an issue with this I imagine it would have been reported by now

We might be getting IOG cache misses. I'm going to try to bump the haskell.nix input and make some other changes to see if I can get it cached reliably
