hpack's Introduction

hpack: A modern format for Haskell packages

Hpack is a format for Haskell packages. It is a modern alternative to the Cabal package format and follows different design principles.

Design principles

The guiding design principles for Hpack are:

  • Don't require the user to state the obvious, make sensible assumptions by default
  • Give the user 100% control when needed
  • Don't require the user to repeat things, facilitate DRYness

Tool integration

Hpack packages are described in a file named package.yaml. Both cabal2nix and stack support package.yaml natively. For other build tools the hpack executable can be used to generate a .cabal file from package.yaml.

There is no user guide

There is reference documentation below, but introductory documentation is still lacking. For the time being, take a look at the slides from my talk about Hpack at the Singapore Haskell meetup: http://typeful.net/talks/hpack

Examples
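
The original list of example projects is not reproduced here. As a rough illustration, a minimal package.yaml might look like the following sketch; the package name, directory layout and executable name are made up:

name: example-package            # hypothetical package name
version: 0.1.0.0
synopsis: A minimal example package

dependencies:
  - base

library:
  source-dirs: src

executables:
  example-exe:                   # hypothetical executable name
    main: Main.hs
    source-dirs: app
    dependencies:
      - example-package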

Documentation

Handling of Paths_ modules

Cabal generates a Paths_ module for every package. How exactly Hpack behaves with regard to that module depends on the value of the spec-version field.

If spec-version is explicitly specified and is at least 0.36.0, the modern behavior is used; otherwise Hpack falls back to the legacy behavior.

To use the modern behavior, require at least

spec-version: 0.36.0

in your package.yaml.

Modern behavior

If you want to use the Paths_ module for a component, you have to explicitly specify it under generated-other-modules.

Example:

library:
  source-dirs: src
  generated-other-modules: Paths_name # substitute name with the package name

Legacy behavior

For historical reasons, Hpack adds the Paths_ module to other-modules when generating a .cabal file.

To prevent Hpack from adding the Paths_ module to other-modules add the following to package.yaml:

library:
  when:
  - condition: false
    other-modules: Paths_name # substitute name with the package name

Quick-reference

Top-level fields

| Hpack | Cabal | Default | Notes | Example | Since |
| --- | --- | --- | --- | --- | --- |
| spec-version | | | The minimum version of hpack that is required to parse this package description. | spec-version: 0.30.0 | 0.30.0 |
| name | · | | | | |
| version | · | 0.0.0 | | | |
| synopsis | · | | | | |
| description | · | | | | |
| category | · | | | | |
| stability | · | | | | |
| homepage | · | If github given, <repo>#readme | | | |
| bug-reports | · | If github given, <repo>/issues | | | |
| author | · | | May be a list | | |
| maintainer | · | author | May be a list | | |
| copyright | · | | May be a list | | |
| license | · | Inferred from license-file | Both SPDX license expressions and traditional Cabal license identifiers are accepted. | license: MIT | SPDX: 0.29.0 |
| license-file | license-file or license-files | LICENSE if file exists | May be a list | | |
| tested-with | · | | May be a list (since 0.34.3) | | |
| build-type | · | Simple, or Custom if custom-setup exists | Must be Simple, Configure, Make, or Custom | | |
| extra-source-files | · | | Accepts glob patterns | | |
| extra-doc-files | · | | Accepts glob patterns | | 0.21.2 |
| data-files | · | | Accepts glob patterns | | |
| data-dir | · | | | | |
| github | source-repository head | | Accepts owner/repo or owner/repo/subdir | github: foo/bar | |
| git | source-repository head | | No effect if github given | git: https://my.repo.com/foo | |
| custom-setup | · | | See Custom setup | | |
| flags | flag <name> | | Map from flag name to flag (see Flags) | | |
| library | · | | See Library fields | | |
| internal-libraries | library <name> | | Map from internal library name to a dict of library fields and global top-level fields. | | 0.21.0 |
| executables | executable <name> | | Map from executable name to executable (see Executable fields) | | |
| executable | executable <package-name> | | Shortcut for executables: { package-name: ... } | | 0.18.0 |
| tests | test-suite <name> | | Map from test name to test (see Test fields) | | |
| benchmarks | benchmark <name> | | Map from benchmark name to benchmark (see Benchmark fields) | | |
| defaults | | | See Defaults, may be a list | | |
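
The internal-libraries field has no dedicated example in this document. A minimal sketch, with made-up names and directories, might look like this:

internal-libraries:
  example-internal:              # hypothetical internal library name
    source-dirs: internal

library:
  source-dirs: src
  dependencies:
    - example-internal           # components of the same package can depend on it by name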

cabal-version

Hpack does not require you to specify a cabal-version manually. When generating a .cabal file, Hpack sets the cabal-version automatically based on the features that are used.

If you want to override this behavior you can use verbatim to set cabal-version manually, e.g.:

verbatim:
  cabal-version: 2.2

Defaults

Hpack allows the inclusion of common fields from a file on GitHub or a local file.

To use this feature a user must specify a GitHub repository, a Git reference, and a path to a file within that repository; alternatively, a path to a local file must be given.

Example:

defaults:
  github: sol/hpack-template
  ref: 2017
  path: defaults.yaml

This will include all common fields from https://github.com/sol/hpack-template/blob/2017/defaults.yaml into the package specification.

| Field | Default | Notes | Example |
| --- | --- | --- | --- |
| github | | For github defaults. Accepts <owner>/<repo> | github: sol/hpack-template |
| ref | | For github defaults. | ref: 2017 |
| path | .hpack/defaults.yaml | For github defaults. A relative path to a file within the repository; path segments are separated by / and must not contain : and \. | path: defaults.yaml |
| local | | For local defaults. New in 0.26.0. | |

Exactly one of github and local must be given in a defaults section.
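
The local variant has no example above. A minimal sketch, with a made-up file path, might look like this:

defaults:
  local: defaults/common.yaml    # hypothetical path to a local defaults file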

Hpack supports shorthand syntax for specifying github and ref as a string:

defaults: sol/hpack-template@2017

This is equivalent to:

defaults:
  github: sol/hpack-template
  ref: 2017

Note: Hpack caches downloaded files under ~/.hpack/defaults/<owner>/<repo>/<path>. Once downloaded, a file is reused from the cache. If the content on GitHub changes, the file is not updated. For this reason it is recommended to only use tags as Git references.

  • If a defaults file has changed on GitHub and you want to use the latest version, then you have to delete that file from the cache manually.

  • If you want to prevent Hpack from accessing the network to download a defaults file, then you can achieve this by adding that file to the cache manually.

Custom setup

| Hpack | Cabal | Default | Notes | Example |
| --- | --- | --- | --- | --- |
| dependencies | setup-depends | | Implies build-type: Custom | |
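
For example, a custom-setup section listing the dependencies of a custom Setup.hs might look like this; the exact dependency list is illustrative:

custom-setup:
  dependencies:
    - base
    - Cabal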

Common fields

These fields can be specified top-level or on a per-section basis; top-level values are merged with per-section values (see the example at the end of this section).

| Hpack | Cabal | Default | Notes |
| --- | --- | --- | --- |
| buildable | · | | Per section takes precedence over top-level |
| source-dirs | hs-source-dirs | | |
| default-extensions | · | | |
| language | default-language | Haskell2010 | Also accepts Haskell98 or GHC2021. Per section takes precedence over top-level |
| other-extensions | · | | |
| ghc-options | · | | |
| ghc-prof-options | · | | |
| ghc-shared-options | · | | |
| ghcjs-options | · | | |
| cpp-options | · | | |
| cc-options | · | | |
| c-sources | · | | Accepts glob patterns |
| cxx-options | · | | |
| cxx-sources | · | | Accepts glob patterns |
| js-sources | · | | Accepts glob patterns |
| extra-lib-dirs | · | | |
| extra-libraries | · | | |
| include-dirs | · | | |
| install-includes | · | | |
| frameworks | · | | |
| extra-frameworks-dirs | · | | |
| ld-options | · | | |
| dependencies | build-depends | | See Dependencies |
| pkg-config-dependencies | pkgconfig-depends | | |
| build-tools | build-tools and/or build-tool-depends | | |
| system-build-tools | build-tools | | A set of system executables that have to be on the PATH to build this component |
| when | | | Accepts a list of conditionals (see Conditionals) |

build-tools: A set of Haskell executables that are needed to build this component

Each element consists of a name and an optional version constraint.

The name can be specified in two ways:

  1. Qualified: <package>:<executable>
  2. Unqualified: <executable>

A qualified name refers to an executable named <executable> from a package named <package>.

An unqualified name either refers to an executable in the same package or, if no such executable exists, is desugared to <executable>:<executable>.

build-tools can be specified as a list or a mapping.

Examples:

build-tools:
  - alex
  - happy:happy
  - hspec-discover == 2.*
build-tools:
  alex: 3.2.*
  happy:happy: 1.19.*
  hspec-discover: 2.*

When generating a .cabal file, each element of build-tools is added to either build-tools or build-tool-depends.

If the name refers to one of alex, c2hs, cpphs, greencard, haddock, happy, hsc2hs or hscolour, the element is added to build-tools; otherwise it is added to build-tool-depends.

This is done to allow compatibility with a wider range of Cabal versions.

Note: Unlike Cabal, Hpack does not accept system executables as build-tools. Use system-build-tools if you need this.
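
To illustrate the merging of top-level and per-section values mentioned at the beginning of this section, here is a sketch with made-up values; the library is expected to end up with both sets of ghc-options in the generated .cabal file:

ghc-options: -Wall               # applies to every component

library:
  source-dirs: src
  ghc-options: -O2               # combined with the top-level -Wall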

Library fields

| Hpack | Cabal | Default | Notes |
| --- | --- | --- | --- |
| exposed | · | | |
| visibility | · | | |
| exposed-modules | · | All modules in source-dirs less other-modules less any modules mentioned in when | |
| generated-exposed-modules | | | Added to exposed-modules and autogen-modules. Since 0.23.0. |
| other-modules | · | Outside conditionals: All modules in source-dirs less exposed-modules less any modules mentioned in when. Inside conditionals, and only if exposed-modules is not specified inside the conditional: All modules in source-dirs of the conditional less any modules mentioned in when of the conditional | |
| generated-other-modules | | | Added to other-modules and autogen-modules. Since 0.23.0. |
| reexported-modules | · | | |
| signatures | · | | |
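
A sketch of a library section that overrides the inferred module lists; the module names are made up:

library:
  source-dirs: src
  exposed-modules:
    - Data.Example               # hypothetical module
  other-modules:
    - Data.Example.Internal      # hypothetical module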

Executable fields

| Hpack | Cabal | Default | Notes |
| --- | --- | --- | --- |
| main | main-is | | |
| other-modules | · | All modules in source-dirs less main less any modules mentioned in when | |
| generated-other-modules | | | Added to other-modules and autogen-modules. Since 0.23.0. |
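
A sketch of an executable section using these fields, including a generated Paths_ module as described above; the executable name is made up:

executables:
  example-exe:                   # hypothetical executable name
    main: Main.hs
    source-dirs: app
    generated-other-modules: Paths_example   # substitute example with the package name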

Test fields

| Hpack | Cabal | Default | Notes |
| --- | --- | --- | --- |
| type | | exitcode-stdio-1.0 | |
| main | main-is | | |
| other-modules | · | All modules in source-dirs less main less any modules mentioned in when | |
| generated-other-modules | | | Added to other-modules and autogen-modules. Since 0.23.0. |
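
A sketch of a test suite using these fields; the hspec dependency and the test directory are illustrative:

tests:
  spec:
    main: Spec.hs
    source-dirs: test
    dependencies:
      - hspec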

Benchmark fields

| Hpack | Cabal | Default | Notes |
| --- | --- | --- | --- |
| type | | exitcode-stdio-1.0 | |
| main | main-is | | |
| other-modules | · | All modules in source-dirs less main less any modules mentioned in when | |
| generated-other-modules | | | Added to other-modules and autogen-modules. Since 0.23.0. |

Flags

| Hpack | Cabal | Default | Notes |
| --- | --- | --- | --- |
| description | · | | Optional |
| manual | · | | Required (unlike Cabal) |
| default | · | | Required (unlike Cabal) |
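
Since manual and default are required, a flag definition might look like the following sketch; the flag name is made up and can be referenced in a conditional with flag(fast) (see Conditionals):

flags:
  fast:                          # hypothetical flag name
    description: Build with aggressive optimizations
    manual: true
    default: false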

Dependencies

Dependencies can be specified as either a list or an object. These are equivalent:

  dependencies:
    - base >= 4.10.1.0
    - containers >= 5.10
  dependencies:
    base: ">= 4.10.1.0"
    containers: ">= 5.10"

The individual dependencies can also be specified as an object:

  dependencies:
    - name: base
      version: ">= 4.10.1.0"
    - name: containers

You can use objects at both levels, or have a mix of valid ways to specify the individual dependencies:

  dependencies:
    base:
      version: ">= 4.10.1.0"
    # If you don't give a version, it defaults to 'any version'.
    containers: {}
    transformers: ">= 0.5.5.0 && < 5.6"

Individual dependencies as objects are only supported from version 0.31.0.

When a dependency is specified as an object, you can use the mixin field to control what modules from the dependency your program will see and how its signatures are filled in:

  dependencies:
    # This gives you a shorter name to import from, and hides the other modules.
    - name: containers
      mixin:
        - (Data.Map.Lazy as Map)
    # This hides the System.IO.Unsafe module, and leaves the other modules unchanged.
    - name: base
      mixin:
        - hiding (System.IO.Unsafe)
    # This exposes only the listed modules - you won't be able to import the others!
    - name: lens
      mixin:
        - (Control.Lens, Data.Set.Lens, Data.Map.Lens as MapL)
    # This will rename the module, and expose the others.
    - name: transformers
      mixin:
        - hiding (Control.Monad.Trans.State.Lazy)
        - (Control.Monad.Trans.State.Lazy as State)

For more information, see the Cabal documentation.

Hint: you can hide the Prelude module from base, and then rename an alternative prelude to Prelude so that it doesn't need to be imported!

mixin was added in version 0.31.0.

Conditionals

Conditionals with no else branch:

  • Must have a condition field
  • May have any number of other fields

For example,

when:
  - condition: os(darwin)
    extra-lib-dirs: lib/darwin

becomes

if os(darwin)
  extra-lib-dirs:
    lib/darwin

Conditionals with an else branch:

  • Must have a condition field
  • Must have a then field, itself an object containing any number of other fields
  • Must have an else field, itself an object containing any number of other fields

For example,

when:
  - condition: flag(fast)
    then:
      ghc-options: -O2
    else:
      ghc-options: -O0

becomes

if flag(fast)
  ghc-options: -O2
else
  ghc-options: -O0

Note: Conditionals with condition: false are omitted from the generated .cabal file.

File globbing

Anywhere you can specify a list of files, you can also use glob patterns. Glob patterns and ordinary file names can be freely mixed, e.g.:

extra-source-files:
  - static/*.js
  - static/site.css

Glob patterns are expanded according to the following rules:

  • ? and * are expanded according to POSIX (they match arbitrary characters, except for directory separators)
  • ** is expanded in a zsh-like fashion (matching across directory separators)
  • ?, * and ** do not match a . at the beginning of a file/directory
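
For instance, a recursive pattern using ** might look like this; the directory layout is made up:

data-files:
  - assets/**/*.png              # matches .png files in assets/ and any subdirectory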

Passing things to Cabal verbatim

(since hpack-0.24.0)

In cases where Hpack does not (yet!) support what you want to do, you can use the verbatim field to pass things to Cabal verbatim. It is recognized top-level, in sections, and in conditionals.

verbatim accepts an object or a string (or a list of objects and strings).

Disclaimer: The content of verbatim fields is merged into the generated .cabal file as a final step, after Hpack is done with most of its work. Before that final step Hpack does not look at any verbatim fields. Consequently, the content of a verbatim field does not affect any other fields that are populated by Hpack. As an example, if you use verbatim to override hs-source-dirs, the overridden information will not be used when Hpack infers exposed-modules or other-modules.

Objects

When an object is used:

  • field values can be strings, numbers, booleans, or null
  • existing .cabal fields can be overridden
  • existing .cabal fields can be removed by overriding with null
  • additional .cabal fields can be added

Example:

tests:
  spec:
    main: Spec.hs
    source-dirs: test
    verbatim:
      type: detailed-0.9     # change type from exitcode-stdio-1.0
      default-language: null # remove default-language

Strings

When a string is used:

  • it will be added verbatim, indented to match the indentation of the surrounding context.
  • all existing .cabal fields are left untouched

Example:

verbatim: |
  build-tool-depends:
      hspec-discover:hspec-discover == 2.*

Lists of objects and strings

You can combine objects and strings to gain more fine-grained control. For example, you can remove an existing field with an object and then re-add it with a string so that you have 100% control over the layout.

verbatim:
  - build-depends: null
  - |
    -- let's use Cabal 5.0 dependency syntax
    build-depends:
      hspec: [2-3[

Not repeating yourself

It is possible to use YAML anchors (&), aliases (*) and merge keys (<<) to define fields and reference them later.

executables:
  my-exe-1: &my-exe
    main: my-exe-1.hs
    dependencies: [base, my-lib]
    ghc-options: [-threaded]
  my-exe-2:
    <<: *my-exe
    main: my-exe-2.hs

Fields that start with an underscore are ignored by hpack, so they can be used to declare aliases:

_exe-ghc-options: &exe-ghc-options
  - -threaded
  - -rtsopts

executables:
  my-exe-1:
    ghc-options: *exe-ghc-options

It is also possible to use the !include directive:

# ...

tests:
  hlint: !include "../common/hlint.yaml"

hlint.yaml:

source-dirs: test
main: hlint.hs
dependencies: [base, hlint]

This can also be used to provide entire libraries of snippets:

_common/lib: !include "../common/lib.yaml"

name: example1
version: '0.1.0.0'
synopsis: Example
<<: *legal

<<: *defaults

library:
  source-dirs: src

tests:
  hlint: *test_hlint

lib.yaml:

- &legal
  maintainer: Some One <[email protected]>
  copyright: (c) 2017 Some One
  license: BSD3

- &defaults
  dependencies:
    - base
    - containers
  ghc-options:
    - -Wall
    - -Werror

- &test_hlint
  source-dirs: test
  main: hlint.hs
  dependencies: [hlint]

Vim integration

To run hpack automatically on modifications to package.yaml add the following to your ~/.vimrc:

autocmd BufWritePost package.yaml call Hpack()

function Hpack()
  let err = system('hpack ' . expand('%'))
  if v:shell_error
    echo err
  endif
endfunction

Stack support

Stack has built-in support for Hpack. If you are using Stack you can use package.yaml instead of a .cabal file. No additional steps are required.

Binaries for use on Travis CI

You can get binaries for use on Travis CI with:

curl -sSL https://github.com/sol/hpack/raw/master/get-hpack.sh | bash

(both Linux and OS X are supported)

hpack's Issues

support cc-options cabal field

hpack does not seem to support the cc-options field for passing options to the C compiler:

cueball:/vol/hosts/cueball/workspace/projects/unicode-transforms$ stack build
WARNING: Ignoring unknown field "cc-options" in library section
WARNING: Ignoring unknown field "cc-options" in library section

Support for "build-type: Custom"

Is it possible to specify build-type: Custom somehow? I tried adding an appropriate entry to my package.yaml file, but hpack complained about an unsupported entry.

c-sources marked as "unknown field" when hpack is built with aeson-0.9 in Stack

I haven't had a chance to get all the way to the root of this yet, but I'm logging this here so I don't forget it.

The copy of hpack that's built into stack exhibits a bug that doesn't occur with a vanilla hpack installation, namely displaying:

WARNING: Ignoring unknown field "c-sources" in library section

when building a library with a c-sources field.

I traced the difference to hpack defining

extra-deps:
- aeson-0.11.0.0

in its stack.yaml, which is not defined in Stack's stack.yaml. When I add that to Stack's stack.yaml the problem disappears.

/cc @mgsloan

Support toml file format

https://github.com/toml-lang/toml

Any interest in adding support for accepting either package.yaml or package.toml?

It's a little more verbose than yaml (strings require quotes) but some people may prefer it. Example package.toml:

name       = "hpack"
version    = "0.15.0"
synopsis   = "An alternative format for Haskell packages"
maintainer = "Simon Hengel <[email protected]>"
license    = "MIT"
github     = "sol/hpack"
category   = "Development"

ghc-options = "-Wall"

dependencies = [
  "base >= 4.7 && < 5",
  "base-compat >= 0.8",
  "deepseq",
  "directory",
  "filepath",
  "Glob",
  "htoml",
  "text",
  "containers",
  "unordered-containers",
  "yaml",
  "aeson >= 0.11",
]

[library]
source-dirs = "src"
exposed-modules = [
  "Hpack",
  "Hpack.Config",
  "Hpack.Run",
  "Hpack.Yaml",
]

[executables.hpack]
main = "Main.hs"
source-dirs = "driver"
dependencies = ["hpack"]

[tests.spec]
cpp-options = "-DTEST"
main        = "Spec.hs"
source-dirs = ["test", "src"]
dependencies = [
  "hspec == 2.*",
  "QuickCheck",
  "temporary",
  "mockery >= 0.3",
  "interpolate",
  "aeson-qq",
]

`ghc-options` don't end up in test-suite sections

From a package.yaml file like this:

ghc-options: -Wall -fno-warn-name-shadowing

dependencies:
  - base == 4.*
  - generics-sop
  - safe

library: {}

tests:
  spec:
    main: "Spec.hs"
    ghc-options: "-threaded -O0 -pgmL markdown-unlit"
    source-dirs:
      - test
      - examples

    dependencies:
      - getopt-generics
      - hspec
      - hspec-expectations
      - silently

I would expect the test-suite section of the generated cabal file to contain a ghc-options field with the value -Wall -fno-warn-name-shadowing -threaded -O0 -pgmL markdown-unlit but it has none.

Problems when a directory field contains a dot

When a field that takes a directory contains a dot, hpack's renderer will generate a line with nothing but a . in it. However, this is invalid in a cabal file, as a dot on a line of its own is meant to express an empty line (I think; I can't find the documentation, but the source that does it is here).

A simple example of the bug is here: https://gist.github.com/0cc7f8f8e2cbc9ee7bbf82876a822e28

In there, source-dirs: . generates an invalid cabal file. Though this specific case isn't super common, without specifying source-dirs: ., hpack won't include the other-modules section for the modules in the current directory (I'd consider that another ticket).


My idea for a fix would be to output:

field: .
  , other-value
  , other-value

Whenever . is a value.

Test fails on Mac OSX

  2) Hpack.Config.getModules, when source directory is './.', ignores Setup
       expected: ["Foo"]
        but got: ["Foo","Setup"]

This is because one dir in the path is /private/var/... and the other is /var/..., i.e. the src == dir check isn't enough. One would need to follow links:

$ ls -l /
...
lrwxr-xr-x@   1 root  wheel    11 Oct 23  2014 var -> private/var

data files are excluded if they do not exist

I have been using template haskell to create configuration files, and including those files in the data-files of a cabal package.

Hpack seems to check whether files exist before including them in the generated cabal file, which means I have to create stub files to get Hpack to work. This is annoying because I have to commit this stub to source control, and be sure to not commit generated files.

In the case where glob patterns are not used, I think the fix is as simple as not excluding a non-existent data file from the generated .cabal file.

I would like to be able to generate files of arbitrary name in template haskell, and have a glob pattern in the package.yaml file pick up each of those generated files, but I suspect this is impractical.

Command line interface is not particularly user-friendly

It would be helpful if hpack showed something when run from the command line, even if it's just Generating foo.cabal from package.yaml or a usage string. The current behavior is confusing:

% hpack
% hpack --help
% hpack help
%

Have stack use hpack as a library, and automatically add version bounds?

The overall idea is to have stack detect hpack files and automatically generate cabal files when the hpack files change. I'm not sure if everyone will be on board with the idea, but it seems like a good idea to me, let's discuss!

In commercialhaskell/stack#1568 , I lay out an approach to having multiple stack configurations inform wide version bounds for dependencies. One reason that I'm keen on hpack is that the stage where .cabal files are generated would be a great time to insert these version constraints.

Optionally allow else branch for conditionals

I'm tempted to implement else branches for conditionals, e.g. to accept something like the following in addition to the current syntax:

when:
  - condition: os(windows)
    then:
      dependencies: Win32
    else:
      dependencies: unix

Allow specifying license file

My license files typically have an extension. hpack only looks for a file called LICENSE with no extension. I tried to manually set the license-file, but that is an unknown field. I would like to be able to manually set the license file. It would be nice if license files with extensions could automatically be found.

needs documentation

Several major features (flags, conditionals) have been in master for months now, but to use them you have to reverse-engineer them from source code or issue discussions. hpack's lack of docs is holding it back.

How to add `exposed-modules` depending on a cabal flag?

I can't manage to create a package.yaml that allows to add exposed-modules depending on a cabal flag. This is roughly what I tried:

  when:
  - condition: flag(with-servant-aeson-specs)
    dependencies:
    - servant-aeson-specs ==0.2.* || ==0.3.* || ==0.4.*
    exposed-modules:
    - Servant.MatrixParam.AesonSpecs

But I get the following message from hpack:

WARNING: Ignoring unknown field "exposed-modules" in library section
servant-matrix-param.cabal is up-to-date

And the cabal file is missing the exposed-modules.

Is there a way to do this with hpack?

(Here's the relevant project, if that helps: https://github.com/plow-technologies/servant-matrix-param)

Compatibility with aeson-1.0.0.0

We need to:

  1. adapt hpack to allow the latest version of aeson
  2. add upper bounds for aeson to all previous versions of hpack on Hackage

Require package name?

While it is clever to infer the package name from the directory, it causes issues when hpack files are present in hackage tarballs: commercialhaskell/stack#2254

I haven't tested, but I believe this will even be an issue when you do stack unpack on the package, as the unpacked dir defaults to having the package name as well as the package version.

Automagically generate hpack file?

This is a bit of a hack idea, but I think it could work well:

  1. Based on source directories, it would discover haskell modules using the existing hpack logic. It'd also guess that the project name is the name of the current dir.

  2. Generate an hpack file with these fields filled in, but just a dependency on base.

  3. Iteratively run the compiler and parse out import resolution failures. Find the package in some Map ModuleName [PackageName], and ask the user when there's ambiguity.

The module name map could come from the stackage snapshot you've picked, or some other source like the latest hackage. For stack integration, the fanciest thing would be to fall back on looking it up in hackage, and add extra-deps as necessary.

It'd also be quite fancy if this could update an existing hpack file, perhaps based on errors the user encounters in their editor's ghci session

Warn on `extra-source-files` which don't exist

As in subject. We can do this fairly easily so I wonder if it makes sense for hpack to do so.

I think cabal itself warns later, so if hpack is aiming fairly strictly at just translating yaml -> cabal then it's not that important. OTOH hpack already warns on missing source dirs…

Specification of multiple packages?

Perhaps it makes sense to be able to specify multiple packages in one hpack file? Then, common dependencies and flags can be factored out.

The only downside I can see to this is that it makes it easier to inadvertently add unused dependencies. Ideally, these'd be detected (commercialhaskell/stack#39)

So, this'd mean you'd potentially end up with a packages.yaml next to your stack.yaml. Alternatively, I could even imagine stack supporting inline hpack in its packages list (related to #61). Not sure the complexity is worth it, or whether it'd have a positive effect on stack.yaml - just a thought!

hpack collapses newlines in the description field

Something like this in the hpack.yaml description field:

a
.
b

Gets collapsed to this in .cabal:
a . b

So everything becomes a single line. Is there something that I am missing, or is that how it works?

easier multi-line description formatting

I'd like to write a multi-line description in the yaml file, with long lines broken for readability. Ideally the line breaks would be preserved in the cabal file, for the same reason. In the yaml file I tried description: (complains if the text contains :), description: > and description: |. The last seemed to work best, but I think it is adding cabal's . lines for every line break; I think it should add these only for blank lines (paragraph separators).

I don't know if it's possible to preserve line breaks all the way through to the cabalish-haddock rendering on hackage - probably that's not desirable.

provide a command line option to specify output file

Complementing #106

My use case for this is that when running hpack from a script, it would be best to call it as:

hpack -i input.yaml -o output.cabal

Instead of having to cd to the directory where the input.yaml file is located, and having to “guess” the name of the output cabal file.

provide a command line option to specify input file

I am using stack and because of #104 I cannot keep a package.yaml. I want to be able to generate the cabal file from an arbitrary file say hpack.yaml. Otherwise I have to keep renaming to and from package.yaml to avoid #104. For such cases it will be useful to have a command line option to specify an input file such that:

hpack -f hpack.yaml

or read from stdin

hpack < hpack.yaml

Automatic generation of hpack from .cabal files?

Hpack looks really nice, great work! I think a good step towards adoption would be to support automatic conversion of cabal files to hpack format. This could be done in a new package, hpack-convert, or something, to avoid a dependency on Cabal in the main hpack library (maybe that dependency would be ok, though?).

Experiment with declaring a set of extensions?

There were some threads about introducing a -XHaskell2016 or -XGlasgow2016, or perhaps a new language.

https://www.reddit.com/r/haskell/comments/4fsuvu/can_we_have_xhaskell2016_which_turns_on_the_most/
https://www.reddit.com/r/haskell/comments/4ggp8z/summary_of_the_xhaskell2016_feedback/

hpack seems like a great place to experiment with this. What if something like this was added as a feature of hpack? To avoid confusion and make it clear it's an experiment, could have something like language: hpack2016.

IMHO, ideally we'd carve out something that can be the seed of a new standard, and so should only include extensions that we expect to make it into the standard. I think the following are good guidelines:

  • Widely used with few unexpected consequences. We can use the poll results for this
  • Avoid things that have a bunch of complexity, particularly if there are known unresolved loose ends. Avoid complex things that haven't been around for a long time and aren't fully proven.
  • Avoid extensions that can cause some code to not compile when enabled. I have accumulated a list of these in the stack ghci code, to inform the user when they are loading multiple packages that have potentially conflicting extensions: https://github.com/commercialhaskell/stack/blob/master/src/Stack/Ghci.hs#L408

Here are the most popular "safe" extensions:

LambdaCase (71)
GADTSyntax (70)
RankNTypes (70)
ScopedTypeVariables (69)
DeriveGeneric (66)
TupleSections (59)
BangPatterns (59)
MultiParamTypeClasses (56)
FlexibleInstances (33)
FlexibleContexts (31)
MultiWayIf (31)
TypeFamilies, TypeOperators (29)
DuplicateRecordFields (17)
FunctionalDependencies (15)
DisambiguateRecordFields (14)
MonadComprehensions (12)
BinaryLiterals (11)
RecursiveDo (10)
Other safe extensions (there was no feedback against having them on by default):
ParallelListComp (7)
PartialTypeSignatures (6)
TypeApplications (4)
RecordWildCards (4)
PatternSynonyms (4)
EmptyCase (3)
InstanceSigs (3)
KindSignatures (3)

I would exclude the following:

  • TypeApplications - too new
  • DisambiguateRecordFields + DuplicateRecordFields - too new
  • PatternSynonyms - too new
  • TypeFamilies - as much as I like them, they seem too new. At the same time, they are popular and effective.
  • KindSignatures - perhaps this is too stringent, but it seemed to me like we aren't really sure what the kind syntax will be (* vs Type). Has this been decided for sure?

Not sure:

  • MonadComprehensions - seems like it could cause type ambiguity for code copied from somewhere that doesn't expect it.
  • PartialTypeSignatures is new, but this is mostly a dev feature so it seems reasonable for its behavior to change a bit. Seems like something we will for sure want.
  • DeriveGeneric - I'm not sure we want to codify this just yet. This one is really popular, though
  • RankNTypes - complex to support
  • RecordWildCards - it's nice at times, but not sure we want to codify something that makes local scope resolution require datatype reification.

So, without the ones I want to exclude and the ones I'm not so sure about, we get:

LambdaCase, GADTSyntax, ScopedTypeVariables, TupleSections, BangPatterns, MultiParamTypeClasses,
FlexibleInstances, FlexibleContexts, MultiWayIf, TypeOperators, FunctionalDependencies,
BinaryLiterals, RecursiveDo, ParallelListComp, PartialTypeSignatures, EmptyCase, InstanceSigs

don't list Setup.hs as a module

Just trying out this tool.. loving it so far!

With source-dirs: ., Setup.hs is included in exposed-modules, which is undesirable.

Ordered input on tests, flags, executables etc.

Can we change

Maybe (HashMap String (CaptureUnknownFields (Section ExecutableSection)))

to

Maybe (Map String (CaptureUnknownFields (Section ExecutableSection)))

or alternatively traverse them in name order. From time to time the ordering of multiple tests changes. I guess I'm unlucky; I got a hash clash.

data-dir

Hpack does not seem to support the data-dir cabal option. Is there a reason for this? If not I would be happy to contribute a fix.

hpack-0.8.0 test suite failure

Citing from http://hydra.cryp.to/build/1306557/nixlog/2/raw:

Failures:

  test/Hpack/ConfigSpec.hs:64:
  1) Hpack.Config.parseJSON, when parsing a Dependency, when parsing fails, returns an error message
       expected: Left "Error in $: expected String or an Object, encountered Number"
        but got: Left "when expecting a String or an Object, encountered Number instead"

  test/Hpack/ConfigSpec.hs:69:
  2) Hpack.Config.parseJSON, when parsing a Dependency, when parsing fails, when ref is missing, produces accurate error messages
       expected: Left "Error in $: key \"ref\" not present"
        but got: Left "key \"ref\" not present"

  test/Hpack/ConfigSpec.hs:78:
  3) Hpack.Config.parseJSON, when parsing a Dependency, when parsing fails, when both git and github are missing, produces accurate error messages
       expected: Left "Error in $: neither key \"git\" nor key \"github\" present"
        but got: Left "neither key \"git\" nor key \"github\" present"

  test/Hpack/ConfigSpec.hs:643:
  4) Hpack.Config.readPackageConfig, when package.yaml is invalid, returns an error
       expected: Left "package.yaml: Error in $.executables.foo: failed to parse field executables: The key \"main\" was not found"
        but got: Left "package.yaml: The key \"main\" was not found"

  test/Hpack/UtilSpec.hs:71:
  5) Hpack.Util.List, when parsing single values, returns error messages from element parsing
       expected: Left "Error in $: neither key \"git\" nor key \"github\" present"
        but got: Left "neither key \"git\" nor key \"github\" present"

  test/Hpack/UtilSpec.hs:78:
  6) Hpack.Util.List, when parsing a list of values, propagates parse error messages of invalid elements
       expected: Left "Error in $[1]: neither key \"git\" nor key \"github\" present"
        but got: Left "neither key \"git\" nor key \"github\" present"

Confusing source-dirs

Some package.yaml files use e.g. driver, src as the value for source-dirs. This is not a valid yaml list value.

It still works most of the time because the field value gets inserted without quotes into the cabal file. For example: the package.yaml of this repo.

Should hpack warn about non-existing source directories?
