
Ocean Marketplace



๐Ÿ„ Get Started

The app is a React app built with Next.js + TypeScript + CSS modules and will connect to Ocean remote components by default.

Prerequisites:

  • Node.js (required). Check the .nvmrc file to make sure you are using the correct version of Node.js.
  • nvm (recommended). This is the recommended way to manage Node.js versions.
  • Git is required to follow the instructions below.

To start local development:

git clone git@github.com:oceanprotocol/market.git
cd market

# when using nvm to manage Node.js versions
nvm use

npm install
# in case of dependency errors, rather use:
# npm install --legacy-peer-deps
npm start

This will start the development server under http://localhost:8000.

Local components with Barge

Using ocean-market with Barge components is recommended for advanced users only; if you are new, we advise you to use ocean-market with remote networks first. If you prefer to connect to locally running components instead of remote connections, you can spin up barge and use a local Ganache network in another terminal before running npm start. To fully test all The Graph integrations, you have to start barge with the local Graph node:

git clone git@github.com:oceanprotocol/barge.git
cd barge

# startup with local Ganache and Graph nodes
./start_ocean.sh --with-thegraph

Barge will deploy contracts to the local Ganache node, which will take some time. At the end, the compiled artifacts need to be imported into this project as environment variables. The set-barge-env script will do that for you, setting the env variables in the app's .env file to use this local connection. You also need to append Barge's Ganache chainId (8996) to the chainIdsSupported array in the app.config.js file.

If you are on macOS, you also need to change the Provider URL, since the default Barge IP cannot be reached due to network constraints on macOS. Use 127.0.0.1:8030 (or your custom Provider port, if you changed it) for every direct call from the market to the Provider, but keep the internal Barge URL http://172.15.0.4:8030/ (the default ip:port for the Provider in Barge; if changed, use the corresponding URL) everywhere else. Concretely, on macOS you can either set the NEXT_PUBLIC_PROVIDER_URL env variable or hardcode 127.0.0.1:8030 as providerUrl in:

  • src/@utils/provider.ts, in all methods that call ProviderInstance methods (e.g. getEncryptedFiles, getFileDidInfo, downloadFile)
  • src/@utils/nft.ts, inside setNFTMetadataAndTokenURI and setNftMetadata
  • src/components/Publish/index.tsx, inside the encrypt method

If you set the env variable, the per-file changes are not needed. You also need to use local IPs for the subgraph (127.0.0.1 instead of 172.15.0.15) and the metadata cache (127.0.0.1 instead of 172.15.0.5).
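On macOS these overrides can be collected in .env. Only NEXT_PUBLIC_PROVIDER_URL is named above; the other variable names below are assumptions, so check .env.example for the names your checkout actually uses and keep the original ports:

```shell
# .env -- macOS-only Barge overrides (sketch; verify names against .env.example)
NEXT_PUBLIC_PROVIDER_URL="http://127.0.0.1:8030"
# Hypothetical names: swap 172.15.0.x for 127.0.0.1, keep the ports unchanged
# NEXT_PUBLIC_SUBGRAPH_URI="http://127.0.0.1:<subgraph-port>"
# NEXT_PUBLIC_METADATACACHE_URI="http://127.0.0.1:<aquarius-port>"
```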

Once you want to switch back to using the market against remote networks, remove or comment out the env vars set by the set-barge-env script.

cd market
npm run set-barge-env
npm start

To use the app together with MetaMask, importing one of the accounts auto-generated by the Ganache container is the easiest way to have test ETH available. All of them have 100 ETH by default. Upon start, the ocean_ganache_1 container will print out the private keys of multiple accounts in its logs. Pick one of them and import it into MetaMask. Barge private key example: 0xc594c6e5def4bab63ac29eed19a134c130388f74f019bc74b8f4389df2837a58

Cleaning all Docker images so they are fetched freshly is often a good idea to make sure no issues are caused by old or stale images: docker system prune --all --volumes

🦑 Environment variables

The app.config.js file is set up to prioritize environment variables for setting each Ocean component endpoint. By setting environment variables, you can easily switch between the Ocean networks the app connects to, without directly modifying app.config.js.
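This prioritization is the usual env-first fallback idiom; a minimal sketch (the variable name and default URL are illustrative assumptions, not the actual contents of app.config.js):

```javascript
// Sketch of the env-first pattern used in app.config.js
// (names and default endpoint are illustrative, not the file's contents).
function endpoint(envValue, fallback) {
  // An environment variable, when set, wins over the hardcoded default.
  return envValue || fallback
}

const metadataCacheUri = endpoint(
  process.env.NEXT_PUBLIC_METADATACACHE_URI, // usually undefined locally
  'https://aquarius.example.com' // assumed default endpoint
)
```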

For local development, you can use a .env file:

# modify env variables, Goerli is enabled by default when using those files
cp .env.example .env

🦀 Data Sources

All displayed data in the app is presented around the concept of one asset, which is a combination of:

  • metadata about an asset
  • the actual asset file
  • the NFT which represents the asset
  • the datatokens representing access rights to the asset file
  • financial data connected to these datatokens, either a fixed rate exchange contract or a dispenser for free assets
  • calculations and conversions based on financial data
  • metadata about publisher accounts

All this data then comes from multiple sources:

Aquarius

All initial assets and their metadata (DDO) are retrieved client-side at run time from the Aquarius instance defined in app.config.js. All app calls to Aquarius are done with two internal methods which mimic the same methods in ocean.js, but allow us:

  • to cancel requests when components get unmounted, in combination with axios
  • to hit Aquarius as early as possible without relying on any ocean.js initialization

Aquarius runs Elasticsearch under the hood so its stored metadata can be queried with Elasticsearch queries like so:

import axios from 'axios'
import { useEffect, useState } from 'react'
import { QueryResult } from '@oceanprotocol/lib/dist/node/metadatacache/MetadataCache'
import { queryMetadata } from '@utils/aquarius'
import { useMarketMetadata } from '@context/MarketMetadata'

const queryLatest = {
  query: {
    // https://www.elastic.co/guide/en/elasticsearch/reference/current/query-dsl-query-string-query.html
    query_string: { query: `-isInPurgatory:true` }
  },
  sort: { created: 'desc' }
}

function Component() {
  const { appConfig } = useMarketMetadata()
  const [result, setResult] = useState<QueryResult>()

  useEffect(() => {
    if (!appConfig.metadataCacheUri) return
    const source = axios.CancelToken.source()

    async function init() {
      const result = await queryMetadata(queryLatest, source.token)
      setResult(result)
    }
    init()

    return () => {
      source.cancel()
    }
  }, [appConfig.metadataCacheUri])

  return <div>{result}</div>
}

For components within a single asset view the useAsset() hook can be used, which in the background gets the respective metadata from Aquarius.

import { useAsset } from '@context/Asset'

function Component() {
  const { ddo } = useAsset()
  return <div>{ddo}</div>
}

Ocean Protocol Subgraph

Most financial data in the market is retrieved with GraphQL from our own subgraph, rendered on top of the initial data coming from Aquarius.

The app has the Urql client set up to query the respective subgraph based on network. In any component this client can be used like so:

import { gql, useQuery } from 'urql'

const query = gql`
  query TopSalesQuery {
    users(first: 20, orderBy: totalSales, orderDirection: desc) {
      id
      totalSales
    }
  }
`

function Component() {
  const { data } = useQuery(query, {}, { pollInterval: 5000 })
  return <div>{data}</div>
}

ENS

Publishers can fill in their account's ENS domain profile and, when found, it will be displayed in the app.

For this, our own ens-proxy is used; within the app the utility method getEnsProfile() is called as part of the useProfile() hook:

import { useProfile } from '@context/Profile'

function Component() {
  const { profile } = useProfile()

  return (
    <div>
      {profile.avatar} {profile.name}
    </div>
  )
}

Purgatory

Based on list-purgatory some assets get additional data. Within most components this can be done with the internal useAsset() hook which fetches data from the market-purgatory endpoint in the background.

For asset purgatory:

import { useAsset } from '@context/Asset'

function Component() {
  const { isInPurgatory, purgatoryData } = useAsset()
  return isInPurgatory ? <div>{purgatoryData.reason}</div> : null
}

For account purgatory:

import { useAccount } from 'wagmi'
import { useAccountPurgatory } from '@hooks/useAccountPurgatory'

function Component() {
  const { address } = useAccount()
  const { isInPurgatory, purgatoryData } = useAccountPurgatory(address)
  return isInPurgatory ? <div>{purgatoryData.reason}</div> : null
}

Network Metadata

All displayed chain & network metadata is retrieved from https://chainid.network at build time and integrated into NEXT's GraphQL layer. This data source is a community-maintained GitHub repository under ethereum-lists/chains.

Within components this metadata can be queried for under allNetworksMetadataJson. The useNetworkMetadata() hook does this in the background to expose the final networkDisplayName for use in components:

import { ReactElement } from 'react'
import useNetworkMetadata from '@hooks/useNetworkMetadata'

export default function NetworkName(): ReactElement {
  const { isTestnet, networkName } = useNetworkMetadata()

  return (
    <>
      {networkName} {isTestnet && `(Test)`}
    </>
  )
}

๐Ÿ‘ฉโ€๐ŸŽค Storybook

Storybook helps us build UI components in isolation from our app's business logic, data, and context. That makes it easy to develop hard-to-reach states and save these UI states as stories to revisit during development, testing, or QA.

To start adding stories, create an index.stories.tsx inside the component's folder:

src
└─── components
│   └─── @shared
│       └─── 
│            │   index.tsx
│            │   index.module.css
│            │   index.stories.tsx
│            │   index.test.tsx

Starting up the Storybook server with this command will make it accessible under http://localhost:6006:

npm run storybook

If you want to build a portable static version under storybook-static/:

npm run storybook:build

🤖 Testing

Test runs utilize Jest as the test runner and Testing Library for writing tests.

All created Storybook stories will automatically run as individual tests by using the StoryShots Addon.

Creating Storybook stories for a component will provide good coverage of a component in many cases. Additional tests for dedicated component functionality which can't be done with Storybook are created as usual Testing Library tests, but you can also import existing Storybook stories into those tests.

Executing linting, type checking, and full test run:

npm test

This is a combination of multiple scripts, which can also be run individually:

npm run lint
npm run type-check
npm run jest

A coverage report is automatically shown in the console whenever npm run jest is called. Generated reports are sent to CodeClimate during CI runs.

During local development you can continuously get coverage report feedback in your console by running Jest in watch mode:

npm run jest:watch

✨ Code Style

Code style is automatically enforced through ESLint & Prettier rules:

  • Git pre-commit hook runs prettier on staged files, set up with Husky
  • VS Code suggested extensions and settings for auto-formatting on file save
  • CI runs a linting & TypeScript typings check as part of npm test, and fails if errors are found

For running linting and auto-formatting manually, run from the root of the project:

# linting check
npm run lint

# auto format all files in the project with prettier, taking all configs into account
npm run format

🛳 Production

To create a production build, run from the root of the project:

npm run build

# serve production build
npm run serve

โฌ†๏ธ Deployment

Every branch or Pull Request is automatically deployed to multiple hosts for redundancy and emergency reasons:

A link to a preview deployment will appear under each Pull Request.

The latest deployment of the main branch is automatically aliased to market.oceanprotocol.com, where the deployment on Netlify is the current live deployment.

💖 Contributing

We welcome contributions in the form of bug reports, feature requests, code changes, or documentation improvements. Have a look at our contribution documentation for instructions and workflows:

๐Ÿด Forking

We encourage you to fork this repository and create your own data marketplace. When you publish your forked version of this market there are a few elements that you are required to change for copyright reasons:

  • The typeface is copyright protected and needs to be changed unless you purchase a license for it.
  • The Ocean Protocol logo is a trademark of the Ocean Protocol Foundation and must be removed from forked versions of the market.
  • The name "Ocean Market" is also copyright protected and should be changed to the name of your market.

Additionally, we advise that you retain the text saying "Powered by Ocean Protocol" on your forked version of the marketplace, in order to give credit for the development work done by the Ocean Protocol team.

Everything else is made open according to the Apache 2.0 license. We look forward to seeing your data marketplace!

If you are looking to fork Ocean Market and create your own marketplace, you will find the following guides useful in our docs:

💰 Pricing Options

Fixed Pricing

To allow publishers to set pricing as "Fixed", add the following environment variable to your .env file: NEXT_PUBLIC_ALLOW_FIXED_PRICING="true" (default).

Free Pricing

To allow publishers to set pricing as "Free", add the following environment variable to your .env file: NEXT_PUBLIC_ALLOW_FREE_PRICING="true" (default).

This allocates the datatokens to the dispenser contract, which dispenses datatokens to users for free. Publishers in your market will then be able to offer their assets to users for free (excluding gas costs).
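Both flags can live side by side in your .env file; the variable names are the ones stated above, and both default to "true":

```shell
# .env -- pricing options (both enabled; "true" is the default for each)
NEXT_PUBLIC_ALLOW_FIXED_PRICING="true"
NEXT_PUBLIC_ALLOW_FREE_PRICING="true"
```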

✅ GDPR Compliance

Ocean Market comes with prebuilt components for you to customize to cover GDPR requirements. Find additional information on how to use them below.

Multi-Language Privacy Policies

Feel free to adapt our provided privacy policies to your needs. By default we cover four languages: English, German, Spanish, and French. Please be advised that you will need to adjust some paragraphs in the policies depending on your market setup (e.g. the use of cookies). You can easily add or remove policies by providing your own markdown files in the content/pages/privacy directory. For guidelines on how to format your markdown files, refer to our provided policies. The pre-linked content tables for these files are generated automatically.

Privacy Preference Center

Additionally, Ocean Market provides a privacy preference center for you to use. This feature is disabled by default, since we do not use cookies requiring consent on our deployment of the market. However, if you need to add functionality that depends on cookies, you can enable this feature by setting the NEXT_PUBLIC_PRIVACY_PREFERENCE_CENTER environment variable to "true" in your .env file. This enables a customizable cookie banner stating the use of your individual cookies. The content of this banner can be adjusted within the content/gdpr.json file.

If no optionalCookies are provided, the privacy preference center falls back to a simpler version displaying only the title, text, and close button. This can be used to inform the user about the use of essential cookies, where no consent is needed.

The privacy preference center supports two different styling options: 'small' and 'default'. Setting the style property to 'small' will initially display a smaller cookie banner to the user, only showing the default-styled privacy preference center upon the user's customization request.
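The exact schema is defined by content/gdpr.json in your checkout; purely for illustration, the pieces mentioned above might be arranged like this (all field names besides optionalCookies are assumptions):

```json
{
  "title": "Privacy Preferences",
  "text": "We use cookies ...",
  "optionalCookies": [
    {
      "title": "Analytics",
      "desc": "Helps us improve the market.",
      "cookieName": "YOUR_ANALYTICS_COOKIE"
    }
  ]
}
```

Compare against the shipped content/gdpr.json before relying on any of these keys.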

Now your market users will be provided with additional options to toggle the use of your configured cookie consent categories. You can always retrieve the current consent status per category with the provided useConsent() hook. See below how you can set your own custom cookies depending on the market user's consent. Feel free to adjust the utility functions for cookie usage provided in src/utils/cookies.ts to your needs.

import { CookieConsentStatus, useConsent } from '@context/CookieConsent'
import { deleteCookie, setCookie } from '@utils/cookies'

// ...

const { cookies, cookieConsentStatus } = useConsent()

cookies.map((cookie) => {
  const consent = cookieConsentStatus[cookie.cookieName]

  switch (consent) {
    case CookieConsentStatus.APPROVED:
      // example logic
      setCookie(`YOUR_COOKIE_NAME`, 'VALUE')
      break

    case CookieConsentStatus.REJECTED:
    case CookieConsentStatus.NOT_AVAILABLE:
    default:
      // example logic
      deleteCookie(`YOUR_COOKIE_NAME`)
      break
  }
})

Privacy Preference Center Styling

The privacy preference center has two styling options: default and small. The default view shows all of the customization options in a full-height side banner. When the small setting is used, a much smaller banner is shown, which only reveals all of the customization options when the user clicks "Customize".

The style can be changed by altering the style prop in the PrivacyPreferenceCenter component in src/components/App.tsx. For example:

<PrivacyPreferenceCenter style="small" />

๐Ÿ› License

Copyright 2023 Ocean Protocol Foundation Ltd.

Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at

   http://www.apache.org/licenses/LICENSE-2.0

Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.

market's People

Contributors

abrom8, alexcos20, andreaarmanni, bogdanfazakas, brucepon, claudiahash, corrie-sloot, dependabot[bot], dimitarsd, enzovezzaro, innopreneur, jamiehewitt15, katunanorbert, kremalicious, krisliew, loznianuanamaria, lucamilanese90, lvl99wzrd, mariacarmina, maxieprotocol, mihaisc, moritzkirstein, omahs, soonhuat, tom1145, trentmc


market's Issues

minimum viable search

Quick wins with current technology, where we use Aquarius nativeSearch in the /search route. This also means all sorting & filtering actions fire a server-side search against Aquarius:

Sorting

  • by default, sort results by date created, newest first
  • sort results by liquidity, from highest to lowest (ddo.price.ocean)
  • sort results by price, from highest to lowest (ddo.price.value)
  • needs UI pattern for sorting actions

Filtering

  • filter to show only pools/only exchanges
  • needs UI pattern for filter actions

Drag & drop into decentralized storage

The goal should be to provide publishers with an easy solution to drag and drop files during the publish flow, with the files added to a decentralized storage network in the background.

This could start as providing an IPFS upload for sample files, since they can be public.

Storing actual data set files needs some more advanced solution to preserve data privacy.

"License" needs a commercial option as default

Ocean marketplaces, including Ocean Market, are meant for data that's for sale.

In Ocean Market, there's a dropdown for "License". Currently all the options are for free / open data. But this is a marketplace for commercial data. Therefore we need an appropriate license. And, it should be the default.

We should be able to take the terms developed for ascribe, and use them here.

I also question whether we want to offer any of the free / open licenses (Creative Commons). They're for open documents. Historically we had Ocean Commons, which was for open data, but it was really a demonstrator, with no real USP. If people want free / open data then they can simply use Web protocols directly, no need for extra Ocean machinery in between.

completely implement ocean-mock

ocean-mock has a minimal implementation so that tests run, but stories don't reflect the actual scenarios. This mostly impacts compute and job related components.

-- kremalicious
Good point. Because of the way Jest works, we should rather change the scope to mocking squid. We have to create tests/mocks/@oceanprotocol/squid.ts, and this will automatically be picked up by all the tests; with ocean-mock we can use it in the stories too.

user preferences for fiat currency

  1. global store for user preferences, connected to localStorage
  2. User preferences UI somewhere, with currency selection as first option
  3. Modify price conversion fetches to include more currencies, and pick the one based on user preference
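A minimal sketch of step 1, a localStorage-backed preference store. The key, default, and shape are assumptions, not the actual implementation; the storage object is injectable so the logic can run outside the browser:

```javascript
// Sketch: user-preferences store persisted to an injectable localStorage-like
// object. Hypothetical key and shape -- not the market's actual code.
const STORAGE_KEY = 'ocean-market-preferences'

function loadPreferences(storage) {
  const raw = storage.getItem(STORAGE_KEY)
  return raw ? JSON.parse(raw) : { currency: 'EUR' } // assumed default
}

function setCurrency(storage, currency) {
  const prefs = { ...loadPreferences(storage), currency }
  storage.setItem(STORAGE_KEY, JSON.stringify(prefs))
  return prefs
}
```

In the browser, window.localStorage would be passed as storage; the injectable parameter mainly keeps the store testable.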

negative prices

Only happens when multiple prices are fetched at the same time. Clicking on an asset with a negative price will then show the correct price on the asset details page.


Price without connected wallet

I've been thinking about this, and the conclusion is that it's not possible in the current state of the app. The problem is that you need a web3 connection and an ocean instance, and to get a price you need to know the desired network.

My suggestion is that for now we keep the constraint that you need to connect a wallet to see prices. When we go to mainnet, we can make a default dummy wallet that has the mainnet config hardcoded.

I added the blocked label because we don't have a proper mainnet config.

Publish: Include summary pop up with data asset details after Submit

-- kremalicious

This sounds like having a confirmation dialogue with a summary before we start the publish process? Because after publish we redirect to the actual asset page.

This would imply a slightly different UI flow for the whole publishing process, which would then happen in that confirmation popup, including the wallet connection. It would actually make more sense to block the UI during publish with a popup; I think we require people to stay on the page during the publish process.


I think a step/wizard approach would be better than a popup.

Tags autocomplete

The UI draft implies some sort of autocomplete for tags during the publishing flow. This is very useful for unifying tag names so we don't end up with differently cased variations of the same tag.

This needs multiple things to work:

  1. UI: a tag within the input itself is rendered as a small, dismissible Material UI Chip
  2. UI: the Tags field needs an autocomplete UI where users start typing a tag and a dropdown appears showing all available tags based on user input.
  3. Aquarius: to make the UI work, we need a list of all tags in the system which ideally would come from Aquarius, like as described in oceanprotocol/aquarius#223

Another workaround for the consolidated metadata could be to construct and maintain it on the frontend, with a provider that constantly queries all assets in Aquarius and returns the consolidated metadata based on that. This should only be a last resort though; the client RAM usage would probably go through the roof.
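The casing-unification part of the idea can be sketched as a simple normalizer (illustrative only; the issue leaves the real aggregation to Aquarius):

```javascript
// Sketch: unify differently-cased tag variants into one canonical list.
// Keeps the first-seen casing per lowercase key; purely illustrative.
function consolidateTags(tags) {
  const seen = new Map()
  for (const tag of tags) {
    const key = tag.trim().toLowerCase()
    if (key && !seen.has(key)) seen.set(key, tag.trim())
  }
  return [...seen.values()]
}
```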

This is a copy-paste from Daimler; we will probably not be using Material UI.

wallet - disconnect button

Should we have a disconnect button? If the user wants to disconnect, he can do so from MetaMask, but there are several clicks involved.

Improve hint about why Publish Submit is disabled when Wallet is not activated

As stated in this comment from @maxieprotocol :

Visually, there is no clear correlation between form submit and the activate wallet section.

Perhaps we shouldn't keep it disabled, but rather provide feedback to the user that they must activate the wallet in order to submit the form, the same way as we do with the rest of the mandatory fields, by implementing the snackbar warning for wallet activation (that would be the faster fix). Later, wallet activation would be part of a wizard step.


Not sure if this is still an issue, but we need to take this into consideration in the new design for the market.

fallback for pool fetching

Just like price display, fetching pools should have a fallback for non-web3 browsers and disconnected wallets.

Consumed whitelisted assets don't show up in Transactions/Downloaded table

unjapones commented 7 days ago
Downloadable assets that have been whitelisted and downloaded for a given account do not show up as an entry on the Transactions page, in the Downloaded table.

kremalicious commented 7 days ago
We get consumed assets from chain, and that requires an existing agreement for every asset consumption.

So we either:

  1. actually create an agreement with 0 price when whitelisted users consume an asset
  2. track downloads in a database for each account

Given that 2. is rather big to set up, we might want to look into 1. again.

switch form library

This is a proposal to kick out react-jsonschema-form if we decide to continue to polish the POC after the final deadline. Going forward this should provide a more future-proof way of solving all those remaining and newly found issues with current publish form.

Background
Almost everything we want to do (with only one form so far) requires about a week of research and a week of implementation for minimal changes. At that speed it would take weeks to get the publish form into a user-friendly state. As of right now, nothing fits together visually, with countless interaction and usability problems. Live validation is not working. Quickly creating a new form that works out of the box is not possible. Etc.

This is all because react-jsonschema-form tries to solve too many things at once without a clear separation of concerns. It has its own vocabulary and system (Field, Widget, Template, Object Field Template, Schema, UI Schema, what?) and uses its own custom UI system and markup, which we constantly try to overwrite, not just in this app. This is also reflected in the library size: almost 100KB gzipped(!), and we only add to that with more and more customizations on top. It is simply too inflexible and has required a tremendous amount of tweaks to make it work in our projects so far.

This opens up too many unknowns e.g. when dealing with a new project too. What if we want to use another UI framework? Then we would start the research again from the beginning to somehow fiddle together the UI.

Proposal
I think a form library should only deal with handling form data, capturing, validating, saving. That's it, only some minimal API to do just that. Pure functionality, no opinionated UI or markup.

We will always put in the UI, we never want pre-styled forms we then have to figure out how to overwrite. If data management and UI are separated, we reduce the risk of ending up in dead ends where something we need from a form can't be done because library xyz does not support it.

Formik comes to mind, which has a really minimal API, and can be connected with Material UI too. All with 7KB gzipped.

We could then create our own Input/Form component on top of it, which opens up the possibility of using any UI in any app in a clean way, without overwriting styles. And we have full control over markup and components. We could still use a JSON file to create a form with that.

Any other suggestions?

wallet UI glitches

Some weird visuals in wallet popover when not connected to correct network:

  • there never should be NaN
  • spacing above network warning is too much


As for the balance, this is tricky. Technically it should not matter which network I'm connected to, because I can have a balance on each of them. The wallet UI should actually show those balances, based on the network selected in MetaMask.

But we use ocean to get the balances. If we only used web3 functionality on the frontend, this would actually work.

People can upload data easily (and we don't store it)

Summary
Provide publishers with an easy solution to drag & drop files during the publish flow, with the files added to decentralized storage in the background, but where we don't store them or pay for storage.

Motivation
Status quo: publishers must provide a URL for their data, which takes technical savvy. Removing this friction will help adoption.

In Ocean Commons, we had a flow where people could "upload to IPFS" for a smoother experience. They would drag and drop the file, and it would get uploaded. However, since IPFS does not do storage, we were actually doing the storage on Pacific nodes. This is not a good idea, because we then inadvertently start hosting a ton of files for free.

Target outcome

  • have a GUI / UX similar to what we had in Ocean Commons
  • but store somewhere that we don't have to pay for.

Where to store

  • Recommended: Arweave - pay once, store forever
  • (Backup: Filecoin - pay once, store for a long time)
  • IPFS could still be used as the intermediary ("IPFS pinning"). Data would be encrypted.

Add custom provider endpoint during publish

In Ocean Market, OPF has to run Provider and therefore holds the data service decryption key. This means that OPF has custody of these keys.

What to do for this issue:

  • In Ocean Market, give an option for the user to provide an endpoint of the Provider. Therefore the user can run their own Provider, and retain decentralization. We can expect enterprises and others to do this. Handled in #702
  • In Ocean Market, if the user chooses to use the OPF-run Provider, have a GUI affordance to set the expectation that this Provider will only be run until some specific cut-off date: Dec 31, 2021. This ensures that OPF is not bound to some weird long-term commitment that it didn't mean to make. Handled in #703
  • Allow external provider in ocean-lib-js Issue
  • Allow external provider in ocean-lib-py Issue

Note: in the future we will want more decentralized storage of private keys. See related tech spike.

Note: moved from this issue

Review T&C

We need to review the Terms & Conditions; they have been copy-pasted all along.

[EPIC] that price thing

Figure out the best UI to get the values needed for publishing:

Battle plan

  • new UI for fixed price
  • new UI for dynamic price
  • new price form widget
  • dynamic price: collect and pass correct values to publish method
  • output price on asset details page
  • output price in asset teasers
  • fixed price: wait for new fixed contract, and addition to lib-js
  • fixed price: collect and pass correct values to publish method
  • visual distinction between fixed/dynamic price on Price output
  • output price in asset preview during publish
  • fix price display when user is not connected to correct network (should get fixed with #51)
  • figure out UI for liquidity/pool infos/providing liquidity/removing liquidity
  • output all possible fees during asset creation

User Stories

  1. As a publisher I want to receive 10 OCEAN when someone downloads my data set.
  2. As a publisher I want to receive 10 OCEAN when someone runs a compute job on my data set.
    ...

Ideas

  • Should we have a distinction between Use (Download/Compute) & Trade, like we prepared for asset details already?
  • create data token for each data set?
  • have user decide on data token ticker name?

Web3Feedback: add loading state

Upon activating the wallet, users immediately see the message "Not Connected to Pacific", even though they are connected to Pacific.

This message will stay there for as long as it takes for the app to connect to the Ocean components. While technically correct, it is giving users the impression they have to do something. We should add a loading state for the time between activating wallet and showing the network connection state.
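
One way to model the fix is a three-state status instead of the current connected/not-connected boolean. A minimal sketch, with illustrative names that are not the actual Web3Feedback implementation:

```typescript
// Sketch: explicit 'connecting' state shown between wallet activation and
// the network check, so users never see a premature "Not connected" message.
type FeedbackState = 'connecting' | 'connected' | 'wrong-network'

export function feedbackMessage(state: FeedbackState, network: string): string {
  switch (state) {
    case 'connecting':
      // New loading state: wallet activated, Ocean components not ready yet.
      return 'Connecting...'
    case 'connected':
      return `Connected to ${network}`
    case 'wrong-network':
      return `Not connected to ${network}`
  }
}
```

The component would render a spinner for `'connecting'` and only fall back to the warning once the connection attempt has actually resolved.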

publish form: add entry for date created

Port over the date picker for allowing publishers to set dateCreated. This could follow the pattern we already had where publishers can choose a specific date or a date range for when the data was collected.

When going with date range, the output of dateCreated needs to be updated too.
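
A sketch of the range-to-`dateCreated` mapping, assuming a range resolves to its start date (the actual mapping is still up for discussion) and that DDO timestamps are ISO 8601:

```typescript
// Sketch: derive dateCreated from the picker value. A single date passes
// through; a [start, end] range maps to its start date (assumption).
export function toDateCreated(value: string | [string, string]): string {
  const date = Array.isArray(value) ? value[0] : value
  // DDO-style ISO 8601 timestamp, e.g. '2012-10-10T00:00:00Z'
  return new Date(date).toISOString().replace(/\.\d{3}Z$/, 'Z')
}
```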

reset form action does not clear localStorage correctly

Clicking RESET FORM correctly resets the Formik state for the form, but formik-persist seems to fail clearing the localStorage entries properly, resulting in the form values being restored after browsing away and back to the publish page.
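
A possible workaround is to clear the persisted entry explicitly in the reset handler instead of relying on formik-persist. The storage key and function name below are assumptions for illustration:

```typescript
// Sketch: explicitly remove the persisted form state on RESET FORM, so stale
// values cannot be restored on the next visit. Takes a Storage-like object
// (window.localStorage in the app) to stay testable outside the browser.
interface StorageLike {
  removeItem(key: string): void
}

export function resetStoredForm(
  storage: StorageLike,
  storageKey = 'ocean-publish-form' // assumed key used by formik-persist
): void {
  storage.removeItem(storageKey)
}
```

In the reset handler this would run alongside Formik's `resetForm()`, passing `window.localStorage` as the storage argument.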

wallet UI long balance

If a balance is very long, the wallet component does not resize properly. We can do one of the following:

a. Shorten the balance, although from previous discussions the current precision is fine, matching what exchanges display.
b. Make the wallet width dynamic.
c. Make the wallet wide enough that even the longest value fits.
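
If option (a) were chosen, a minimal sketch of balance abbreviation; the thresholds and suffixes are illustrative, not a settled design:

```typescript
// Sketch: abbreviate long balances so the wallet button keeps a stable width.
// 1,234,567 -> '1.23M', 1,500 -> '1.50K', small values stay at 2 decimals.
export function abbreviateBalance(balance: number): string {
  if (balance >= 1e6) return `${(balance / 1e6).toFixed(2)}M`
  if (balance >= 1e3) return `${(balance / 1e3).toFixed(2)}K`
  return balance.toFixed(2)
}
```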


Pool default weight and LP fee

Currently the default (hardcoded) values for weight and fee are 9 and 0.03. We discussed the 90% weight, but we never discussed the fee for liquidity providers (LPs) to the pool. The 0.03 was set arbitrarily, and it is unclear what percentage it corresponds to.

Currency value display

We use https://github.com/coingecko/cryptoformat for formatting our values, but it behaves oddly in some places:

  • numbers below 1 get a lot of decimals, regardless of the currency
  • a whole number like 1 is displayed with lots of trailing zero decimals

(Screenshots from 2020-09-10 illustrating both cases.)

Requirements

We can either adapt the current usage to follow these rules, or switch to another library.

  • For fiat: never more than 2 decimals
  • For crypto: never more than 5 decimals
  • 1.00 needs to be displayed as 1
  • 0.100000 needs to be displayed as 0.1

Fixes #53
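
The four requirements above can be sketched as a small formatter (a possible replacement or wrapper, not the cryptoformat API): cap decimals at 2 for fiat and 5 for crypto, then strip trailing zeros so `1.00` becomes `1` and `0.100000` becomes `0.1`.

```typescript
// Sketch implementing the requirements list: max 2 decimals for fiat,
// max 5 for crypto, trailing zeros (and a dangling '.') stripped.
export function formatValue(value: number, isFiat: boolean): string {
  const maxDecimals = isFiat ? 2 : 5
  const fixed = value.toFixed(maxDecimals)
  // '1.00' -> '1', '0.10000' -> '0.1'; guard keeps '0' for zero values.
  return fixed.replace(/\.?0+$/, '') || '0'
}
```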
