
Comments (20)

JamieLee0510 commented on May 24, 2024

Oh, the Babel usage is for dealing with an issue running the test cases, not for the compilation step.

fuegoio commented on May 24, 2024

Well, in this case let's just roll with tsup, I think, and try not to have Babel as a dependency. That's all I ask, ahah.

sublimator commented on May 24, 2024

Oh right, yeah! Babel is there to make ts-jest faster, eh?

You can probably get around to using swc for Jest in another PR :)
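
For reference, swapping Babel / ts-jest for swc in Jest usually comes down to the transform entry in the Jest config. A minimal sketch, assuming @swc/jest is added as a dev dependency (the option values are illustrative, not taken from client-js):

// jest.config.ts -- minimal sketch, assuming @swc/jest is installed;
// values here are illustrative, not from the client-js repo.
import type { Config } from 'jest';

const config: Config = {
  testEnvironment: 'node',
  transform: {
    // Let swc transpile TS/JS test files instead of Babel or ts-jest.
    '^.+\\.(t|j)sx?$': '@swc/jest',
  },
};

export default config;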

JamieLee0510 commented on May 24, 2024

@fuegoio
I have updated PR #73 by syncing the latest commits.
Could you kindly help check it?

sublimator commented on May 24, 2024

It seems the JSDoc comments are actually incorrect a lot of the time. For example, chatStream takes a single object parameter, but the comment describes multiple params:

/**
   * A chat endpoint that streams responses.
   * @param {*} model the name of the model to chat with, e.g. mistral-tiny
   * @param {*} messages an array of messages to chat with, e.g.
   * [{role: 'user', content: 'What is the best French cheese?'}]
   * @param {*} tools  a list of tools to use.
   * @param {*} temperature the temperature to use for sampling, e.g. 0.5
   * @param {*} maxTokens the maximum number of tokens to generate, e.g. 100
   * @param {*} topP the cumulative probability of tokens to generate, e.g. 0.9
   * @param {*} randomSeed the random seed to use for sampling, e.g. 42
   * @param {*} safeMode deprecated use safePrompt instead
   * @param {*} safePrompt whether to use safe mode, e.g. true
   * @param {*} toolChoice the tool to use, e.g. 'auto'
   * @param {*} responseFormat the format of the response, e.g. 'json_format'
   * @return {Promise<Object>}
   */
  chatStream = async function* ({
    model,
    messages,
    tools,
    temperature,
    maxTokens,
    topP,
    randomSeed,
    safeMode,
    safePrompt,
    toolChoice,
    responseFormat,
  })

This should be more like:

/**
 * A chat endpoint that streams responses using a generator function.
 * This method uses an object to pass named parameters.
 * 
 * @param {Object} config - The configuration object for the chat stream.
 * @param {string} config.model - The name of the model to chat with, e.g., 'mistral-tiny'.
 * @param {Array<{role: string, content: string}>} config.messages - An array of messages for the chat, each containing a 'role' and 'content'.
 * @param {Array<string>} config.tools - A list of tools to use during the chat session.
 * @param {number} config.temperature - The temperature to use for sampling, typically between 0 and 1, e.g., 0.5.
 * @param {number} config.maxTokens - The maximum number of tokens to generate, e.g., 100.
 * @param {number} config.topP - The cumulative probability of token selection, e.g., 0.9.
 * @param {number} config.randomSeed - The random seed to use for sampling, e.g., 42.
 * @param {boolean} config.safePrompt - Whether to use safe mode in prompts, e.g., true.
 * @param {string} config.toolChoice - The tool to use, specified by the user, e.g., 'auto'.
 * @param {string} config.responseFormat - The format of the response, e.g., 'json'.
 * @return {AsyncGenerator<Object>} - A generator that yields the chat response as an object.
 */
async function* chatStream({
  model,
  messages,
  tools,
  temperature,
  maxTokens,
  topP,
  randomSeed,
  safeMode, // If still needed, otherwise remove it.
  safePrompt,
  toolChoice,
  responseFormat,
}) {

If you were to choose to move to TypeScript, the parameters could be documented as part of the interface:

/**
 * Configuration options for the chat stream function.
 */
interface ChatParams {
  /** the name of the model to chat with, e.g. mistral-tiny */
  model: string;

  /** an array of messages to chat with, e.g.
   * [{role: 'user', content: 'What is the best French cheese?'}]
   */
  messages: Array<{ role: string, content: string }>;

  /** a list of tools to use. */
  tools: string[];

  /** the temperature to use for sampling, e.g. 0.5 */
  temperature: number;

  /** the maximum number of tokens to generate, e.g. 100 */
  maxTokens: number;

  /** the cumulative probability of tokens to generate, e.g. 0.9 */
  topP: number;

  /** the random seed to use for sampling, e.g. 42 */
  randomSeed: number;

  /** deprecated use safePrompt instead */
  safeMode?: boolean; // Including as optional, assuming it's still part of the configuration but deprecated.

  /** whether to use safe mode, e.g. true */
  safePrompt: boolean;

  /** the tool to use, e.g. 'auto' */
  toolChoice: string;

  /** the format of the response, e.g. 'json_format' */
  responseFormat: string;
}
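
With that interface in place, the streaming method itself could then be typed roughly like this (a sketch only; the MistralClient class name and the yielded chunk shape are assumptions, not taken from the current client-js source):

/** Sketch of a chunk yielded by the stream; the real shape may differ. */
interface ChatCompletionChunk {
  id: string;
  choices: Array<{ delta: { role?: string; content?: string } }>;
}

class MistralClient {
  /** A chat endpoint that streams responses, typed against ChatParams. */
  async *chatStream(params: ChatParams): AsyncGenerator<ChatCompletionChunk> {
    // Implementation elided; the point is that ChatParams documents every
    // named parameter, so the docs and the signature cannot drift apart.
  }
}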


fuegoio commented on May 24, 2024

I think I prefer to use TypeScript and have a compilation step. We could use JS + JSDoc like some other projects that ditched TS, but at this scale I don't think it's worth it, especially as the TS compilation time is going to be really quick.

We will need a rollup/esbuild step at some point anyway, I think.


JamieLee0510 commented on May 24, 2024

Actually, I just created a TypeScript version of this repo, mistral-ts.
Maybe it could help with contributing or refactoring?

fuegoio commented on May 24, 2024

Actually, I just created a TypeScript version of this repo, mistral-ts. Maybe it could help with contributing or refactoring?

Really nice, thanks a lot! I think @sublimator wanted to do the TS version, but we could start from your repo. I like the usage of tsup, but I would prefer to use swc instead of Babel for performance and long-term maintenance.

If you can create a PR to import your work, I think we can start from there, or we can just use it as inspiration!

JamieLee0510 commented on May 24, 2024

Actually, I just created a TypeScript version of this repo, mistral-ts. Maybe it could help with contributing or refactoring?

Really nice, thanks a lot! I think @sublimator wanted to do the TS version, but we could start from your repo. I like the usage of tsup, but I would prefer to use swc instead of Babel for performance and long-term maintenance.

If you can create a PR to import your work, I think we can start from there, or we can just use it as inspiration!

Love to do it~
After switching to swc, I will open a PR.

sublimator commented on May 24, 2024

tsup looks like a really nice way to build both CommonJS and ESM!

I see that mistral-ts includes a version of the code that has recently been updated, fixing the node-fetch logic. I believe that update fixed usage in window-less browser environments such as service workers, and added a _fetch hook point to the client, which is used in the tests.
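
(A rough sketch of that hook-point idea, not the exact client-js code; the FetchLike alias and constructor shape here are assumptions:)

// Sketch of the _fetch hook-point idea; names and shapes are illustrative.
type FetchLike = (input: string, init?: RequestInit) => Promise<Response>;

class MistralClient {
  private apiKey: string;
  private _fetch: FetchLike;

  constructor(apiKey: string, fetchImpl?: FetchLike) {
    this.apiKey = apiKey;
    // Fall back to the global fetch so the client works where there is no
    // window object (modern Node, service workers); tests can inject a stub.
    this._fetch = fetchImpl ?? ((input, init) => globalThis.fetch(input, init));
  }
}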

For me personally, my preference would be to merge #72 (which includes contributions from others, and is also set up to close other PRs/issues) and then move ahead from there, continuing in a straight line.

Even then, I think it would probably be quite easy to reuse a lot of the config and TypeScript code from /tests in mistral-ts, so if you opened a PR with that it would be very helpful, and I could include some of the commits (as was done with #72).

I'm happy to move ahead with it.

There was also PR #41 for TypeScript, but it seemed to languish. So really, whatever is easy for Mistral. In practice, that seems to be Mr @fuegoio at the moment.


JamieLee0510 commented on May 24, 2024

I have created PR #73 with the tsup version; I just encountered some issues while migrating to swc + rollup 🥲
I will keep working on it.

If there are any references that would help, I'd appreciate them a lot~

sublimator commented on May 24, 2024

@JamieLee0510

Great, I'll cherry-pick some commits from that once #72 is sorted out.

fuegoio commented on May 24, 2024

I have created PR #73 with the tsup version; I just encountered some issues while migrating to swc + rollup 🥲 I will keep working on it.

If there are any references that would help, I'd appreciate them a lot~

@JamieLee0510 You can take inspiration from the openai-node repository; they are using SWC, but not tsup unfortunately (they have a custom build script for CommonJS + ESM). Thanks a lot for your contribution :) Try to also base your work on the latest main updates!

Great, I'll cherry-pick some commits from that once #72 is sorted out.

@sublimator I'll merge #72 ASAP, as I think it goes in the right direction and can be the source of truth for the types. You could work on the TS conversion like @JamieLee0510, but that might end up being duplicate work; let's maybe wait a bit.

JamieLee0510 commented on May 24, 2024

@fuegoio no problem~
However, I had checked the openai-node repository while working on mistral-ts; maybe they are using tsc-multi instead of swc to compile?

Or is there something I'm missing? The file I read was [openai-node/scripts/utils/make-dist-package-json.cjs](https://github.com/openai/openai-node/blob/master/scripts/utils/postprocess-files.cjs).

sublimator commented on May 24, 2024

@fuegoio
That works for me! Thanks.
I'm happy the time spent on #72 was not in vain.

(Could have sworn I already sent a message like this by email?)

sublimator commented on May 24, 2024

Can tsup simply use TypeScript (i.e. no Babel or swc), at least for now?
It's a pretty tiny project, really.

sublimator commented on May 24, 2024

Oh, it uses esbuild by default, right? Why not just use the default? KISS?
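
For reference, the esbuild-only default is essentially zero config. A minimal tsup.config.ts sketch (the entry path and option values are assumptions, not taken from client-js):

// tsup.config.ts -- minimal sketch; entry path and options are illustrative.
import { defineConfig } from 'tsup';

export default defineConfig({
  entry: ['src/client.ts'],   // assumed entry point
  format: ['cjs', 'esm'],     // emit both CommonJS and ESM builds
  dts: true,                  // generate .d.ts type declarations
  sourcemap: true,
  clean: true,
  // No Babel or swc plugins: tsup compiles TypeScript with esbuild by default.
});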

fuegoio commented on May 24, 2024

Yes, that's fine by me actually! I thought of swc to replace Babel because I thought it was somewhere in the chain anyway. But using only what is necessary is the ideal, yes. @JamieLee0510 Why do you use Babel in your repository?

JamieLee0510 commented on May 24, 2024

Haha, I should have recognized this point sooner.
