
hoof's People

Contributors

dax911, dependabot[bot], sammcj, simoncollins


hoof's Issues

Current features

I am on Windows, running Ollama on WSL. I am able to run the app, but there isn't very much that I can do in the app, and I would like to know how much of that is because I am on Windows, and how much of it is because the features haven't been implemented yet.

  • When I run pnpm tauri dev in the repo (with the Ollama server running in the background):
    • The window opens (good) (I can also access the app at localhost:1420)
    • The dropdown listing the models is empty (bad)
  • When I type something in the text box and click "Ask":

Are any of these actual problems? Or have they just not been implemented yet?
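One way to narrow down the empty model dropdown is to query Ollama's model-list endpoint directly and see whether the app or the environment is at fault. A sketch in TypeScript, assuming Ollama's default port 11434 and its documented `GET /api/tags` endpoint; the helper names are mine, not hoof's:

```typescript
// Shape of the response from Ollama's GET /api/tags endpoint.
interface OllamaTagsResponse {
  models: { name: string }[];
}

// Pure helper: pull the model names out of a tags payload.
export function extractModelNames(payload: OllamaTagsResponse): string[] {
  return payload.models.map((m) => m.name);
}

// Fetch the installed models; if this returns [], the dropdown will be
// empty no matter what the UI does.
export async function listModels(
  baseUrl = "http://localhost:11434"
): Promise<string[]> {
  const res = await fetch(`${baseUrl}/api/tags`);
  if (!res.ok) throw new Error(`Ollama returned HTTP ${res.status}`);
  return extractModelNames((await res.json()) as OllamaTagsResponse);
}
```

If `curl http://localhost:11434/api/tags` returns models from the Windows side but the dropdown is still empty, the problem is likely in the app rather than in WSL networking.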

UI feedback when processing

When input has been submitted and the app is waiting on Ollama, the app should give the user a UI hint that it's loading / processing.

Perhaps even an event log stream?
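One way to model both the loading hint and an event log is a small reducer the UI can subscribe to. A hedged sketch; the type and field names below are illustrative, not part of hoof:

```typescript
// Events the request lifecycle can emit.
export type AppEvent =
  | { kind: "request_sent"; at: number }
  | { kind: "token"; text: string }
  | { kind: "done"; at: number };

// UI-facing state: a loading flag, a human-readable event log,
// and the answer accumulated so far.
export interface UiState {
  loading: boolean;
  log: string[];
  answer: string;
}

// Fold one event into the state; the UI re-renders from the result.
export function reduce(state: UiState, ev: AppEvent): UiState {
  switch (ev.kind) {
    case "request_sent":
      return { loading: true, log: [...state.log, "request sent"], answer: "" };
    case "token":
      return { ...state, answer: state.answer + ev.text };
    case "done":
      return { ...state, loading: false, log: [...state.log, "done"] };
  }
}
```

The `loading` flag drives a spinner; the `log` array is the event stream suggested above.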

Establish Contribution and Community Guidelines

This is a just-in-case item: not needed yet, but a useful reminder to have if we do grow.

Overview

As our project grows and we begin to receive more contributions from the community, it's important to have clear guidelines in place. This issue is to track the creation of comprehensive contribution and community guidelines to ensure a welcoming and productive environment for all contributors.

Task List for Creating Guidelines

  • Research best practices for contribution and community guidelines.

    • Review guidelines from other successful open-source projects.
    • Compile a list of common policies and procedures that we should consider.
  • Draft the Contribution Guidelines.

    • Outline the process for submitting contributions, including coding standards, pull request process, etc.
    • Specify the types of contributions we are looking for and any that we are not.
    • Detail the setup process for development environment and any tools or tests that contributors should run.
  • Draft the Community Guidelines.

    • Establish a code of conduct that promotes a respectful and inclusive environment.
    • Create a reporting process for any incidents of misconduct.
    • Define the roles and responsibilities of community members, including maintainers and contributors.
  • Review and finalize the guidelines.

    • Have the guidelines reviewed by project maintainers and, if possible, by external parties for feedback.
    • Incorporate feedback and revise the guidelines accordingly.
  • Publish the guidelines.

    • Add the Contribution Guidelines to the CONTRIBUTING.md file in the repository.
    • Add the Community Guidelines to the CODE_OF_CONDUCT.md file in the repository.
    • Announce the new guidelines on the project's communication channels.
  • Monitor and iterate on the guidelines.

    • Set a schedule to review the guidelines periodically.
    • Be open to feedback from the community and make adjustments as the project evolves.

Additional Notes

  • Ensure that the guidelines are easy to understand and follow.
  • Consider the legal implications of the guidelines and consult with a legal professional if necessary.

References

Please add any additional tasks or notes that might be relevant as we work through this process.

MVP Plans

Putting this here for communication and because if I don't take notes I will lose my train of thought. Thanks, ChatGPT, for helping me organize this.

Creating a macOS application that integrates with a local Ollama model and is triggered by a hotkey involves several steps. Here's a high-level overview of the tasks you'd need to accomplish to create a minimum viable product (MVP):

  1. Set Up a Local Server for the Ollama Model: We will need a local server that can act as a "parser" in front of the Ollama service (which would just be started as a brew service) and handle requests. This server would receive text and return the model's response. (It can eventually support plugins for prompt-engineering features.)

  2. Develop a Tauri Application: Tauri is a framework for building desktop applications using web technologies. You can use it to create a lightweight and secure window for your chat interface.

  3. Implement Hotkey Functionality: We will need to use a library that can register global hotkeys on macOS. This library would listen for your specific hotkey combination and trigger the Tauri window to open.

  4. Clipboard and Selection Integration: The application should be able to grab the current selection or clipboard content when the hotkey is pressed.

  5. Create a User Interface: The Tauri window should have a user-friendly interface for chatting with the Ollama model, including a model selector and a chat display.

  6. Communication Between Tauri and the Local Server: Implement the logic to pass messages back and forth between the Tauri application and the local Ollama server.

  7. Packaging and Distribution: Once your application is ready, you'll need to package it for distribution so that others can easily install and use it on their macOS systems.
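Step 6 (communication with the local server) can be sketched concretely against Ollama's documented `POST /api/generate` endpoint, which streams newline-delimited JSON where each line carries a partial `response` field until a final line with `done: true`. The helper names below are mine; only the endpoint's request/response shape is Ollama's:

```typescript
// Build the JSON body for Ollama's POST /api/generate endpoint.
export function buildGenerateBody(model: string, prompt: string): string {
  return JSON.stringify({ model, prompt, stream: true });
}

// Join the partial `response` fields of a streamed reply back into
// the full answer. Blank lines are skipped.
export function joinStreamChunks(raw: string): string {
  return raw
    .split("\n")
    .filter((line) => line.trim().length > 0)
    .map((line) => JSON.parse(line) as { response?: string })
    .map((chunk) => chunk.response ?? "")
    .join("");
}
```

In the app, each parsed chunk would be appended to the chat display as it arrives rather than joined at the end.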

Now, let's create a task list for a GitHub issue to organize the development of this MVP:


Title: Develop a macOS Application for Local Ollama Model Interaction with Global Hotkey

Body:

Objective

Create a macOS application that allows users to interact with a local Ollama model using a global hotkey. The application will present a chat interface where users can send and receive messages from the Ollama model.

MVP Features

  • Global Hotkey Activation: Application should be triggered by a hotkey (e.g., Shift + Space).
  • Clipboard/Selection Integration: Automatically use the selected text or clipboard content when the application is activated.
  • Chat Interface: A simple and clean chat window for sending and receiving messages.
  • Model Selector: A dropdown to select the Ollama model if multiple models are available.
  • Local Server Communication: The ability to send requests to and receive responses from a local Ollama server.
  • Tauri Window: Use Tauri for creating the application window to ensure a lightweight and secure application.

Tasks

  • Set up a local server capable of running the Ollama model.
    • This is just the Ollama software running on localhost.
    • We will need a tiny Rust server to sit in between and format requests to the HTTP endpoint with the correct model set.
    • Eventually this can be expanded to support a Modelfile maker or other functionality.
  • #7
  • #9
  • #11
  • Develop the functionality to pass messages to the local server and display responses in the chat interface.
  • #12
  • #10
  • #20
  • #22
  • #21
  • #23
  • #24
  • #29
  • #28

Potential Libraries/Tools

  • Tauri for the application framework.
  • Rust or Go for the backend server. (I don't know Go and am learning Rust; open to PRs in either.)
  • A macOS hotkey library for global hotkey registration.
  • Frontend technologies (HTML, CSS, JS) for the chat interface. (My specialty)

Testing & Validation

  • Ensure the hotkey consistently activates the application.
  • Verify the application correctly handles text selection and clipboard content.
  • Test communication with the local Ollama server.
  • Validate the user interface is intuitive and responsive.

This issue outlines the basic requirements and tasks for the project. You can add more details or break down the tasks further as needed. Once you have this issue created, you can start organizing the work into milestones, assigning tasks to contributors, and tracking progress.

Screenshots

This issue only exists to host screenshots etc... that we don't want in the git repo.

Distribute macOS App via Homebrew

Overview

This issue tracks the progress of preparing and distributing our macOS app through Homebrew. This includes determining the best practices for submission, understanding Homebrew and Homebrew Cask, and ensuring we meet all requirements for a smooth user experience.

Task List for Homebrew Distribution

  • Research how to distribute a macOS app through Homebrew.

    • Determine if our app qualifies for a formula or if it should be a Cask.
    • Understand the differences between Homebrew and Homebrew Cask in terms of distribution.
  • Investigate the submission process for Homebrew.

    • Find out what the requirements are for submitting a formula/Cask.
    • Check if there are any costs associated with the submission and distribution.
  • Learn about the review process for Homebrew submissions.

    • Document the steps involved in the review process.
    • Identify common reasons for rejection and how to avoid them.
  • Prepare the app for submission.

    • Ensure the app meets all Homebrew submission criteria.
    • Create a formula/Cask file for the app.
    • Test the installation process locally.
  • Submit the app to Homebrew.

    • Open a pull request to the Homebrew/homebrew-cask or Homebrew/homebrew-core repository.
    • Monitor the pull request for feedback and respond to any required changes.
  • Post-submission tasks.

    • Set up a system to monitor the app's version updates for maintaining the Homebrew formula/Cask.
    • Plan for user support post-distribution.
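As a concrete starting point for the "Create a formula/Cask file" step above, here is a minimal Cask sketch. Every field (version, checksum, `OWNER` in the URLs, artifact name) is a placeholder to be filled in from an actual release; this is not a submitted or tested Cask:

```ruby
cask "hoof" do
  version "0.1.0"   # placeholder: real release version
  sha256 "0000000000000000000000000000000000000000000000000000000000000000" # placeholder checksum

  # placeholder URL: adjust owner, tag format, and artifact name
  url "https://github.com/OWNER/hoof/releases/download/v#{version}/hoof-#{version}.dmg"
  name "hoof"
  desc "Hotkey-triggered chat interface for a local Ollama model"
  homepage "https://github.com/OWNER/hoof" # placeholder

  app "hoof.app"
end
```

The "Test the installation process locally" task then becomes `brew install --cask ./hoof.rb` against this file before any pull request is opened.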

Additional Notes

  • Ensure that the app complies with Homebrew's acceptance criteria.
  • Consider setting up a tap for easier updates and more control over the distribution process if the official repositories are not suitable.

References

Please update this issue with any progress or additional information found during the research phase.

UI - Settings

It would be great to have a simple UI view for settings.

  • Default model
  • Hotkey
  • Open on active display | Open on main display
  • Don't close when focus is lost (pinning mode)
  • Font size
  • Ollama API URL
  • Enable/disable automatic copy of selection to input field
  • Enable/disable automatic copy of clipboard to input field
  • Option to unload model on idle after n minutes? (maybe?)
  • Toggle streaming vs complete output

etc...
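The list above maps naturally onto a typed settings object with defaults. A sketch; the field names and default values here are my placeholders, not hoof's actual configuration:

```typescript
// Hypothetical settings shape mirroring the options listed above.
export interface Settings {
  defaultModel: string;
  hotkey: string;
  openOnActiveDisplay: boolean;   // false = open on main display
  pinned: boolean;                // don't close when focus is lost
  fontSize: number;
  ollamaUrl: string;
  copySelectionToInput: boolean;
  copyClipboardToInput: boolean;
  unloadAfterIdleMinutes: number | null; // null = never unload
  streamOutput: boolean;          // streaming vs complete output
}

export const defaultSettings: Settings = {
  defaultModel: "llama2",
  hotkey: "Shift+Space",
  openOnActiveDisplay: true,
  pinned: false,
  fontSize: 14,
  ollamaUrl: "http://localhost:11434",
  copySelectionToInput: true,
  copyClipboardToInput: false,
  unloadAfterIdleMinutes: null,
  streamOutput: true,
};
```

A settings view then reduces to rendering one control per field and persisting the object.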

Implement logic to capture the current selection or clipboard content.

The application should be able to grab the current selection or clipboard content when the hotkey is pressed.


Sam -

As a user, I'd like to be able to highlight text in any app on macOS, trigger the app with a hotkey, and have it pass the selected text to Ollama along with a prompt or a list of prompts to choose from.

Examples:

  • <highlight text in an email>, <press hotkey>, select a function (prompt) such as "Make language more formal", <ollama processes the request>, clipboard is updated with the response and/or the app pastes the response back into an active text input field.
  • <copy a list of filenames from a directory, e.g. ls -1>, <press hotkey>, select something like 'ask ollama', input something like 'in the file listing selected, are there any files with the title Sam?', <response>
  • etc...
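The examples above share one step: combining a chosen "function" prompt (or a free-form question) with the captured selection into a single prompt for Ollama. A minimal sketch; the function names and template wording are illustrative, not hoof's actual prompts:

```typescript
// Named "functions" the user can pick after pressing the hotkey.
export const promptFunctions: Record<string, string> = {
  formalize: "Rewrite the following text in more formal language:",
  ask: "", // free-form: the user supplies their own question
};

// Combine the function's prompt, an optional user question, and the
// captured selection/clipboard text into one prompt string.
export function buildPrompt(
  fnPrompt: string,
  selection: string,
  question = ""
): string {
  const header = [fnPrompt, question].filter((s) => s.length > 0).join("\n");
  return `${header}\n\n---\n${selection}`;
}
```

The resulting string is what would be sent as the `prompt` field of the request to the local Ollama server.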

Design and implement the Tauri application window and user interface.

This needs a lot of work. I would like to see a UI not dissimilar to OpenAI's or Vercel's AgentAI; bear in mind that the UI will also need to be coherent across the two states, spotlight and chat. See this repo for a demo of the functionality with hotkeys.

Please feel free to add tailwind to the project and submit layout ideas. Can be as simple as an excalidraw sketch.
