
Rich Lysakowski, Ph.D.

AI and Business Intelligence Developer, Data Scientist, Senior Business Analyst and Process Engineer, DBA and Systems Admin

Passionate about applying leading-edge technologies for intelligent data discovery and collection, analytics, visualization, and reporting in business and industrial environments. Technical expertise encompasses the full SDLC: systems analysis and design, programming, debugging, deployment, support, and training. Favorite languages are Python, SQL, and web languages (HTML, CSS, JavaScript), used on Windows, Linux, Docker, and Azure/AWS/GCP cloud environments as required.

Current R&D Work

I actively research and develop applications and training to harness Generative AI, Prompt Engineering, and intelligent workflow-chaining applications built on Large Language Models. I have experience with hundreds of Python packages for developing applications and utilities for AI, intelligent advanced predictive analytics, ML, deep learning, and web-based systems. I have 30+ years of experience as a trainer and educator, developing and delivering training curricula, courses, workshops, and short courses. I teach from time to time, but my passion is training people to use tools and methods that help them learn faster and remember longer.

I love applying my business analysis and technical skills in large and small enterprises. I am an acknowledged subject matter expert (SME) in data and records management, IT systems, and scientific and laboratory informatics across value chains, from pharmaceutical and biotech R&D through manufacturing, finance, and trading. Contact me to engage my skills on your project.

Development Work

In my personal time, I actively learn and apply State-Of-The-Art (SOTA) Generative AI tools and packages. I am working to SMARTLY SIMPLIFY advanced AI and data analytics. There are (too) many steep learning curves for people to climb before they can apply AI technologies. There is TOO MUCH NOISE ABOUT AI -- and NOT ENOUGH SIGNAL! People must discover first-hand what really works, and raise the baseline of AI wisdom high above the popular media's noise. My job as a trainer is to transfer hard implementation skills and instill know-how in my trainees and clients. AI technology is advancing FAST toward "AGI". First came the pitter-patter of baby steps; now the gallop has started, with leapfrogs every other week. Major breakthroughs arriving within months in 2024 will cause seismic shifts in the balance of power between humans and AI agents.

Every week I test yet another LLM agent-builder application, such as OpenDevin, AutoGPT, MetaGPT, LocalGPT, DoctorGPT, babyAGI, DB-GPT, Pinokio, GPT-Engineer, AutoGen Studio, and other domain-specific tools. My goal is to learn Large Language Models (LLMs) as deeply as possible and apply them to text, images, sound, and music. The aim is comprehensive, multi-modal generative experiences that accelerate human learning and individual power. One current project is a State-of-the-Art (SOTA) AI Guide and Developer Assistant for Prompt Engineering and Intelligent Agent Chaining Systems. New bleeding-edge tools appear weekly, far too fast for regular humans to consume or master in a normal work month. We need automated, real-time knowledge-acquisition tools that simplify and explain new tools to newcomers, "on demand" and "just-in-time", before the spark of human curiosity is extinguished.
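As a concrete illustration of the kind of workflow chaining these agent builders automate, here is a minimal two-step prompt chain in Python. It assumes a local, Ollama-style server exposing /api/generate on localhost:11434; the model name, the ask() helper, and article.txt are illustrative and not taken from any specific tool above.

```python
import requests

OLLAMA_URL = "http://localhost:11434/api/generate"   # assumed local LLM endpoint
MODEL = "llama3"                                      # illustrative model name


def ask(prompt: str) -> str:
    """Send one prompt to the local model and return its full text response."""
    resp = requests.post(
        OLLAMA_URL,
        json={"model": MODEL, "prompt": prompt, "stream": False},
        timeout=120,
    )
    resp.raise_for_status()
    return resp.json()["response"]


# Step 1: summarize a source document.  Step 2: chain the summary into a new prompt.
article = open("article.txt").read()                  # illustrative input file
summary = ask(f"Summarize this article in 5 bullet points:\n\n{article}")
questions = ask(f"Write 3 quiz questions based on this summary:\n\n{summary}")
print(questions)
```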

Democratized AI, Powered by End Users Throughout the World

I am strongly committed to helping individuals run high-performing, "democratized" LOCAL AI systems -- that is, 100% OFF-THE-CLOUD -- without requiring ANY major cloud platform. LOCAL AI systems must not be treated as mere "edge" nodes on the global internet. What people call "The Cloud" is a handful of Big Tech platform vendors WITH A COMMON GOAL: TO INSOURCE AND CONTROL AS MANY TECH JOBS, AND AS MUCH AI POWER, AS POSSIBLE. They share a commitment to lead the AI arms race and to beat or consume competitors wherever they arise. Big Tech companies are steadily and substantially in-sourcing tech jobs into their own corporations, leaving millions of citizens without meaningful work.

Get Out of the Borg - And Back Into User-Centric Computing

If ordinary citizens don't re-power, re-source, and revert "The Cloud" back to "The Edge", then entire segments of white-, brown-, and blue-collar workers will be marginalized by 2030. Big Tech cloud vendors will CONTROL ALL JOBS they insource. Once they control ("OWN") those jobs, they will place them wherever quality labor is cheapest. AI agents require only electricity to work. AI software agents don't need salaries, nutrition, nurturing, sleep, exercise, sex, emotional support, or other "human overhead". AI agents are an obvious endpoint for "in-sourcing" many jobs. The argument that AI cannot produce the same high level of quality as humans is simply wrong: AI already (often) produces higher quality and consistency for many tasks and work products, and AI systems' quality and sophistication are improving faster than most people can comprehend.

In the 2000s, Mike Hammer's and SAP's re-engineering waves moved jobs off-shore. In the 2020s, more jobs move off-ground to the cloud, and then disappear to AI agents and robots. Humans left on "The Edge" will be marginalized. The Cloud moves economic power and control away from individuals and gives it to Big Tech. IT managers, CIOs, CTOs, and CDOs, beware: AI is coming for your job sooner than you think. Local, democratized AI will bring power back to the people.

Individuals must ensure that AI stays democratized and runs equally well locally -- off-the-cloud -- going online only when absolutely required. Otherwise, Big Tech cloud providers will prevail.
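A minimal sketch of what "local, off-the-cloud" inference looks like in practice, assuming the llama-cpp-python package and a locally downloaded GGUF model file (the model path and prompt are illustrative):

```python
from llama_cpp import Llama

# Load a model that lives entirely on the local machine -- no cloud API involved.
llm = Llama(model_path="./models/local-model.gguf", n_ctx=4096)  # illustrative path

result = llm(
    "Explain in two sentences why local inference keeps data under the user's control.",
    max_tokens=128,
)
print(result["choices"][0]["text"])
```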

Remember "Human Lives Matter" (HLM)! Cloud computing is a tool, not the endpoint for society.

Updated: 2024.04.12

https://www.linkedin.com/in/rich-lysakowski-phd


Computer Operating Systems

Technical Skills

👨‍💻 Python Libraries

{"Author": "Rich Lysakowski", "Updated": "2023-08-25" }

Rich Lysakowski's Projects

pca

Reducing n-dimensional data to 2 dimensions for visualization, to help find the best-fitting function or other regression model
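A minimal sketch of that idea using scikit-learn on synthetic data (this is my own illustration, not code from the repository):

```python
import numpy as np
import matplotlib.pyplot as plt
from sklearn.decomposition import PCA

# Synthetic example: 200 samples with 10 correlated features.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 10)) @ rng.normal(size=(10, 10))

pca = PCA(n_components=2)              # project the n-dimensional data onto 2 axes
X_2d = pca.fit_transform(X)
print("Explained variance ratio:", pca.explained_variance_ratio_)

plt.scatter(X_2d[:, 0], X_2d[:, 1], s=10)
plt.xlabel("PC 1")
plt.ylabel("PC 2")
plt.title("Data projected onto the first two principal components")
plt.show()
```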

pda_book

Code examples for data science using Python

perfplot

📈 Performance analysis for Python snippets
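A typical usage sketch based on perfplot's show() interface (the kernels compared here are my own illustration):

```python
import numpy as np
import perfplot

# Compare the built-in sum() against numpy.sum() across growing array sizes.
perfplot.show(
    setup=lambda n: np.random.rand(n),               # called once per problem size n
    kernels=[lambda a: sum(a), lambda a: np.sum(a)],
    labels=["builtin sum", "numpy.sum"],
    n_range=[2**k for k in range(18)],
    xlabel="len(a)",
)
```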

pexpect

A Python module for controlling interactive programs in a pseudo-terminal
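A short usage sketch driving an interactive Python interpreter from Python (the commands sent are illustrative):

```python
import pexpect

# Spawn an interactive Python interpreter in a pseudo-terminal.
child = pexpect.spawn("python3", encoding="utf-8")
child.expect(">>> ")                  # wait for the interpreter prompt
child.sendline("print(6 * 7)")
child.expect(">>> ")                  # wait for the next prompt
print(child.before)                   # everything the child printed before that prompt
child.sendline("exit()")
child.close()
```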

pgcontents

A Postgres-backed ContentsManager implementation for Jupyter

pipeline

PipelineAI: Real-Time Enterprise AI Platform

piphone

PiPhone - A DIY cellphone based on Raspberry Pi

pipreqs

pipreqs - Generate pip requirements.txt file based on imports of any project. Looking for maintainers to move this project forward.

planter

Generate PlantUML ER diagram textual description from PostgreSQL tables

plur-google-nlp-toolkit

PLUR (Programming-Language Understanding and Repair) is a collection of source code datasets suitable for graph-based machine learning. We provide scripts for downloading, processing, and loading the datasets. This is done by offering a unified API and data structures for all datasets.

poachplate

A tool for creating boilerplate for Python command line applications.

poetry

Python dependency management and packaging made easy.

postgap

Linking GWAS studies to genes through cis-regulatory datasets

projectreward_options_spread

Software to shortlist and find the best options spreads available for a given stock and visualize them using payoff graphs.
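As an illustration of the payoff graphs such a tool produces, here is a minimal bull call spread payoff plotted with NumPy and Matplotlib (strikes and premiums are made-up numbers; this is not the project's own code):

```python
import numpy as np
import matplotlib.pyplot as plt

# Hypothetical bull call spread: buy a 100-strike call, sell a 110-strike call.
long_strike, short_strike = 100.0, 110.0
long_premium, short_premium = 5.0, 2.0          # assumed option premiums

prices = np.linspace(80, 130, 200)              # underlying price at expiry
long_leg = np.maximum(prices - long_strike, 0) - long_premium
short_leg = short_premium - np.maximum(prices - short_strike, 0)
spread_payoff = long_leg + short_leg

plt.plot(prices, spread_payoff)
plt.axhline(0, color="gray", linewidth=0.8)
plt.xlabel("Underlying price at expiry")
plt.ylabel("Profit / loss per share")
plt.title("Bull call spread payoff (illustrative)")
plt.show()
```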

promo

Prosody Morph: a Python library for manipulating pitch and duration in an algorithmic way, for resynthesizing speech.

propertyinspectionprediction

A Kaggle challenge from Liberty Mutual Insurance, a Fortune 100 insurer. Many newly insured properties receive a home inspection that reviews key attributes such as the foundation, roof, windows, and siding, and the results help Liberty Mutual decide whether to insure the property. The task is to predict a transformed count of hazards or pre-existing damages from a dataset of property information, so that high-risk homes needing additional examination can be identified more accurately.
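A baseline modeling sketch for that task, assuming the competition's train.csv with an Id column and a Hazard target (column names from memory; this is a generic gradient-boosting baseline, not the project's actual solution):

```python
import pandas as pd
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.model_selection import train_test_split

# Assumed layout: an Id index, a numeric Hazard target, and mixed feature columns.
df = pd.read_csv("train.csv", index_col="Id")
y = df.pop("Hazard")
X = pd.get_dummies(df)                          # one-hot encode categorical columns

X_train, X_val, y_train, y_val = train_test_split(X, y, random_state=0)
model = GradientBoostingRegressor(random_state=0)
model.fit(X_train, y_train)
print("Validation R^2:", model.score(X_val, y_val))
```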

prophet

Financial markets analysis framework for programmers

prudential-life-insurance-assessment

Part of a predictive analytics project I completed: building a model that accurately predicts risk-assessment classes for people enrolled with Prudential Life Insurance.
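A comparable classification baseline, assuming the Prudential competition's train.csv with an Id column and an ordinal Response target (again a generic sketch, not the project's actual model):

```python
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

# Assumed layout: an Id index, a Response class label, and mixed feature columns.
df = pd.read_csv("train.csv", index_col="Id")
y = df.pop("Response")
X = pd.get_dummies(df).fillna(-1)               # encode categoricals, patch missing values

X_train, X_val, y_train, y_val = train_test_split(X, y, stratify=y, random_state=0)
clf = RandomForestClassifier(n_estimators=300, random_state=0)
clf.fit(X_train, y_train)
print("Validation accuracy:", clf.score(X_val, y_val))
```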

pserve

Fast and flexible RESTful API server providing access to Python from many languages and systems.

py-pkgs

Open source book about making Python packages.
