
Comparing changes

base repository: AI-Hypercomputer/jetstream-pytorch
base: jetstream-v0.2.2
head repository: AI-Hypercomputer/jetstream-pytorch
compare: jetstream-v0.2.3
  • 19 commits
  • 34 files changed
  • 7 contributors

Commits on Jun 4, 2024

  1. Enable jax profiler server in run with ray (#112)

    * add jax profiler server
    
    * update jetstream
    FanhaiLu1 authored Jun 4, 2024 (fe328bb)
    (see the profiler sketch after this list)
  2. Add readme for interleaving multiple hosts with Ray (#114)

    * add interleave multiple host with ray readme
    
    * add interleave multiple host with ray readme
    FanhaiLu1 authored Jun 4, 2024 (f4426c2)
  3. Fix conversion bug (#116)

    * Fix
    
    * Format
    yeandy authored Jun 4, 2024 (7f6e45f)
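As a minimal illustration of what the profiler change in #112 (the first commit above) enables, the sketch below shows how a JAX profiler server is typically started and stopped. The port number and placement are assumptions for illustration, not details taken from the commit.

```python
import jax.profiler

# Start the JAX profiler server so traces can be captured with the JAX
# profiling tools / TensorBoard while the model server is running.
# Port 9999 is an arbitrary example value.
jax.profiler.start_server(9999)

# ... run the serving workload (prefill / generate) ...

# Stop the profiler server once tracing is no longer needed.
jax.profiler.stop_server()
```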

Commits on Jun 6, 2024

  1. Integrate disaggregated serving with JetStream (#117)

    * add disaggregated server with ray support
    
    * add run_server with ray
    
    * format
    FanhaiLu1 authored Jun 6, 2024 (52ec00f)

Commits on Jun 7, 2024

  1. Support HF LLaMA ckpt conversion (#118)

    * support converting hf checkpoint
    lsy323 authored Jun 7, 2024 (94b576c)
    (see the checkpoint-conversion sketch after this list)
  2. e07aee6
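The Hugging Face checkpoint support in #118 comes down to remapping parameter names from the HF LLaMA layout to the layout the model code expects. The sketch below is a hypothetical illustration of that idea; the actual key mapping and any weight transformations in the commit may differ.

```python
import re

# Hypothetical mapping from Hugging Face LLaMA parameter names to
# Meta-style names; the real converter may handle more cases (e.g.
# rotary-embedding permutations or fused qkv weights).
_HF_TO_META = {
    r"model\.embed_tokens\.weight": "tok_embeddings.weight",
    r"model\.layers\.(\d+)\.self_attn\.q_proj\.weight": r"layers.\1.attention.wq.weight",
    r"model\.layers\.(\d+)\.self_attn\.k_proj\.weight": r"layers.\1.attention.wk.weight",
    r"model\.layers\.(\d+)\.self_attn\.v_proj\.weight": r"layers.\1.attention.wv.weight",
    r"model\.norm\.weight": "norm.weight",
    r"lm_head\.weight": "output.weight",
}

def remap_hf_state_dict(hf_state_dict):
    """Rename HF LLaMA keys to Meta-style names (illustrative only)."""
    out = {}
    for name, tensor in hf_state_dict.items():
        for pattern, replacement in _HF_TO_META.items():
            if re.fullmatch(pattern, name):
                out[re.sub(pattern, replacement, name)] = tensor
                break
        else:
            out[name] = tensor  # keep unmapped keys unchanged
    return out
```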

Commits on Jun 10, 2024

  1. Add support for Llama3-70b (#101)

    * Add support for Llama3-70b
    
    * Fix unit tests
    
    * assert model_name is one of llama-2 or llama-3 for weight sharding
    
    * Fix lint
    
    * Revert separate shardings for llama-2 and llama-3
    
    * Fix lint
    bhavya01 authored Jun 10, 2024 (4535bdf)
  2. 87b8d92

Commits on Jun 11, 2024

  1. Mixtral enablement (#120)

    * Initial Mixtral enablement.
    
    * Adds the mistral tokenizer model.
    
    * Updates the convert checkpoint file to handle mistral model.
    
    * Renames the typo of the model name.
    
    * Fixing checkpoint loading. Still has some issues; pushing to debug.
    
    * Running on CPU works; temporarily disable the generate JIT to confirm it is moving. The outputs don't make sense yet because the weights are not loaded.
    
    * Fix checkpoint loading issue. Right now loading from the gpt-fast converter with qkv fusion.
    
    * Fix the ckpt conversion script for mistral model. Fix the freqs_cis for loading pth file.
    
    * Add quantized layer for MoE quantization
    
    * Add the huggingface download script. Improved the convert checkpoints logging.
    
    * Clean up and fix lint errors.
    
    * Missing cleanups.
    
    * Add instructions for Mixtral.
    
    * Renames everything from mistral to mixtral.
    
    * Fix more lint errors.
    
    * Removes the unnecessary checkpoint name mapping from the original Mixtral checkpoints.
    
    * Fix the model calling arg sequence; Fix the checkpoint convert script.
    
    ---------
    
    Co-authored-by: Han Qi 
    wang2yn84 and qihqi authored Jun 11, 2024 (d6bf068)

Commits on Jun 12, 2024

  1. e2ee7dd
  2. Add activation quantization support to per-channel quantized linear layers (#105)
    
    * add activation quant support
    
    * pyink
    
    * fix dtype
    
    * uncomment prompts
    
    * try fix test
    
    add debug print to debug
    
    remove print, add bias to asym quant tests
    
    lint
    
    * add comment
    lsy323 authored Jun 12, 2024 (8a125b6)
    (see the quantization sketch after this list)
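For context on #105, the sketch below shows one common way to combine per-channel int8 weight quantization with dynamically quantized activations in a linear layer. It is a minimal illustration under assumed conventions (symmetric scales, per-tensor activation scale), not the repository's implementation.

```python
import jax.numpy as jnp

def quantize_weights_per_channel(w):
    # Symmetric int8 quantization with one scale per output channel.
    scale = jnp.max(jnp.abs(w), axis=0, keepdims=True) / 127.0
    w_q = jnp.clip(jnp.round(w / scale), -128, 127).astype(jnp.int8)
    return w_q, scale

def quantized_linear(x, w_q, w_scale):
    # Dynamically quantize activations to int8 with a per-tensor scale,
    # run the matmul on integers, then rescale the int32 accumulator.
    x_scale = jnp.max(jnp.abs(x)) / 127.0
    x_q = jnp.clip(jnp.round(x / x_scale), -128, 127).astype(jnp.int8)
    acc = jnp.matmul(x_q.astype(jnp.int32), w_q.astype(jnp.int32))
    return acc.astype(jnp.float32) * (x_scale * w_scale)
```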

Commits on Jun 13, 2024

  1. Remove JSON config mangling for Gemma ckpt (#124)

    update gemma convert
    lsy323 authored Jun 13, 2024 (fe8dbde)

Commits on Jun 14, 2024

  1. 97aaeae
  2. Add lock in prefill and generate to prevent starvation (#126)

    add lock for prefill and generate to prevent starvation
    FanhaiLu1 authored Jun 14, 2024 (dc90aea)
    (see the locking sketch after this list)
  3. d8d2da4
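The starvation fix in #126 is easiest to picture with a small sketch: prefill and generate steps share a single lock, so a burst of prefill requests releases the lock between steps and decode steps get a chance to run. The class and method names below are illustrative assumptions, not the repository's actual API.

```python
import threading

class Engine:
    """Illustrative only: one lock shared by prefill and generate steps."""

    def __init__(self):
        self._step_lock = threading.Lock()

    def prefill(self, request):
        # Each prefill step acquires the shared lock, so prefill work is
        # broken into lock-sized chunks rather than monopolizing the engine.
        with self._step_lock:
            ...  # run the prefill computation for `request`

    def generate(self):
        # Decode steps take the same lock, so they interleave with prefill
        # instead of being starved by a continuous stream of new requests.
        with self._step_lock:
            ...  # run one decode/generate step
```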

Commits on Jun 15, 2024

  1. Update README.md (#128)

    qihqi authored Jun 15, 2024 (8bffb5d)

Commits on Jun 17, 2024

  1. Update summary.md (#125)

    qihqi authored Jun 17, 2024 (7526a90)
  2. Update README.md (#129)

    bhavya01 authored Jun 17, 2024 (aa90b05)

Commits on Jun 19, 2024

  1. make sure GPU works (#130)

    * make sure GPU works
    qihqi authored Jun 19, 2024 (fa1f120)