
JetBrains AI Assistant–Ollama support

I talked about Ollama before as a way to run a Large Language Model (LLM) locally. This opens the door to trying out multiple models at a low(er) cost (although also at lower performance) and can be interesting if you are not allowed to share any data with an AI provider; for example, you are a developer but your employer doesn’t allow you to use AI tools for that reason.

If this is a use case that is relevant for you, then I have some good news: with the latest version of the JetBrains AI Assistant (available in JetBrains Rider but also in other JetBrains IDEs) you can now use Ollama as a local model provider.
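Before you enable the integration, make sure that Ollama is installed and running on your machine and that you have pulled at least one model. A quick check from the command line (using llama3 here purely as an example model name):

ollama pull llama3
ollama list

The models returned by ollama list are the ones that will later show up in the AI Assistant.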

Let me show you how to use this:

  • Open JetBrains Rider (or any other IDE that integrates the JetBrains AI Assistant)
  • Hit Ctrl-Alt-S or go to the Settings through the Settings icon at the top right

 


  • Go to the AI Assistant section under Tools

  • Check the Enable Ollama checkbox.
    • A warning message appears about Data Sharing with Third-Party AI Service Providers.
    • Click OK to continue
  • Hit Test Connection (if this fails, see the quick check below)
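Test Connection tries to reach the local Ollama API, which by default listens on http://localhost:11434. If the test fails, first verify that Ollama is actually running, for example with (assuming the default port):

curl http://localhost:11434

which should simply answer with "Ollama is running".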


Now we can use a local language model in the chat window.

Another way to achieve this is directly through the chat window:

  • Click on the current language model in the AI Assistant window

  • Click on Connect… in the Ollama section

 

  • After the connection is made, you can select any of the locally installed models

 


Remark: At the moment of writing this post, the local language model can only be used in the chat window, not in other places in the IDE.

A nice feature of the JetBrains AI Assistant is that it offers a prompt library that you can tweak for specific purposes. For example, here is the prompt to write documentation for a C# code file:

 

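If you want to adapt such a prompt to your own conventions, a tweaked version could look something along these lines (an illustrative sketch only, not the actual built-in JetBrains prompt):

Write XML documentation comments for the selected C# code. Document every public type and member, include <summary>, <param> and <returns> tags, mention thrown exceptions with <exception> tags, and keep the summaries to one or two sentences. Do not change the code itself.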
Remark: The first time I tried to use the JetBrains AI Assistant, I got stuck on an activation issue. I was able to solve it by logging out through Manage Licenses and logging in again. (More info here)

More information

Running large language models locally using Ollama

Github Copilot– Some experimentation

Ollama
