Local LLM - can WB HTTP perform it?

Started by spl, May 03, 2026, 08:16:39 AM


spl

Figured before I got dis'd from the tangent I posted on a previous thread, I did want to note that a local LLM could be added as a bot. The simple PS script below implements a request. In WB 64-bit it could be done with my STD-OUT function, or rewritten as an HTTP request. It does require Ollama to be installed [setup: from https://ollama.com/download]:

$body = @{
 model = "llama3.1"
 prompt = "Is Winbatch an Important Language to learn?"
 stream = $false
} | ConvertTo-Json -Depth 10 -Compress

# Send the POST request
$results = Invoke-RestMethod -Uri "http://localhost:11434/api/generate" -Method Post -ContentType "application/json" -Body $body
$results.response
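If anyone wants to poke at the same endpoint without PowerShell, here is a minimal stdlib-Python sketch of the same call. This is my own translation, not from the post: it assumes Ollama's default port 11434 and the llama3.1 model shown above.

```python
import json
from urllib import request

def build_body(prompt, model="llama3.1"):
    # Same fields as the PowerShell $body above; stream=False asks the
    # server for one complete JSON object instead of NDJSON chunks.
    return json.dumps({"model": model, "prompt": prompt, "stream": False})

def ask_ollama(prompt, url="http://localhost:11434/api/generate"):
    req = request.Request(
        url,
        data=build_body(prompt).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    # Blocks until Ollama finishes generating, then pulls the answer
    # out of the "response" field, like $results.response above.
    with request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]

if __name__ == "__main__":
    print(ask_ollama("Is Winbatch an Important Language to learn?"))
```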

and get....

WinBatch! A niche programming language that's still around, but its importance and relevance have evolved over the years.

WinBatch is a scripting language developed by Wilson WindowWare (WWWW) in 1990. Its primary focus was on automating Windows tasks, providing a way to write scripts that interact with the operating system, applications, and files. WinBatch's strengths lie in:

1. **Automation**: Scripting repetitive tasks, like file management, data import/export, or system configuration.
2. **GUI interaction**: Automating interactions with graphical user interfaces (GUIs), such as clicking buttons, filling forms, and sending keys.
3. **System integration**: Interfacing with Windows APIs to access hardware, network resources, and other OS features.

While WinBatch is still maintained by its developers, its importance has diminished somewhat since its peak in the late 1990s and early 2000s. Several factors contribute to this:

1. **Microsoft's scripting tools**: Microsoft released various scripting languages (e.g., Windows Script Host (WSH), PowerShell) that provide similar functionality.
2. **Rise of automation frameworks**: Tools like AutoIT, PyAutoGUI, and Robot Framework offer more comprehensive and modern alternatives for automating GUI interactions and system tasks.
3. **Shift to other programming languages**: As software development has become increasingly complex, developers have moved towards more versatile and widely-used languages (e.g., Python, JavaScript) that can handle a broader range of tasks.

However, there are still scenarios where WinBatch might be the preferred choice:

1. **Legacy system maintenance**: Organizations with existing WinBatch scripts may prefer to continue using them due to familiarity and investment in custom solutions.
2. **High-speed automation**: For certain tasks, like rapid data processing or high-volume file manipulation, WinBatch's native Windows API access can still offer performance benefits.
3. **Specific business needs**: Some industries (e.g., financial services, healthcare) may have unique requirements that make WinBatch a suitable choice for automating specific tasks.

To determine whether learning WinBatch is important to you, ask yourself:
* Do I need to automate Windows-specific tasks or interact with legacy systems?
* Am I already familiar with programming concepts and looking for an alternative scripting language?
* Are there specific industry or business requirements that make WinBatch a suitable choice?

If the answer is yes, then learning WinBatch might be worth considering. However, if you're new to programming, it's essential to weigh the trade-offs against more widely-used languages and frameworks that can provide greater flexibility and career opportunities.

then change the prompt to:
 prompt = "Show Winbatch language code to list all running processes?"

and get

Here is an example of how you can use WinBatch (a scripting language for automating Windows) to list all running processes:
```winbatch
Procedure ListRunningProcesses()
  Local p,h,procName,procID

  ; Get a handle to the process list
  h = OpenProcess(0x0400, FALSE, 0)

  If h > 0 Then
    ; Read the number of processes
    ReadProcessMemory(h, 0, @p, 4)
    procCount = Dword(p)

    ; Loop through each process and print its details
    For i = 1 To procCount
      ; Get a pointer to the current process structure
      ReadProcessMemory(h, (i * 28) + 0, @p, 28)

      ; Extract the process name and ID
      procName = StrPtr(Ptr(p+4))
      procID   = Dword(Ptr(p+12))

      ; Print the process details
      MsgBox "Process #", i, ":", procName, procID

    Next i
  EndIf

  ; Clean up
  CloseHandle(h)
End Procedure

ListRunningProcesses()
```
Let me explain what's happening in this code:

1. We open a handle to the `NTDLL.DLL` module using `OpenProcess`, which allows us to access the process list.
2. We read the number of processes from the process list using `ReadProcessMemory`.
3. We loop through each process, reading its details (name and ID) into memory using `ReadProcessMemory`.
4. We extract the process name by converting a pointer to a string using `StrPtr` and `Ptr`.
5. We display each process's details in a message box.
6. Finally, we clean up by closing the handle to the process list.

Note: This code assumes that you have administrative privileges on your system, as it requires access to kernel-mode memory. If you don't have admin rights, this code may not work.

Also, keep in mind that listing all running processes can be a resource-intensive operation and may take some time to complete.

Stan - formerly stanl [ex-Pundit]

td

Quote from: spl on May 03, 2026, 08:16:39 AMFigured before I got dis'd from the previous tangent I posted on a previous thread, did want to note a local LLM could add as a bot. Now the simple PS script below implements a request. I think in WB 64-bit could be done with my STD-OUT function, or re-written as HTTP Request. It does require Ollama to be installed [ setup: from https://ollama.com/download],


then change the prompt to:
 prompt = "Show Winbatch language code to list all running processes?"
...

Played with Ollama and local models myself. I like to run local models to sidestep techbro surveillance. I run the server as a sub-process of a WinBatch service for convenience. Which model or models have you used?
"No one who sees a peregrine falcon fly can ever forget the beauty and thrill of that flight."
  - Dr. Tom Cade

td

Should add that there is another approach to all this. I have a WIL extender that loads models directly into process memory without the Ollama server and lets me tweak model parameters. It is interesting but not very useful for most users due to the hardware requirements. For example, I have a four terabyte hard drive to store models, but could use more system memory and should install CUDA drivers for the system's NVIDIA GPU.

And yes, some models do not know WIL at all, and others know just enough to cough up usable scripts for a few simple tasks.
"No one who sees a peregrine falcon fly can ever forget the beauty and thrill of that flight."
  - Dr. Tom Cade

spl

Thanks Tony. It is fun to do with PS, but WB might be able to give it a go for the curious, so I translated the PS => WB. The script header notes the requirements; as written it will fail because the Body elements are passed as type string. I suspect the body may need to be built as an array or map, but I'm not sure. Feel free to play, fix, or improve:
;TEST prompt with Local LLM
;Requires Ollama server be installed
;once installed run 'ollama pull llama3.1' from terminal to ensure model
;should error that it cannot marshal the Body element
;may need to be created as a map or array
;best run under WB 64-bit
;Stan Littlefield 5/4/2026
;========================================================================================================================
objHttp=CreateObject("Msxml2.XMLHTTP.6.0") ;or use "WinHttp.WinHttpRequest.5.1"
URL = "http://localhost:11434/api/generate"  
objHttp.open("POST", URL, @FALSE) ;@FALSE = synchronous; the string "False" may coerce to async
objHttp.SetRequestHeader("Accept", "application/json")
objHttp.SetRequestHeader("Content-Type", "application/json")
Body = `{"model":"llama3.1","prompt":["Show Winbatch language code to list all running processes"]}`
objHttp.Send(body)
while objHttp.readyState != 4 
   objHttp.WaitForResponse(1000)
endwhile
;Terminate(objHttp.Status != 200,"LLM Prompt", "Request failed")

strResponse=objHttp.responseText                  
Message("Response",strResponse)  ;should show error
objHttp=0
;FilePutW("C:\temp\strResponse.txt", strResponse)
exit
;========================================================================================================================
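Picking up on the header note above: my guess, unverified against Ollama's docs, is that the marshalling error comes from the prompt being wrapped in a JSON array, since /api/generate documents "prompt" as a string. A quick Python sketch of the posted body shape versus the likely-correct one:

```python
import json

# The Body string as posted: the prompt is wrapped in a JSON array.
bad_body = '{"model":"llama3.1","prompt":["Show Winbatch language code to list all running processes"]}'

# Hypothetical fix (my assumption -- verify against Ollama's docs):
# "prompt" as a plain string, and "stream": false so the server returns
# one complete JSON object instead of NDJSON chunks.
good_body = json.dumps({
    "model": "llama3.1",
    "prompt": "Show Winbatch language code to list all running processes",
    "stream": False,
})
```

In the WB script that would mean building Body without the square brackets around the prompt text.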
Stan - formerly stanl [ex-Pundit]

td

I have changed my system to use llama.cpp's llama-server.exe. I think that Ollama is built on the llama.cpp open source code base. Anyway, if I make a small change to your script so the URL conforms to llama's REST API, I get a server response. Unfortunately, the model I selected does not know WinBatch syntax, so it returns an example in Java...
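For anyone switching between the two servers: llama-server's native endpoint is not the same as Ollama's /api/generate. As I recall (verify against the llama.cpp server README for your build; 8080 is only the default port), a minimal request looks roughly like:

```
POST http://localhost:8080/completion
{"prompt": "Show Winbatch language code to list all running processes", "n_predict": 256}
```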
"No one who sees a peregrine falcon fly can ever forget the beauty and thrill of that flight."
  - Dr. Tom Cade

spl

I played around with some prompts containing 'Winbatch' or the misspelling 'Wnbatch'. The llama3.1 model would return Python or AutoIt code. Added the Gemma4 manifest and my Surface Pro ran out of disk space. I'm thinking about trying the WB code as a CLR HttpWebRequest and using the streaming option with GetBytes() to pass along the body... but I've put that on hold.
Stan - formerly stanl [ex-Pundit]

td

So many models and so little time. When I encounter an open model that has some understanding of WinBatch's WIL, I will post the name here.
"No one who sees a peregrine falcon fly can ever forget the beauty and thrill of that flight."
  - Dr. Tom Cade
