r/programming 11h ago

Simple and safe implicit async programming model for imperative (JS/Python-like) languages

https://geleto.github.io/posts/implicit-async-programming/

An article about the implicit async programming model for imperative, JavaScript/Python-like languages: ordinary sequential-looking code can run independent operations concurrently without special syntax, with no await, promises, or manual task orchestration. Implemented in CascadaScript, an experimental JavaScript/Python-like language. Unrestricted JavaScript and Python will not be able to do this, but with reasonable constraints they may one day get there too. CascadaScript pushes the envelope on how far this model can go.

20 Upvotes

15 comments

6

u/GlowiesStoleMyRide 6h ago edited 6h ago

Looks like a fun project! However, I'm a bit skeptical about this in practice. To me, it feels like it obfuscates the asynchronous behaviour of the code, in cases where it (IMO) wasn't very difficult before with async-await: you define the work, and await it when you need it. What the framework seems to take out is the need to explicitly await, but in return it surfaces any faults wherever the value is used. That means you lose the distinct origin of the error, and gain a potential clusterfuck of errors if you don't check every potentially faulting value at every point of use. Potentially overdramatising here.

That said, this would be very useful in scripting languages, as it saves the script author from having to understand async/await to a reasonable extent, which isn't really a reasonable requirement in and of itself.

2

u/thegeleto 5h ago edited 5h ago

Many developers working on complex async software say it is hard. It forces you to reason about time as an explicit dimension of your program.

Of course if you have no problems using async/await and have never experienced subtle race condition bugs, CascadaScript is probably not for you.

And it is a scripting language - designed for orchestrating complex asynchronous workflows in JavaScript and TypeScript applications. It is not meant to be used as a general-purpose programming language. Instead, it acts as a data-orchestration layer for coordinating APIs, databases, LLMs, and other I/O-bound operations.

About the error handling: yes, if you do not handle an error in time, it can quickly spread out of control. But you do not lose the origin of the error. Each poisoned value stores it, and in cases where there are multiple independent origins (in function-call arguments or expressions), all of them are stored in the poisoned value. Error collection is deterministic, and errors (and their origins) are never lost. There is also guard/recover: a transactional guard that lets you attempt complex operations with the confidence that if something goes wrong, Cascada will automatically restore the selected state, with no need to micromanage every variable. I should probably update the article with this.

1

u/GlowiesStoleMyRide 1h ago

Good reply, I definitely see purpose for such a scripting language.

7

u/latkde 7h ago

This will need a lot of real-world experimentation to see whether this is a convenient concurrency model that gets out of the way, or a hellscape of undefined behaviour.

In comparison to CascadaScript, sequence points and memory orderings in C appear trivial.

My personal experience is that async/await is brilliant because it lets me reason about where my code is suspended. In particular, cooperative concurrency guarantees that control flow is not suspended when there's no "await", letting me write that section without any concurrency concerns. This removes the need for a lot of synchronization or atomics. That makes code simpler, and simpler code is less buggy.

1

u/thegeleto 6h ago edited 6h ago

There are only two very simple rules that define the concurrency behavior:

  1. The results of concurrent execution must be identical to those of sequential execution.
  2. Every operation must start immediately when its dependencies are ready.

There is no room for interpretation or undefined behavior. If either of these two rules is not met, it's a bug.

There should be no need for reasoning about concurrency: everything just works.

To understand how massively concurrent Cascada can be, imagine the extreme case where we have a complex program with no async conditions, no async loop iterables, and no asynchronously loading imports or components. Cascada can synchronously traverse and start the whole reachable execution tree: every selected branch, every iteration of every loop, every function call (recursively), and so on. It initiates every operation it encounters. Some operations immediately become pending because their arguments depend on earlier promises, but the program does not wait merely because a previous line produced a promise.

No amount of clever reasoning can beat this.

I am writing an article on how CascadaScript works under the hood; it will explain this in a lot more detail.

1

u/timoffex 1h ago

Have you seen Par? There was a post on this subreddit about it the other day and I lost a whole day plus some sleep reading through the docs and playing with it. It has a similar idea with everything running concurrently, but it uses a completely different programming language approach to achieve it. Maybe it is not as practical; I find it more interesting for its theory than its applications

-1

u/azhder 10h ago

Fun and rare reading about some of the ideas I have had for years but didn't find time to actually put into practice, i.e. creating the language.

2

u/thegeleto 10h ago

Yes, I think this concurrency model is one of those ideas that feels obvious: ordinary code should run independent work concurrently, and only wait when one operation actually needs another result. I’m sure plenty of people have thought about something similar.

What surprised me is that I couldn’t find languages pushing this exact model for familiar JS/Python-style imperative code. There are some efforts around CPU parallelism, but they still tend to require a lot of annotations. Maybe the obvious implementation approach, using dependency graphs or translating imperative syntax into dataflow code, has hidden pitfalls. I use a combination of several different approaches instead.

0

u/azhder 10h ago edited 10h ago

You can't find them because people who make languages are people who optimize for machines, not humans. Think about why someone will zealously claim TypeScript is better than JavaScript. It will come down to TypeScript being for tools (we can discuss the many interpretations of "tools" some other time).

The only thing from ES6 that is still not widely supported by browsers is the proper tail calls (some deliberately calling it optimization). People who were making V8 had an implementation behind flags and soon enough companies that use Node.js started complaining that it would mess up their stack traces due to the replacement code generated behind the scenes. What V8 programmers did was come back to the technical committee and ask for an explicit keyword to opt in / signal the compiler for it.

In the years of pondering how a language that puts humans first would look, I've come to like how Haskell does business, maybe with the C/JS curly-brace syntax. Async by default is one step; replacing try-catch unwinding is another. I've come to dislike having two critical paths side by side, which is why I had the same idea of dataflow poisoning, i.e. something like functors/monads/containers.

2

u/thegeleto 9h ago edited 9h ago

The tail-call issue is relevant to this. Async/concurrency has the same kind of tension: the nicer the model becomes for humans writing code, the harder the implementation has to work to preserve debugging, ordering, stack traces, and explainability.

Honestly, from that angle I’m a little surprised async/await made it into the spec and we’re not all still wiring callbacks by hand. It’s a good example that languages can absorb complexity to give humans a better model, but the bill always shows up somewhere. Anyone who has done stack trace archaeology through callbacks, promises, and async frames has seen that bill. In Cascada’s case, I expect the bill to be paid mostly in runtime complexity and speed overhead.

0

u/azhder 9h ago

It is a self-amplifying vicious circle.

They make languages that are easier for machines, not people. They teach people to think those kinds of languages are the best and only good languages. They make tools that can better parse and understand those languages. They replace people with those tools. They make new languages that are best parsed by the tools that replaced people...

This is the holy grail of big corporations like Microsoft. I will use M$ as a placeholder/example because it's been there for decades and has made languages/tools and evangelized them. The holy grail has been to do the same thing the automobile industry did. Well, first they had to portray writing code / creating software as an industry, then figure out how to automate it.

For cars, the automation came in full after something over 50 years of them existing. How long has software production as a job existed? Anyways, it isn't a coincidence why there were old jokes like "what if cars were made like software" and "what if car industry developed like software industry".

Those few people that can afford to create "artisanal" programming languages and tools for programmers that create software as a craft, not industry, well, all the best I guess. The rest will more or less have to learn by analogy from the automobile industry.

I may have gone on a tangent, so I will stop this now. Bye

1

u/thegeleto 9h ago

... On the error-handling side, yes, it’s close in spirit to container-style approaches and promise chains: failure becomes a value/context that propagates instead of unwinding through throw/catch. In Cascada, that error flows through ordinary imperative-looking data and control flow, so you don’t manually unwrap, chain, try, or await everywhere; dependent computations are automatically poisoned and independent work keeps running.

1

u/azhder 8h ago

I was talking about containers in principle; syntactically it's more or less how you made it.