I'm specifically asking about software system design tasks like:

- Designing backend architectures
- Tradeoff analysis (DB, queues, caching, others)
- Infra diagrams
- Documentation

My current pick would be Claude Opus 4.6, because I've found it strong at structured reasoning and long context

But I'm curious what others are using today

raw_anon_1111 17 hours ago | parent | on: 47753223
None. I work in consulting specializing in AWS Architecture + app dev. All of my projects over the last 6 years have been green field - empty AWS account + empty git repo.

I’ve been using Codex and Claude Code over the past 6 months. They both do pretty well creating well-structured, normalized database schemas, both for RDBMS databases and for single-table designs with appropriate GSIs, given enough business context. But they both produce suboptimal architecture and ETL designs.
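For reference, the single-table-plus-GSI pattern the models tend to get right looks something like this (the entity names, key patterns, and GSI here are all hypothetical, just to illustrate the shape):

```python
# Hypothetical DynamoDB single-table design for a customers/orders domain:
# one table with generic PK/SK, plus one GSI to query orders by status.
customer = {
    "PK": "CUSTOMER#c-123",
    "SK": "PROFILE",
    "name": "Ada",
}
order = {
    "PK": "CUSTOMER#c-123",          # orders live under their customer partition
    "SK": "ORDER#2024-06-01#o-789",  # sort key encodes date for range queries
    "GSI1PK": "STATUS#SHIPPED",      # GSI partition key: all shipped orders
    "GSI1SK": "2024-06-01",          # GSI sort key: ordered by date
    "total": 42.50,
}

# A Query on PK="CUSTOMER#c-123" with SK begins_with("ORDER#") returns that
# customer's orders; the GSI answers "all orders with status X, by date".
```

The hard part (and where "enough business context" matters) is choosing key patterns that cover every access pattern up front, since you can't efficiently query what you didn't encode.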

I know AWS very well (trust me). But I would never go in blind, for instance doing a greenfield implementation on Azure, trusting AI.

sminchev 23 hours ago | parent | on: 47753223
Opus is OK, but I would also suggest using frameworks like BMAD. It has really nice agents that can help with brainstorming sessions, UI/UX, security, architecture, documentation, and everything needed so that the system has a good basis before development.

It is expensive, I know. I mean, it takes a lot of tokens and the $20 plan is just not enough, but the price is worth it, in my opinion. I did it with my project.

I tried Gemini, GLM, and Sonnet, and each of them has its strengths for specific things, but for initial architecture: Opus and BMAD.

verdverm 1 day ago | parent | on: 47753223
Actual intelligence. No models are capable of such high-level design, because they cannot handle the intricacies of the details.
kevin00001011 18 hours ago | parent | on: 47753901
I'd be interested in the evidence for this. In some respects I guess it depends on your framing: versus ideal? Or versus humans?
raw_anon_1111 16 hours ago | parent | on: 47759707
One anecdote: I was working on a feature for an internal web app where the user could upload a 60K-line CSV file to a website; it was saved in S3, and then a Lambda loaded it into a Postgres database.

The naive way to do it is to just submit the file to the API; the API saves it to S3 and then does bulk INSERTs.
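Sketched in Python, the naive path looks roughly like this (table name and batch size are illustrative, and the actual statement execution is omitted since it needs a live connection; a real implementation would also use parameterized queries rather than string interpolation):

```python
import csv
import io

def batched_insert_sql(csv_text, table="uploads", batch_size=1000):
    """Yield multi-row INSERT statements for the CSV, one per batch.

    A Lambda would execute each statement against Postgres (e.g. via
    psycopg2). At ~60K rows this means many statements and round trips,
    which is where the time goes.
    """
    rows = list(csv.reader(io.StringIO(csv_text)))
    header, data = rows[0], rows[1:]
    cols = ", ".join(header)
    for i in range(0, len(data), batch_size):
        batch = data[i:i + batch_size]
        values = ", ".join(
            "(" + ", ".join("'%s'" % v.replace("'", "''") for v in row) + ")"
            for row in batch
        )
        yield f"INSERT INTO {table} ({cols}) VALUES {values};"
```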

Claude did the first part right: create a pre-signed S3 URL, send it to the browser, and the file is uploaded directly to S3.

But it did the second part incorrectly - a bulk INSERT after getting the file from S3.

The correct way was to use the AWS extension that lets you import the file directly from S3 into a table. The difference is 40 minutes vs 2 minutes. Of course, Lambda times out at 15.
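That extension is `aws_s3` on RDS/Aurora Postgres: `aws_s3.table_import_from_s3` does a server-side COPY straight from the bucket (after `CREATE EXTENSION aws_s3 CASCADE;`). A sketch of the call the Lambda would issue, with table, bucket, and region as placeholders:

```python
def import_from_s3_sql(table, bucket, key, region="us-east-1"):
    """Build the aws_s3.table_import_from_s3 call for RDS Postgres.

    The database pulls the object server-side and loads it in one COPY,
    instead of the Lambda streaming thousands of INSERT round trips.
    An empty column list means "all columns, in table order".
    """
    return (
        f"SELECT aws_s3.table_import_from_s3("
        f"'{table}', '', '(format csv, header true)', "
        f"aws_commons.create_s3_uri('{bucket}', '{key}', '{region}'));"
    )

sql = import_from_s3_sql("uploads", "example-uploads", "incoming/report.csv")
```

The Lambda then just runs that one statement (via psycopg2 or similar), which is why the import fits comfortably inside the 15-minute limit.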

icedchai 56 minutes ago | parent | on: 47760196
If it took 40 minutes to insert a 60K-line CSV into native Postgres using bulk inserts, there is something seriously wrong with the code.
Isn’t that what I just said?
verdverm 3 hours ago | parent | on: 47759707
It is well known that the models/agents are only so good. Not sure what kind of evidence you're looking for, but system design is among the more challenging tasks for humans. Machines are definitely not up to the task.