Working at a company that uses react-native, I wish for nothing more than the end of app stores and differing platform languages.
We're heavily considering just having a website next year, with a mobile app using a webview, plus native code for native notifications, GPS, and HealthKit / Health Connect.
I feel like AI is changing the equation; it's nearly better to write your business UI three times, once for each platform.
I did this and never looked back.
It’s called a “WebView app” and you can get a really good experience on all platforms using them. Just:
- don’t make any crazy decisions on your fundamental UI components, like breadcrumbs, select dropdowns, etc
- add a few platform-specific specialisations to those same components, to make them feel a bit more familiar, such as button styling, or using a self-simplifying back-stack on Android (a rough sketch follows this list)
- test to make sure your webview matches the native browser’s behaviour where it matters. For example, sliding up the view when the keyboard is opened on mobile, navigating back & forth with edge-swipes on iOS, etc
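Not the original commenter's code, but a rough sketch of what those per-platform specialisations might look like (the class names and the back-stack heuristic are my own assumptions):

    // Hypothetical sketch: shared components with small per-platform tweaks.
    type Platform = 'ios' | 'android' | 'other';

    function detectPlatform(): Platform {
      const ua = navigator.userAgent;
      if (/iPhone|iPad|iPod/.test(ua)) return 'ios';
      if (/Android/.test(ua)) return 'android';
      return 'other';
    }

    // Same button component everywhere, with a platform-flavoured style on top.
    export function buttonClass(): string {
      switch (detectPlatform()) {
        case 'ios':     return 'btn btn--ios';      // e.g. rounded, no ripple
        case 'android': return 'btn btn--android';  // e.g. material-style ripple
        default:        return 'btn';
      }
    }

    // One reading of a "self-simplifying back-stack": on Android, navigating to
    // a screen already in the stack unwinds to it instead of pushing a duplicate.
    const stack: string[] = [];

    export function navigate(screen: string): void {
      const existing = stack.indexOf(screen);
      if (detectPlatform() === 'android' && existing !== -1) {
        const steps = stack.length - 1 - existing;
        stack.length = existing + 1;         // drop everything above the existing entry
        if (steps > 0) history.go(-steps);   // unwind the browser history to match
      } else {
        stack.push(screen);
        history.pushState(null, '', screen);
      }
    }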
I also went the extra step and got service workers working, for a basic offline experience, and added a native auto network diagnostic tool that runs on app startup and checks “Can reach local network” “Can reach internet (1.1.1.1)” “Can resolve our app’s domain” etc etc, so users can share where it failed to get quicker support. But this is an app for small-to-medium businesses, not consumer-facing, and the HTML5 part is served from the server and cached. I haven’t thought much about what you might need to do additionally for a consumer app, or a local-first app.
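For what it's worth, a rough sketch of that kind of startup check written as plain fetch probes (the real tool described above is native; the URLs and names here are placeholders):

    // Hypothetical startup connectivity diagnostic. Each probe is a fetch with a
    // timeout, and it stops at the first failure so users can report where it broke.
    type Probe = { name: string; url: string };

    const probes: Probe[] = [
      { name: 'Can reach local network',       url: 'http://192.168.1.1/' },           // placeholder gateway
      { name: 'Can reach internet (1.1.1.1)',  url: 'https://1.1.1.1/' },
      { name: "Can resolve our app's domain",  url: 'https://app.example.com/health' }  // placeholder domain
    ];

    async function canReach(url: string, timeoutMs = 4000): Promise<boolean> {
      const controller = new AbortController();
      const timer = setTimeout(() => controller.abort(), timeoutMs);
      try {
        // 'no-cors' because we only care about reachability, not the response body.
        await fetch(url, { mode: 'no-cors', signal: controller.signal });
        return true;
      } catch {
        return false;
      } finally {
        clearTimeout(timer);
      }
    }

    export async function runStartupDiagnostics(): Promise<string[]> {
      const results: string[] = [];
      for (const probe of probes) {
        const ok = await canReach(probe.url);
        results.push(`${probe.name}: ${ok ? 'OK' : 'FAILED'}`);
        if (!ok) break;
      }
      return results;
    }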
I have never once experienced a WebView app that I would say had “a really good experience.”
It’s because if a webview app experience is good, you don’t notice it, you only notice if it’s bad.
A while ago I saw a blog link on HN that explained how Apple uses webviews everywhere and we never notice because they are done well. Of course I can’t find that link now, I summon the HN gods…
Maybe this https://blog.jim-nielsen.com/2022/inspecting-web-views-in-ma... and https://news.ycombinator.com/item?id=30648424?
On mobile the webview app experience is crap and it's immediately obvious that an app is not native. Simply nobody asks customers how they like it. The management assumes that as long as nobody complains and the users don't leave in droves, the experience must be impeccable.
> It’s because if a webview app experience is good, you don’t notice it, you only notice if it’s bad.
Aside from Apple’s apps (which imo are noticeably worse than the old ones, but that’s beside the point), what are some good WebView apps on iOS right now?
Somebody scraped the Play Store and checked the frameworks, so here is a list of Android WebView apps built with Capacitor: https://capgo.app/top_capacitor_app/ Maybe an equivalent exists for iOS...
Is this what you mean? https://news.ycombinator.com/item?id=45250759
(In the context of "Apple has a private CSS property to add Liquid Glass effects to web content")
Yes, thank you!
> It stands to reason that Apple wouldn't have developed this feature [liquid glass css property] if they weren't using it. Where? We have no idea. But they must be using it somewhere. The fact that none of us have noticed exactly where suggests that we're interacting with webviews in our daily use of iOS without ever even realising it.
There's some jump from _a property exists_ to _it must be used_, but a massive one from _a property exists_ to _Apple uses it everywhere and we never notice it because they are done well_.
Can you give examples of good webview apps on iOS?
I've done it before on a personal project and I was pretty obsessed with user experience. For example, I changed the way buttons work (because with Cordova they were natively links, which trigger upon tap, not on "finger lift" like native buttons). Also, I implemented some gestures, e.g. to switch between pages (tab-style navigation). While not really in line with the system UI (that wasn't my goal), I think the usability is quite decent.
In case you're interested, the app is named "QuickÖV" - not relevant to anyone outside Switzerland, but just for trying it out: https://play.google.com/store/apps/details?id=com.billhillap...
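A minimal sketch of that "activate on finger lift" behaviour, for anyone curious (my own illustration, not the code from that app):

    // Make an element activate on release, like a native button: only fire if
    // the pointer is lifted over the element without having dragged away.
    export function nativeLikeButton(el: HTMLElement, onActivate: () => void): void {
      let startY = 0;
      let tracking = false;

      el.addEventListener('pointerdown', (e) => {
        tracking = true;
        startY = e.clientY;
        el.classList.add('pressed');
      });

      el.addEventListener('pointerup', (e) => {
        el.classList.remove('pressed');
        if (!tracking) return;
        tracking = false;
        // Treat a large vertical move as a scroll, not a press.
        if (Math.abs(e.clientY - startY) < 10) onActivate();
      });

      // Cancel the press if the gesture is interrupted or leaves the element.
      for (const type of ['pointercancel', 'pointerleave'] as const) {
        el.addEventListener(type, () => {
          tracking = false;
          el.classList.remove('pressed');
        });
      }
    }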
I made a (hobby) project that utilized this strategy (Flutter + wrapped webview app), and it honestly seems like the way to go for my needs.
Do you use some framework for the "WebView app"? Like Tauri, etc.? Or is everything coded from scratch?
Works until you need complex native code for things like automatic image capture assisted by a bounding model.
There is no reason you can't do that via web. Image capture in a canvas gives you access to the raw image pixmap data.
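For example, a bare-bones frame grab with standard web APIs looks roughly like this (a sketch, not tied to the parent commenter's exact use case):

    // Grab one frame from the camera and read its raw RGBA pixels via a canvas.
    export async function captureFrame(): Promise<ImageData> {
      const stream = await navigator.mediaDevices.getUserMedia({ video: true });
      const video = document.createElement('video');
      video.muted = true;          // required for autoplay on some mobile browsers
      video.srcObject = stream;
      await video.play();

      const canvas = document.createElement('canvas');
      canvas.width = video.videoWidth;
      canvas.height = video.videoHeight;
      const ctx = canvas.getContext('2d')!;
      ctx.drawImage(video, 0, 0);

      stream.getTracks().forEach((t) => t.stop());   // release the camera
      // ImageData.data is the raw pixmap: a Uint8ClampedArray of RGBA bytes,
      // which could then be fed to a bounding/detection model.
      return ctx.getImageData(0, 0, canvas.width, canvas.height);
    }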
What is your app? Would love to try it out to get a feel for the experience.
Perhaps you mean PWAs and not WebView apps? WebView apps suck big-time.
At least now I know who the offending devs are.
https://developer.android.com/develop/ui/views/layout/webapp...
I try this every decade. Love the first few months for speed. Then I end up paying the price later when I want to integrate "new OS feature X" or make a system gesture/style/animation feel native.
Lack of swipe for back on iOS is usually the easiest way to tell I'm looking at a web view.
But it's been about a decade so I'm due...
I recently downloaded the Moodle app and was surprised to find it's powered by Ionic and a webview, which I only realized due to CSS misconfigurations that caused the app to fall back to Serif font for CJK glyphs.
Recent mid-tier phones are powerful enough that webview has a negligible impact on performance.
CapacitorJS makes for a REALLY awesome app dev experience compared with React Native. It's basically just a really well integrated system for building exactly what you describe. The company I work at made the switch from an RN app to a CJS one and it was night and day in so many ways, performance included!
I personally have a preference for Apple's native frameworks. From a purely engineering standpoint, they're very well thought out and have very clear separations of concerns. Spending my time with their libraries helped me write good, scalable code for platforms beyond their own.
That said, platform lock-in is bad for business because it makes operations dependent on a single provider, but I have no delusions that a web front-end is better.
From an engineering standpoint, front-end web frameworks are less complete and require too many third-party libraries and too much tooling to assemble. From a UX standpoint, it's actually worse--almost every website you visit today spams you upfront with Google sign-in and invasive cookie permission requests that you can't refuse. But never mind that--from a purely business standpoint, a single platform accessible anywhere saves costs. Most importantly, however, the web is a "safe space" for deploying software anti-patterns without an intermediary entity (i.e. an app store) to police your code, so you can do whatever the heck you want.
I'd wish for nothing more than the end of web and app front-ends in favor of purely structured data derived from natural-language prompts by users. The more realistic outlook, however, is that the front-end layer is such a high level of abstraction, with such a low barrier to entry, that its tech stack will be in constant flux, favoring whoever is currently the best-financed entity seeking the most market share, the most developer mind-share, and the most behavioral control over its users.
We shipped this last year. Best decision ever.
I saw another comment calling it "webview app", which is also valid, but we call it "hybrid app".
Using apps made with Electron or those so-called "universal" UI frameworks, I wish for nothing more than for everything to be native.
They always have to give up some basic or hidden conveniences that native controls get for free, and they always feel slightly different in a weird way, which induces a constant vague annoyance when using them, like walking with a little pebble in your shoe, or sitting in a chair that doesn't feel right.
It's "funny" how even after 50 years UI still isn't "solved". Before writing a universal API, we first need universal consensus, or at least some kind of authority, on how controls should behave.
That. And specifically, fuck Apple and their prohibition on JITs.
We have a React Native app that shares some code with a webapp, and that needs to do some geometry processing. So we're constantly playing the game of "will it interpret quick enough". Everything works fine in browsers, but in an RN app it often slows down to unusable speeds.
You could just use Kotlin Multiplatform and Multiplatform Compose instead.
Then why bother making an app? The user can access the web app using the browser.
We (they) wanted "easy accessibility" and "constantly seeing this app's icon" bs.
It’s a bad idea to put the fox (front-end developers) to guard the henhouse (great, consistent user experiences).
lol, yeah what do front ender devs know about UX. SMH…
Wholeheartedly agree.
If you are curious how components' state is handled, they employed the React class components method: https://github.com/Snapchat/Valdi/blob/main/docs/docs/core-s...

    // Import the StatefulComponent
    import { StatefulComponent } from 'valdi_core/src/Component';

    // ViewModel + State interfaces for the component
    export interface TimerViewModel { loop: number }
    interface TimerState { elapsed: number }

    // Component class
    export class Timer extends StatefulComponent<TimerViewModel, TimerState> {
      // Initialize the state
      state = { elapsed: 0 };

      // Handle to the periodic timer started in onCreate
      private interval?: number;

      // When the component is created, start a setInterval that updates the
      // state once a second, incrementing the `elapsed` value.
      onCreate() {
        this.interval = setInterval(() => {
          // Increment the state to trigger a re-render periodically
          const elapsed = this.state.elapsed;
          const loop = this.viewModel.loop;
          this.setState({ elapsed: (elapsed + 1) % loop });
        }, 1000);
      }

      // When the component is removed, make sure to clean up the interval
      onDestroy() {
        if (this.interval) clearInterval(this.interval);
      }

      // Rendered visuals depend on both the state and the view model
      onRender() {
        <view padding={30} backgroundColor='lightblue'>
          <label value={`Time Elapsed: ${this.state.elapsed} seconds`} />
          <label value={`Time Looping every: ${this.viewModel.loop} seconds`} />
        </view>;
      }
    }

I was at Snap during this project’s early days (Screenshop!) and spent a bit of time debugging some stuff directly with Simon. He’s a wonderful engineer and I’m thrilled to see this project out in the open.

Congratulations Snap team! Well deserved.
I'm surprised Snap of all companies invested in a cross-platform UI framework given how simple their app seems in comparison to more complex ones out there.
And more importantly, Snapchat seems like an app which could highly benefit from tight integration with native features (eg. camera, AR features, notifications, screenshot detection, etc.)
Perhaps for the same reason we got Airbnb of all companies to create Lottie. https://lottie.airbnb.tech/#/
These companies have super talented engineers and can afford to invest in skunkworks projects like these when they can’t find any suitable options in the market.
More like "super talented engineers" will keep increasing cost of such projects until they get told this stuff doesn't generate revenue/cost cutting is happening, so work more on ad tech, leave or find external free labor("community") to maintain it.
As far as I'm concerned, Snapchat is one of the most complicated apps that's routinely used by hundreds of millions of people. You yourself listed all the features they have. And every one of them is pixel perfect, with insane amounts of time spent perfecting the user experience of every single one of those features. In fact, the success of Snap could be attributed to how pixel perfect the app is.
And then you call it simple?
Same! I've worked with Simon on this and tried (and failed) to port it to web. Truly a smart guy - and congratulations to the rest of the team!
Definitely one of the cooler projects to watch while I was there. I recall the goal was to open-source it from early on, so I'm glad to see it come to fruition!
Would you use this framework for a project today?
“Composer” ;)
I’m not sure I trust snap of all companies to make a good cross platform framework after how terrible their android app has been.
I think it’s been changed since, but wow was it weird finding out that instead of taking photos, the Android app used to essentially take a screenshot of the camera view.
I worked on the camera in Instagram iOS for a while. There at least, there could be a 5,000ms latency delta between the “screen preview” and the actual full quality image asset from the camera DSP in the SOC.
I don’t know a thing about Android camera SDK but I can easily see how this choice was the right balance for performance and quality at the time on old hardware (I’m thinking 2013 or so).
Users didn’t want the full quality at all, they’d never zoom. Zero latency would be far more important for fueling the viral flywheel.
> Users didn’t want the full quality at all, they’d never zoom.
Dating apps use awful quality versions of the photos you upload too. Seems to be good enough for most people.
I worked on the Snapchat Android app back in 2017. It's only weird for people who have never had to work with cameras on Android :) Google's done their best to wrangle things with CameraX, but there's basically a bajillion phones out there with different performance and quality characteristics. And Snap is (rightfully) hyper-fixated on the ability to open the app and take a picture as quickly as possible. The trade-off they made was a reasonable one at the time.
Things have improved since then, but as I understand it, the technical reason was that only the camera viewfinder API used to be universal across devices. Every manufacturer implemented their cameras differently, so developers had to write per-model camera handling to take high-quality photos and video.
:) this is exactly how we used to do it even on iOS, back in the days before the camera APIs were made public, but Steve Jobs personally allowed such apps to be published in the iOS App Store (end of 2009) ...
> Valdi is a cross-platform UI framework that delivers native performance without sacrificing developer velocity. Write your UI once in declarative TypeScript, and it compiles directly to native views on iOS, Android, and macOS—no web views, no JavaScript bridges.
“We’ve got both kinds. Country and western!”
Favourite Blues Brothers quote!
I often wonder how the economics are justified in making in house frameworks. What is it about Snapchat that requires a new framework but not other apps?
This is so cool! I'm a React-Native developer, and I'm glad to see more options like this coming into existence.
This looks promising. I would love to see more examples of what this can do, along with screenshots. As is, there is a single Hello World and the components library is “coming soon”. But if it can deliver what it promises, that would be pretty cool. React Native is, I think, the most popular framework in this space.
I wish the native iOS part was written in Swift rather than Objective-C like RN.
Why though? You aren’t interacting with it. What difference does it make?
If the framework gets used, eventually there will be 3rd-party libs adding new features (off the top of my head: maps), and someone will need to write the bridging with the native SDK. That means the bridge will most likely need to be written in Objective-C instead of Swift.
How are you not interacting with it? It’s a UI library, no?
You are using the components, not interacting with that 3rd party code. Unless you are debugging/contributing back
It's hard to imagine not going fully native in the modern day with coding agents. Most of the code can just be clanked out.
If your entire company is all about making a single good app, I doubt AI changes the architecture much. It's still an architectural decision whether you're going to have a single codebase for your app and focus on a framework to transpile it. Either way you need actual teams of experts on iOS, Android, etc.; no single person is going to master all of them.
So this is like all those other frameworks that compile to native components, except this one is natively Typescript?
I’ll take it
I think? there isn't a typescript runtime? just a build time? I'm not positive how business logic gets executed but:
> it compiles directly to native views
One of Valdi's authors here. It's using native views under the hood, like React Native, and there are 3 modes of compilation/execution for the TS source. It can be interpreted from JS (TS compiled to minified JS source), interpreted from JS bytecode (TS compiled to JS source, minified, then compiled to JS bytecode ahead of time), or compiled to native code directly (TS compiled to C ahead of time).
An AOT TS -> C compiler is fantastic - how much of the language is supported, what are the limitations on TS support? I assume highly dynamic stuff and eval is out-of-scope?
Most of the TS language is supported, things that are not can be considered bugs that we need to fix. Eval is supported but it won't be able to capture variables outside of the eval string itself. We took a reverse approach than most other TS to native compiler projects: we wanted the compiler to be as compatible with JS as possible, at the expense of reducing performance initially, to make it possible to adopt the native compiler incrementally at scale.
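If I'm reading that right, the eval limitation would look roughly like this (my own illustration, not an example from the Valdi docs):

    // Self-contained eval: everything it needs is inside the string, so this
    // should still work when the TS is compiled ahead of time.
    const a = eval('const x = 2; x + 3');   // 5

    // Eval that captures an outer variable: fine in a regular JS engine, but per
    // the comment above, `y` would not be visible to the eval'd string under the
    // native compiler.
    const y = 10;
    const b = eval('y + 1');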
There are significant trade-offs with this compiler at the moment: it uses much more binary size than minified JS or JS bytecode, and the performance improvement ranges from 2x down to sometimes zero. It's a work in progress; it's pretty far along in what it supports, but its value proposition is not yet where it needs to be.
So now I can finally implement the most god-awful, ugly, cumbersome and unintuitive GUI methodology ever to face a large population of users into my own apps? This abomination that started the whole user-experience decline by making this kind of yuck the gold standard for apps today is finally open source?
Color me yellow.
I hope it has "load spam ads directly into the list the user was about to touch somehow the millisecond before they touch it using magical force field technology so they click the wrong thing every time" functionality. I've been missing that in my apps
Now offering 4 swipe directions!
With instant, subpixel precision!
Snap is the greatest innovator of user experiences in this generation. This is evidenced by the fact that literally every other social media app is just a hodgepodge copycat of sorts of what Snap invented. For people who introduced themselves to tech with Snap as one of their first apps, it's the most intuitive thing ever.
When they first introduced video calls, schools had to close for a day.
Imagine then you come here and see someone calling it awful. Can't help but think it's just an instance of "old man yelling at clouds".
Don't take my argument as one against the point that it was massively popular and influential; au contraire, that's often the common factor in things I tend to dislike: the norm / the mainstream.
It's mostly subjective when talking UI, and a lot of it also comes down to the lack of competitive platforms with different UIs.
> For people who introduced themselves to tech with snap as one of their first apps, its the most intuitive thing ever
By definition, the first app someone uses will be from their POV the most intuitive app ever. It will also be the least intuitive app ever.
All valid points. Made me actually think about the likely first actions an alien or newborn would take with a 2025 touch screen device. Chomp. Chew. Swipe.
xD
God forbid someone try something different. The app isn’t really made for people that only know how to doom scroll.
Not related to this, but abandoning Key DB was the worst thing they could do.
Rename it Snapp
Not to troll, but do you need such shims in the era of LLMs?
Yes? Dear lord I want determinism
By LLMs you surely mean tech-debt generators, right?