[-] mox@lemmy.sdf.org 53 points 1 year ago

Also:

  • Simple sites allow visitors to stay safe from browser exploits by keeping scripts disabled.
  • Simple sites pose very little threat of fingerprinting or other invasive tracking techniques.
  • Simple sites can look beautiful, with a bit of well-crafted CSS.
[-] Kushan@lemmy.world 16 points 1 year ago

I don't think your second point is correct. You can still embed analytics on a static website. I believe you're conflating it with your first point by assuming that scripts are disabled on the browser side, in which case it's a bit of a redundant point.

I also think it's a bit unrealistic in this day and age to run with scripts completely disabled. I know it sucks, but we need better ways of protecting our privacy and disabling all scripts is a bit of an extreme measure given so much of the modern web relies on it.

[-] mox@lemmy.sdf.org 4 points 1 year ago* (last edited 1 year ago)
  1. My first two points make a distinction between fingerprinting and more invasive attacks that JavaScript has enabled, including data exfiltration. You might not have encountered the latter, but that doesn't make them the same thing. (Also, the analytics you refer to that are possible without scripts are far less invasive than what scripts can do, as is hinted in my second point.)
  2. It's not unrealistic, since scripts can be turned off by default and enabled selectively when needed. (But were that not the case, it would be reason to use them less, not more.)
[-] thesystemisdown@lemmy.world 3 points 1 year ago* (last edited 1 year ago)

I think it's impossible if you want things to work. JavaScript is so ubiquitous it's been baked into browsers since 1995.

[-] pixxelkick@lemmy.world 38 points 1 year ago

I think the reason experienced devs tend to have minimalist websites that look like they're from the 90s is that software devs aren't UX experts.

At a senior level at large companies, someone else designs the look and produces the Figmas to make the site pretty. I don't do that shit.

I can do some basic stuff as a front end dev, but React has nothing to do with CSS animations and all the stuff you typically associate with a "pretty" website.

Reactive frameworks are just handy for updating the DOM on a mutable page (i.e. forms, WebSocket stuff, data in/out, pulling data from a DB).

Blogs tend to be statically generated, so there should be zero reason to use reactive frameworks anyway, unless you add something dynamic like a comment box folks can log in to and leave comments/likes/shares etc. Loading those comments will probably want a framework.

Aside from that, it's mostly CSS to do the fancy stuff.

[-] some_guy@lemmy.sdf.org 24 points 1 year ago
[-] simonced@lemmy.one 8 points 1 year ago

what the fuck else do you want?

Lol, maybe a max-width on body at least, so I don't lose my line when reading long lines.
It's said that narrower lines help readability for a reason.

Although, I agree, you don't need much to make a website that is functional.
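The max-width suggestion above is a one-rule fix. A minimal sketch (the 65ch figure is a common typographic rule of thumb, not a standard):

```css
body {
  max-width: 65ch;      /* roughly 65 characters per line */
  margin-inline: auto;  /* center the text column */
  padding-inline: 1rem; /* breathing room on narrow screens */
  line-height: 1.6;
}
```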

[-] Kolanaki@yiffit.net 3 points 1 year ago* (last edited 1 year ago)

And here I am remembering when I first learned HTML and had a website that used every single random feature HTML was capable of, and more once I discovered the joys of JavaScript.

It was horrible and glorious all at once.

[-] UNWILLING_PARTICIPANT@sh.itjust.works 22 points 1 year ago* (last edited 1 year ago)

Agree with the article (and the 10 other ones I've already read on the topic) but Paul Graham's website looks like ass on mobile as of 2024. I couldn't even figure out how to get to the content, at least on cursory examination.

Good point about solo/team or simple/scalable though. Right tool for the job and all that. Good stuff

[-] No1@aussie.zone 17 points 1 year ago* (last edited 1 year ago)

I'd rather take any website than be continuously forced to download apps, or be told to go to Facebook for some business's information.

There are 2 things a website should respect (simple, so do them more often), and not doing these will earn you my wrath:

  • You should be able to at least zoom/shrink text. Some websites have things so locked down, I can never read their teeny tiny text. Fuck you, ESPN. Why would you allow zoom on desktop, and block it on mobile, where my screen is smaller and I need it most? (I'll leave the original intent of the web, separating presentation from content, for another day.)
  • The browser Back button should take you back to the previous 'page'. I'm terrified to use it, because you're really showing multiple 'pages' on 1 real page, so who knows where I'll end up.
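For what it's worth, that mobile zoom lockout usually comes from a viewport meta tag. The first line below is the offending pattern (which browsers increasingly ignore for accessibility reasons); the second is the accessible alternative:

```html
<meta name="viewport" content="width=device-width, initial-scale=1, maximum-scale=1, user-scalable=no">
<meta name="viewport" content="width=device-width, initial-scale=1">
```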
[-] Phoenix3875@lemmy.world 16 points 1 year ago

Static websites can be beautiful and easy to use without being complex.

PG's blog and HN can definitely use some CSS tweaks. I can't remember how many times I clicked the wrong thing in HN.

On the other hand, it's easy to get reader mode/custom CSS/alt frontend working for such websites, so maybe it's alright after all.

[-] frezik@midwest.social 13 points 1 year ago

Hidden benefit: never having a cookie acceptance popover, because you don't have cookies.

[-] singularity@lemmy.ml 13 points 1 year ago

I made mine with Hugo. It's super simple and looks great.

[-] stoy@lemmy.zip 12 points 1 year ago

I have had an account on DeviantArt for almost 20 years, and up until last year I used to upload my photos to my gallery there.

However, over the years it has only gotten worse: it is slow, annoying, and has had features I wanted removed.

So last year, I set up a simple menu system, started generating photo galleries in digiKam, and now upload the galleries there instead, and it is so much more responsive.

The menu I wrote is built in HTML and CSS; the galleries digiKam exports for me do use JavaScript, but only to aid in navigating the galleries with the arrow keys, so everything loads instantly.

When I publish new galleries I do need to edit the HTML code in the menu (and one line in the gallery), but it is as easy as I can make it while still giving me some options.
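That enhancement-only JavaScript might look something like this sketch (a hypothetical illustration of the approach, not digiKam's actual export code; filenames are placeholders):

```html
<!-- The gallery works as plain links; the script only adds key bindings. -->
<a id="prev" href="photo-01.html">Previous</a>
<a id="next" href="photo-03.html">Next</a>
<script>
  document.addEventListener('keydown', (e) => {
    if (e.key === 'ArrowLeft')  document.getElementById('prev')?.click();
    if (e.key === 'ArrowRight') document.getElementById('next')?.click();
  });
</script>
```

With JavaScript disabled, the links still work; the script is purely additive.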

[-] mox@lemmy.sdf.org 5 points 1 year ago

The menu I wrote is built in HTML and CSS; the galleries digiKam exports for me do use JavaScript, but only to aid in navigating the galleries with the arrow keys, so everything loads instantly.

I love sites like this. Fully functional with plain HTML and CSS. JavaScript used only for optional enhancements. Fast, light, and trustworthy.

[-] stoy@lemmy.zip 3 points 1 year ago

Exactly. Even now, after half a year of using it, I am blown away by how fast it loads, and I love that I know exactly what is going on when it loads.

I even tried it on my phone: the galleries have a responsive design, and better yet, they recognize swipes, making them easy to navigate on phones and tablets.

[-] Gloria@sh.itjust.works 3 points 1 year ago* (last edited 1 year ago)

How do you solve the discoverability issue? A platform gives you some place where people could stumble upon you, while a website is an island in the middle of an ocean that people have to actively browse to. Do you crosspost your new work now more to get the word seen by others? I find it hard to believe that people would like to browse to x different websites to see if an artist has new works, only to find out that they don’t. For finding new artist a central place or a feed, like a platform can provide, seems to be nearly impossible to replace.

[-] stoy@lemmy.zip 5 points 1 year ago

I don't really use it for advertising. I have actively added the directory to the robots file and requested that search engines not index the page. I like it being hidden, but available for me to show people on their own computer. I also have a link to the page on my CV under hobbies.
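For anyone unfamiliar, the robots-file approach looks like this (the directory name is a placeholder; note that robots.txt is a request to well-behaved crawlers, not an access control):

```
User-agent: *
Disallow: /galleries/
```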

[-] jadero@programming.dev 3 points 1 year ago* (last edited 1 year ago)

Edit: the bits barely had a chance to dry on my comment when I came across https://rss-parrot.net/

This is a way of integrating RSS feeds into your personal timeline on Mastodon. I don't know how this affects the work I describe at the bottom of this comment, but I bet it has a role to play.


I find it hard to believe that people would like to browse to x different websites to see if an artist has new works, only to find out that they don’t.

RSS FTW!

Every site I've ever created or been involved with in even the tiniest capacity has supported RSS. Sometimes it was enabled just to shut me up.

I'm not sure how to better promote the use of RSS and get people to use feed readers, but I think it is the answer to at least that particular issue.

My personal opinion is that a "platform" should really be just a collection of searchable and categorized feeds with its own feed. That way there is both discoverability and the ability for individuals to construct their own personal feed on their own personal device (no server required!) while staying abreast of new feeds on the master feed aggregation "platform."

There are innumerable ways for people to get their own content into something that supports RSS and that feed could be easily submitted to the master feed aggregation "platform" to deal with the discoverability issue. For example, Mastodon and most compatible systems support RSS and registration is child's play on any server that allows public registration.

In fact, the "platform" could set up a crawler to automatically discover RSS feeds. If the author has done the metadata right, the results would even be automatically categorized.
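The metadata in question is presumably RSS autodiscovery, a long-standing convention where a page advertises its feed via a <link> tag in <head> that crawlers and feed readers look for (URLs and titles below are placeholders):

```html
<head>
  <title>Example Gallery</title>
  <link rel="alternate" type="application/rss+xml"
        title="Example Gallery - new works"
        href="https://example.com/feed.xml">
</head>
```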

Done right, the "platform" might actually run on a pretty small server, because it would be linking to sites, and only pulling summaries from them.

Even comments could be supported with a little creativity. As I said, there are innumerable ways for people to get their own content out there. If there were a standard metadata tag "comment: ", some fancy footwork could produce a threaded discussion associated with a particular article, even if the original author has no internal commenting system. (And my favoured internal comment system would permit nothing but pure HTTPS links to the commenter's own content, extracting a short summary for display.)

Side note: I acquired a domain explicitly for the purpose of setting up such a feed aggregation "platform." Now that I'm retired, I'm slowly working on creating it. Everything is highly experimental at this point and, to be honest, shows no visible progress to that end, but that is my ultimate goal.

[-] brisk@aussie.zone 2 points 1 year ago

This is an interesting sounding project, do you have a feed/blog/mastodon/mailing list you're likely to announce on?

[-] jadero@programming.dev 1 points 1 year ago

Thanks for your interest!

Apart from here and "self-hosting" and other communities, if you're a glutton for punishment, you can see what's up at https://walloftext.ca. I'm currently in the process of rebuilding everything from the ground up, including an associated mastodon-compatible instance. I've not yet rewritten my project outline to account for all the new stuff I've learned about in the past few months, but it's coming in the next few days.

Just note the most important part of my tagline: "Unstable by nature". Some would argue that applies more to me than the stability of the site and projects. 😛 Either way, chaos is probably the order of the day for at least the rest of this year. (And I mostly take summers off to reenergize by fishing, working in my shop, etc.)

[-] projectmoon@lemm.ee 1 points 1 year ago

Where are you uploading galleries? Just your own HDD connected to a static website?

[-] lemmyvore@feddit.nl 2 points 1 year ago

Not OP but I would use a CDN like bunny.net. It's cheap and you get geo redundancy and all kinds of perks with it.

You can set the Bunny CDN to pull from your home server or you can upload your files to a Bunny storage and it can pull from there so it doesn't matter if your home server is on or not.

I'm currently running only the dynamic parts at home (CMS, generators etc.) and I "host" all the static generated stuff on there.

[-] projectmoon@lemm.ee 1 points 1 year ago

Yeah, that sounds like a good idea. I am using PhotoPrism for photo management. It doesn't really support S3 or any CDN. You could use a FUSE filesystem or something, but it's very slow.

[-] lemmyvore@feddit.nl 1 points 1 year ago* (last edited 1 year ago)

It's probably better to export the photos if you want to make a public presentation gallery. Many image viewers can create static HTML pages for a given set of images (GThumb, digiKam, etc.). But it could also work with a photo management app, if it has presentation gallery support and can be configured to serve images from a CDN prefix.

The catch with CDN support in dynamic apps is that they need to be aware that you want to use a CDN so they can provide both a dynamic view (of whatever resource you're trying to cache) so you have something to pull the original from, as well as use the CDN URL for their main pages so they take advantage of the caching.

Alternatively, if they don't have CDN support, or you want to isolate the dynamic app from the public, if the app makes good static-looking URLs you can scrape it, make static pages and upload that to the CDN.

I recently did this for someone who was using a gallery app that was made with super old MySQL and PHP versions and was super unsafe and inefficient. I used URL rewriting to make all pages end in .html and .jpg, then scraped it with wget, uploaded the whole thing to the CDN, and pointed a CNAME of their domain at the CDN domain. The dynamic app now lives on a private server in Docker containers which they access over VPN, and when they change stuff I just run a script that takes a new snapshot and uploads it to the CDN.
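A hypothetical sketch of that scrape-and-snapshot workflow (the domain, paths, and CDN sync command are placeholders, not the actual setup described above):

```shell
# Mirror the dynamic app as static files, rewriting links and extensions
wget --mirror --convert-links --adjust-extension \
     --page-requisites --no-parent \
     https://gallery.internal.example/ -P ./snapshot

# Push the snapshot to the CDN's storage zone
# (rclone is one common client; bunny.net also exposes storage over FTP/HTTP)
rclone sync ./snapshot/gallery.internal.example cdn-storage:gallery-zone
```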

[-] projectmoon@lemm.ee 1 points 1 year ago

Definitely a good way to do it. PhotoPrism supports uploading to WebDAV for sharing. Could front a CDN upload with a WebDAV server 🤔

[-] stoy@lemmy.zip 2 points 1 year ago

Currently I borrow space on my dad's web host, he wasn't using it and was ok with me doing it.

[-] Montagge@kbin.earth 7 points 1 year ago

All websites should strive to be like rockauto.com

[-] kurwa@lemmy.world 5 points 1 year ago

I'll counter with lingscars.com

[-] litchralee@sh.itjust.works 5 points 1 year ago

I'm a fan of Pelican for static blog generation from Markdown files. Separating template and content into CSS/HTML and Markdown files, and keeping it all in a Git repo for version control, takes only a few hundred kilobytes. Lightweight to work on, and lightweight to deploy. It's so uncomplicated, I could probably pick it right back up if I left it alone for ten years.
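For reference, the whole Pelican workflow fits in a few commands (pelican-quickstart scaffolds content/ and pelicanconf.py interactively; the paths below are Pelican's defaults, not anything site-specific):

```shell
pip install pelican markdown
pelican-quickstart                             # scaffold the project
pelican content -o output -s pelicanconf.py    # Markdown -> static HTML
python -m http.server -d output 8000           # preview locally
```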

[-] owenfromcanada@lemmy.world 2 points 1 year ago

I've found GetSimple to have similar advantages. It's not as much a "static" site generator, but it uses flat XML storage for content instead of a database, so I can back it up in a git repo and deploy by just copying files.

[-] LinearArray@programming.dev 4 points 1 year ago

I love simple sites; they take fewer resources to load and are lightweight.

[-] henfredemars@infosec.pub 2 points 1 year ago

I love simple sites, but I feel that there's something to be said for design philosophy vs tooling.

Take vanilla WordPress, for example. I find it relatively easy to manage static content with it, especially when running it in a container to categorically prevent dependency concerns. Is it overkill for a simple site? Perhaps, but it works and it's easy to use. It's possible to use these tools to manage a mostly static, text-based, minimal-to-no-script website. The key is recognizing the value of that simplicity and providing that simple-to-read, simple-to-use experience without distractions.

WordPress will never be as simple and performant as a truly static site, but we can do a lot to cut down the cognitive load, and we should.

[-] owenfromcanada@lemmy.world 1 points 1 year ago

If you want the ease and functionality of WordPress, but you plan to have a relatively simple site, there are other CMS options that are lighter weight and easier to work with. I've used GetSimple for years with some of my sites, it's much more performant and easier to maintain. And the non-technical folks that manage content actually found it easier to use than WordPress.

[-] Omega_Haxors@lemmy.ml 1 points 1 year ago* (last edited 1 year ago)

I once went to a professional to get a website done (as my ability (read: patience) to code websites had proved inadequate), and they constantly tried to upsell me on just the most stupid bullshit. When I pointed out that a lot of moving parts just means more things that could possibly break, they blew me off and acted like it was a completely unreasonable concern. Needless to say, I ended up using a website builder instead, and despite a few small glitches it works pretty well with JS completely disabled.

EDIT: I was particularly concerned with how heavily they were leaning on JS, to the point it flat out wouldn't load at all for some users. Having some JS flair on the side is perfectly fine, but when you can't even get fucking text to load without it, that's a problem.

this post was submitted on 13 Feb 2024
213 points (100.0% liked)

Programming
