Conlang.org's community list

Serafín
posts: 48, 農, Canada
I'm writing an email telling the LCS to update their community list (and also their JavaScript menu, which only works on cellphones and huge screens). Annie is still listed there from back in 2014, when we had an actual interest in becoming another normal conlanging forum, with activity happening...

Can I tell them to de-list Annie because of its low forum activity? It is still useful as an online dictionary, although unfortunately it appears its server backend problems will never be fixed. (Don't read this as any sort of bitter complaint though. I understand Rhet has preferred to take care of all sorts of other matters, and that the work needed to be done on this website is not usually a source of much joy.)
Rhetorica (Your Writing System Sucks)
posts: 1292, Kelatetía: Dis, Major Belt 1
Jeez. That's a bit morbid. There are actually two Anthologica listings on their communities page at the moment, one no-longer-accurate link¹ to the forums specifically, and one more general link under the miscellaneous resources section. There are quite a few entries on that page overall that are unambiguously dead and haven't seen a soul in years. Perhaps lobby for a purge of those, first?

I have a semi-concrete timeline now for developing the mythical tumblr-esque crossposting format, also, which will make it a lot easier for people to share their recent additions, and should make the community appear less dormant.

_________________________
1. I got tired of seeing 'academia' constantly. So now the optimal URL is anthologi.ca/forums, and the name is 'The Colonnade.'
Serafín
posts: 48, 農, Canada
quoting Rhetorica, Kelatetía: Dis, Major Belt 1:
There are quite a few entries on that page overall that are unambiguously dead and haven't seen a soul in years. Perhaps lobby for a purge of those, first?

Yeah, okay, I can just tell them to update the labels and links for Anthologica then.

quoting Rhetorica, Kelatetía: Dis, Major Belt 1:
I have a semi-concrete timeline now for developing the mythical tumblr-esque crossposting format, also, which will make it a lot easier for people to share their recent additions, and should make the community appear less dormant.

Honestly, all I ask in order to use this website in the first place and recommend it to other people is faster page loads. It is too slow; sometimes my browser even gives up on it. There was one time when I tried to show it to someone I was having a conversation about conlanging with, here in Vancouver in late 2018, but my phone would manage to load its pages only now and then, haphazardly, like an unfortunate demo... The "languages" main page seems to be especially painful to load for some reason (what makes the server have problems with it?). I suppose the crossposting format would be amusing to use, but IMO that is not the right priority here...

Again, it's not something I'm angry or annoyed about, but it's also not something I'm that hopeful about. I just don't use Annie.
Hallow XIII (Primordial Crab)
posts: 539, 蘇黎世之侯
I would say your reaction to "this is a platform I don't use" was somewhat tactless.

An interesting question raised here though is what you (or people in general) are looking for in a Conlanging Community. I, for example, almost never use the Languages page. My favorite landing page is the Recent Site Activity page, which is much more interesting, since it shows you what's being worked on and, more importantly, allows you to keep track of what people whose output you know you enjoy are doing.

The community aspect, I suppose, is cut somewhat short by the forums not being used all that frequently, but part of that is people not posting creations to the forums as they would on a phpBB board. I suspect a combination of the vaunted tumblr feature and some sort of "official" IM space would alleviate this.
Serafín
posts: 48, 農, Canada
quoting Hallow XIII, 蘇黎世之侯:
I would say your reaction to "this is a platform I don't use" was somewhat tactless.

Well yes, but the thing is, I would like to use it. The dictionary feature is very much the best there is (all the alternatives being either more or less annoying, or laughable), and having articles grouped with the dictionaries under a single language entry was a great idea.
Rhetorica (Your Writing System Sucks)
posts: 1292, Kelatetía: Dis, Major Belt 1
And now, a monologue about site performance and what I'm going to do about it.

Truth be told, the current software stack behind Annie is never going to be blazing fast. All of its features come at a cost to performance. The single worst culprit of this is a conflict between programming paradigms: Cadre's page template philosophy encourages encapsulation (routing all activity through smaller pieces of code to keep it cleanly isolated), but this results in a lot of individual SQL queries (which might otherwise be grouped into one single query). The forums and the languages page both have this problem, although I've done some work to mitigate it on the forums. (The site's front page is laggy because it counts unread forum posts; by contrast, it's very fast if you're not logged in, as are the forums more generally.)
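
Roughly, the difference looks like this. (A generic sketch of the pattern only, not Cadre's actual code; the Database interface, table names, and function names are all made up for illustration.)

```typescript
// Hypothetical stand-in for whatever database handle the templates use.
interface Database {
  query(sql: string, params?: unknown[]): Promise<any[]>;
}

// Encapsulated style: each template component loads its own data, so
// rendering a list of N languages costs one query for the list plus
// N more queries for the per-language counts.
async function renderLanguageList(db: Database): Promise<string[]> {
  const languages = await db.query("SELECT id, name FROM languages");
  const rows: string[] = [];
  for (const lang of languages) {
    const count = await db.query(
      "SELECT COUNT(*) AS n FROM words WHERE language_id = ?", [lang.id]);
    rows.push(`${lang.name} (${count[0].n} words)`);
  }
  return rows;
}

// Batched style: the same information in a single round trip, letting the
// database plan one join instead of paying per-row query overhead N times.
async function renderLanguageListBatched(db: Database): Promise<string[]> {
  const rows = await db.query(
    `SELECT l.name, COUNT(w.id) AS n
       FROM languages l LEFT JOIN words w ON w.language_id = l.id
      GROUP BY l.id, l.name`);
  return rows.map(r => `${r.name} (${r.n} words)`);
}
```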

I haven't mentioned this much, but the truth is that Cadre wasn't really designed as a production-grade platform for websites. Way, way back in 2009, when the project was new, my goal was to build a coding sandbox for prototyping new social platforms—alternatives to wikis, forums, Facebook, et cetera—so performance was never a top priority; instead, I was concerned with making it possible for users to program new site functionality through the website itself while respecting file-system-like access permissions so they couldn't damage the content made by others. This was a pretty unexplored piece of territory at the time, and I think Cadre's still better at it than, say, MediaWiki.

For a while I was looking at translating the Cadre software stack into C++, as it was the language I worked in the most while I was in grad school. These days most of my work is actually in JavaScript, for better or worse, and I've decided to do this with node.js instead. As I despise omnibus frameworks and libraries, this might even be passably performant, and I'll get a lot of web-relevant functionality for little or no effort (HTTP handling, Unicode, multithreading) that I would otherwise have to assemble piece by piece in C++. So, consider this notice that I'm starting work on this new project. It will still have an Octavia-like hosted language, but I may rethink some of its syntax choices.
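
For a sense of what "no omnibus framework" means in practice, this is the sort of dependency-free baseline I have in mind, using only node's built-in http module. (A sketch for illustration only, not the actual project; the route and port are invented.)

```typescript
import { createServer } from "node:http";

// Everything here ships with node itself: no framework, no third-party
// routing layer, just the built-in HTTP server dispatching on req.url.
const server = createServer((req, res) => {
  if (req.url === "/forums") {
    res.writeHead(200, { "Content-Type": "text/html; charset=utf-8" });
    res.end("<h1>The Colonnade</h1>");
  } else {
    res.writeHead(404, { "Content-Type": "text/plain" });
    res.end("not found");
  }
});

server.listen(8080);
```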
Hallow XIII (Primordial Crab)
posts: 539, 蘇黎世之侯
you really weren't kidding when you said you were a pl/1 enthusiast were you
Serafín
posts: 48, 農, Canada
quoting Hallow XIII, 蘇黎世之侯:
you really weren't kidding when you said you were a pl/1 enthusiast were you

I think she has described her relationship with PL/I in the past as not so much a fan's enthusiasm as an eternal summer sunset kind of infatuation.
Rhetorica (Your Writing System Sucks)
posts: 1292, Kelatetía: Dis, Major Belt 1
I had a teenage phase as a computing history enthusiast, which was centred specifically around Multics and other MIT developments of the seventies, particularly Lisp machines and ITS. (Separately and concurrently I have a soft spot for the C64 and the Amiga. I usually win at retrocomputing trivia.) My admiration for PL/I is one unfettered by any actual experience with it, although when one understands the differences between it and C, the motivation behind new projects like Rust and Go becomes blindingly obvious. In the nineties all of the software engineers who had the necessary expertise to reverse C's momentum were working on Java, so only now, after Oracle has ruined Java, has the brain drain been over for long enough to move attention back to putting the other academic discoveries of the seventies and eighties into real use. Rust has fixed-precision decimal libraries, so even COBOL's day in the sun could conceivably end.

One other thing about PL/I: it was somewhat infamous in its heyday as being cumbersome and overly complex, but compared to Java and mid-nineties C++ it was quite small. Anything you might read about PL/I being obnoxious or baroque is propaganda written by people with access to, at best, a few hundred kilowords of core, i.e., less than a megabyte of RAM. You might not want to use it for embedded programming these days, but the most important implementation, Multics EPL, was a systems-oriented subset that didn't include all of the language's features.

Anyway. I assume this was brought up because I casually made mention about tinkering with Octavia. To be honest, programming a basic language interpreter isn't really that hard, though it does require some creative problem-solving if you've never taken a course on it. Clean tokenization was something that took me a few tries to understand properly; the first few designs I wrote in high school were based on string prefix-matching, which invariably led to problems when function notation and infix operators were combined. Even Octavia's implementation of tokenization, which was written all the way back in 2009 when I was still a first-year undergrad, had some deficiencies in this regard for the longest time—it would occasionally do bizarre things like abort parsing because of unmatched parentheses inside string literals. I haven't actually coded a language from scratch since that time (Sappho's p3 interpreter was written around the same timeframe, in an even worse language) so I'm somewhat looking forward to doing a new implementation, especially one with less of a post-hoc attitude toward functions and objects.
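
For illustration, here is a generic sketch of the fix (not Octavia's actual tokenizer, and the token shapes are invented): consume each string literal as one opaque token, so nothing inside it can be mistaken for structure by the parser.

```typescript
type Token =
  | { kind: "string"; value: string }
  | { kind: "number"; value: number }
  | { kind: "ident"; value: string }
  | { kind: "punct"; value: string };

function tokenize(src: string): Token[] {
  const tokens: Token[] = [];
  let i = 0;
  while (i < src.length) {
    const c = src[i];
    if (/\s/.test(c)) { i++; continue; }
    if (c === '"') {
      // Consume the whole literal here, honouring backslash escapes, so a
      // stray '(' or ')' inside the string never reaches the parser.
      let j = i + 1;
      let value = "";
      while (j < src.length && src[j] !== '"') {
        value += src[j] === "\\" ? src[++j] : src[j];
        j++;
      }
      if (j >= src.length) throw new Error("unterminated string literal");
      tokens.push({ kind: "string", value });
      i = j + 1;
    } else if (/[0-9]/.test(c)) {
      let j = i;
      while (j < src.length && /[0-9.]/.test(src[j])) j++;
      tokens.push({ kind: "number", value: Number(src.slice(i, j)) });
      i = j;
    } else if (/[A-Za-z_]/.test(c)) {
      let j = i;
      while (j < src.length && /[A-Za-z0-9_]/.test(src[j])) j++;
      tokens.push({ kind: "ident", value: src.slice(i, j) });
      i = j;
    } else {
      tokens.push({ kind: "punct", value: c });
      i++;
    }
  }
  return tokens;
}

// tokenize('print("a (lonely) paren")') yields
//   ident:print  punct:(  string:a (lonely) paren  punct:)
// so the parser never sees spuriously unbalanced parentheses.
```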

Programming language theory and linguistics have a lot in common, but unfortunately enthusiasts of both are scarce. PLT does a lot of work on formalizing concepts of a statement's context that are potentially a gold mine for conlanging, especially philosophical languages. As far as I know, Lojban and Ithkuil never approached these subjects and have mostly concentrated on providing the tools that their respective authors felt they needed to describe their own thoughts; by far my favourite example is that Lojban wasted a three-phoneme word (which is prime real estate from a Huffman coding perspective) on a connective that indicates the Cartesian product (every pairwise combination of elements from sets X and Y), which I sincerely doubt anyone has ever used or will ever use.

So... yeah. Look forward to that. I'll try to make it interesting without making it too hard to convert Octavia code.