00:10:39 Stable (0.33) branch on cbro.berotato.org updated to: 0.33.0-4-g1c0d958e09
01:12:40 -!- SleeperService is now known as aoei
04:19:25 <semi_tonal> congrats on the release! have been meaning to check back in sometime, looks like a pretty substantial set of changes since I last played!
04:19:47 Coo, there's a name I haven't heard in a while
04:38:49 I'm still getting centaur ghosts in Necropolis in 0.33 - is this intended?
04:38:52 (this is CXC)
05:07:03 Unstable branch on crawl.akrasiac.org updated to: 0.34-a0-35-g6dd01c9 (34)
05:28:43 Bronksi (L2 DsFE) ASSERT(!invalid_monster(&mons)) in 'mon-death.cc' at line 2285 failed. (D (Sprint))
09:52:46 <gammafunk> regarding centaur ghosts, I think DracoOmega did leave some ancient ghosts in the permastore
09:52:53 <gammafunk> she mentioned something about this for certain branches
09:55:59 <dracoomega> I did not, actually
09:56:32 <dracoomega> But I actually wondered whether the change to the base permastore would affect live servers, since I think that may be copied into a different directory if a current permastore does not exist?
09:58:10 <dracoomega> Like, I changed what was in dat/dist_bones but there are permastore files in saves/bones for existing installations and I think those may be copied from the former location initially? ...though should a 0.33 branch be using fresh ones? I'm uncertain, I guess.
09:59:35 <dracoomega> Er, to be more explicit, I considered merging old Zig bones into new Zig ones so that they could have an adequate number of ghosts, but opted to merge current Pan ghosts into it instead, so there should be no old ghosts at all in the permastore I pushed
10:12:38 <dracoomega> (I can't, of course, look at those files on servers myself)
10:22:08 <gammafunk> hrm, right, doesn't the crawl process only copy from dist_bones if there's no permastore?
10:25:04 <gammafunk> yeah, I see, it's in _bones_permastore_file()
10:26:20 <gammafunk> 0.33 installations on DGL servers would indeed create a new (empty) bones dir that wouldn't derive from e.g. trunk. Trunk has its own saves dir (where the bones dir lives) that's shared for all trunk versions, but that's only used for trunk versions managed by the trunk installer/updater
10:27:03 <dracoomega> Yeah, so this probably doesn't explain where this centaur ghost came from, if it was indeed 0.33
10:27:21 <gammafunk> and any new installation of 0.33 would see the new datadir for the 0.33 binary, which should have data distributed with 0.33
10:27:40 <gammafunk> I guess I could do some investigation on CDI to confirm all of this
10:27:56 <gammafunk> is there a command to get a quick listing of all ghosts in a bones file?
10:27:57 <dracoomega> NormalPerson7: I don't suppose you could give me the name of that ghost and what floor the necropolis was entered from?
10:28:08 <dracoomega> crawl --bones ls [filename]
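
Since listing ghosts comes up here: a minimal sketch of wrapping `crawl --bones ls` from Python to hunt for centaur ghosts across every store file. The binary and saves paths, and the assumption that listing entries look like `Toelubet XL25 CeHu`, are taken from the shell session and paste that follow; adjust them for the installation being inspected.

    import re
    import subprocess
    from pathlib import Path

    # Paths assumed from the server session later in the log.
    CRAWL_BIN = Path("/home/crawl/DGL/usr/games/crawl-0.33")
    BONES_DIR = Path("/home/crawl/DGL/crawl-master/crawl-0.33/saves/bones")

    # Matches an "XL25 CeHu"-style pair; "Ce" is the centaur species code.
    CENTAUR = re.compile(r"XL\d+ (Ce\w\w)")

    for store in sorted(BONES_DIR.glob("*.store*")):
        listing = subprocess.run([str(CRAWL_BIN), "--bones", "ls", str(store)],
                                 capture_output=True, text=True).stdout
        hits = CENTAUR.findall(listing)
        if hits:
            print(f"{store.name}: {len(hits)} centaur ghost(s) ({', '.join(hits)})")
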
10:32:36 <gammafunk> well, I do indeed have centaur ghosts in my permastore
10:32:46 <dracoomega> Which branch?
10:32:56 <gammafunk> Bones file 'bones.store.Zot', version 34.145: reasonance XL27 DsFi stpyramids XL27 DDFi wilo XL26 MfBe runemage XL27 GnFi Heisao XL27 TrBe Mikey69 XL27 DsBe Emnejil XL26 VSBe Sofait XL27 GnFE Muataran XL25 MiMo ShutUppercut XL27 DsGl Flapstick XL27 HaBe Netemen XL26 GnFi Pongem XL27 MiAK Dawneret XL26 GnFi Juzit XL27 GnFi Heoriaro XL27 GnFi Hicanoach XL26 DsGl Decrayps XL23 FeEn Beuseq XL26 GnFi Icycared XL26 GrFi Alunk XL27 GnFi Jofris Vesimp XL27 GnFE Toelubet XL25 CeHu Cyqimiab XL27 MfGl Sebriys XL27 TrBe Beclu XL27 GnSk Jiorhin XL27 TrBe Uneer XL27 GnFE
10:33:13 <gammafunk> I can pastebin the full listing for all store files
10:33:22 <dracoomega> Er, I mean, is this trunk or 0.33?
10:33:47 <gammafunk> this is 0.33
10:33:54 <dracoomega> Since this doesn't look like the contents of the new permastore file for there
10:34:42 <dracoomega> Yes, this is definitely not the bones.store.Zot file in dat/dist_bones
10:34:56 <gammafunk> /home/crawl/DGL/crawl-master/crawl-0.33/saves/bones$ for i in *.store*; do ~crawl/DGL/usr/games/crawl-0.33 --bones ls "$i" >> ~/ghosts.txt; done
10:35:05 <gammafunk> https://dpaste.org/vqhcY is the full data
10:35:38 <gammafunk> maybe I need to take a closer look at the install/update script
10:36:24 <dracoomega> Yeah, if you look at that output, some of those files belong to different versions
10:36:51 <dracoomega> 34.294 are the new ones and 34.145 appear to be old ones
10:37:03 <gammafunk> oh, so it's getting some of the new dist?
10:37:07 <dracoomega> It seems so??
10:37:26 <dracoomega> But this is very weird. How is Lair 1, 4, 5 old and Lair 2, 3 new?
10:37:41 <gammafunk> right, guess I need to read this install script more carefully to see if anything odd is going on
10:38:01 <gammafunk> say-do crawl-do nice make -C source \
        GAME=${GAME} \
        GAME_MAIN=${GAME} MCHMOD=0755 MCHMOD_SAVEDIR=755 \
        INSTALL_UGRP=$CRAWL_UGRP \
        WEBTILES=YesPlease USE_DGAMELAUNCH=YesPlease WIZARD=YesPlease \
        STRIP=true DESTDIR=${DESTDIR} prefix= bin_prefix=/bin \
        SAVEDIR=$CHROOT_CRAWL_BASEDIR/${GAME}/saves \
        DATADIR=$CHROOT_CRAWL_BASEDIR/${GAME}/data \
        WEBDIR=$CHROOT_CRAWL_BASEDIR/${GAME}/data/web \
        USE_PCRE=y \
        EXTERNAL_FLAGS_L="-g"
10:38:14 <gammafunk> is our build, which does seem to set everything up centrally for the specific version
10:39:52 <dracoomega> (I just verified that somehow the versions did not end up mixed in dist_bones and it seems they did not)
10:39:55 <dracoomega> Just in case
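
The mixed 34.145/34.294 versions can also be spotted mechanically. A minimal sketch under the same path assumptions as above, plus a guess that the installed copy of dist_bones lives under the version's data dir; it compares the `version X.Y` header that `crawl --bones ls` prints for each live store file against the shipped copy:

    import re
    import subprocess
    from pathlib import Path

    # Paths assumed from the discussion above; the dist location is a guess.
    CRAWL_BIN = Path("/home/crawl/DGL/usr/games/crawl-0.33")
    LIVE_BONES = Path("/home/crawl/DGL/crawl-master/crawl-0.33/saves/bones")
    DIST_BONES = Path("/home/crawl/DGL/crawl-master/crawl-0.33/data/dist_bones")

    def store_version(path):
        """Return the 'version X.Y' string from the listing header, or None."""
        out = subprocess.run([str(CRAWL_BIN), "--bones", "ls", str(path)],
                             capture_output=True, text=True).stdout
        match = re.search(r"version (\d+\.\d+)", out)
        return match.group(1) if match else None

    for live in sorted(LIVE_BONES.glob("*.store*")):
        dist = DIST_BONES / live.name
        live_ver = store_version(live)
        dist_ver = store_version(dist) if dist.exists() else "missing"
        flag = "" if live_ver == dist_ver else "  <-- stale?"
        print(f"{live.name}: live {live_ver}, dist {dist_ver}{flag}")
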
10:41:13 <gammafunk> hrm, well it looks like during install we're in the checkout of the repo, which is switched to the branch being built (so e.g. stone_soup-0.33)
10:41:17 <gammafunk> we do that make
10:41:50 <gammafunk> and from that dir we then do https://github.com/crawl/dgamelaunch-config/blob/master/chroot/sbin/install-stable.sh
10:44:18 <gammafunk> I don't really see anything that might mix in data from elsewhere
10:44:22 <gammafunk> so it's weird
10:44:50 <gammafunk> copy-data-files() is just doing a simple recursive copy from the checkout into the data dir of the installed stable version
10:45:41 <gammafunk> will have to look into this more later, I guess
10:46:33 <gammafunk> I suppose we could have admins delete their stores manually so that the dist files get copied in, and we could certainly modify our stable installer if need be, but it's not looking like this is an installation issue per se
10:46:51 <dracoomega> Oh. Wait.
10:47:02 <dracoomega> 0.33 was branched before I updated the permastore
10:47:11 <dracoomega> And some servers probably built it
10:47:35 <dracoomega> And the mix of files is possibly because those floors were asked for before the new permastore files arrived
10:47:51 <dracoomega> But ones that had not yet been requested used the new files
10:48:22 <dracoomega> (Okay, I actually don't know how Zot could have been asked for at the moment, tbh.)
10:48:31 <dracoomega> But that may be a possible explanation for this weirdness
10:49:44 <gammafunk> oh
10:50:24 <gammafunk> regarding zot, isn't that going to happen fairly often due to pregen?
10:50:49 <dracoomega> Well, Necropolises can't generate in Zot, so I am not sure what circumstances are causing a request for a player ghost there
10:51:09 <dracoomega> Oh, wait, maybe dying there is good enough?
10:51:14 <gammafunk> right
10:51:35 <dracoomega> (I considered not even including/updating permastores for branches that don't use ghosts at the moment, but did so in case some future use arrives)
10:52:13 <gammafunk> ghosts eventually get purged from permastore over time, right?
10:52:25 <gammafunk> so if our understanding is correct, centaurs should eventually go away from permastore
10:52:36 <gammafunk> aside from the zot ones, I guess, but those don't matter
10:52:37 <dracoomega> I don't think so, actually
10:52:50 <dracoomega> I thought the point of the permastore is that those ghosts are never deleted
10:53:02 <gammafunk> but how is a permastore ever updated, then?
10:53:10 <gammafunk> it can't only ever add ghosts to it
10:54:23 <dracoomega> Well, mostly it doesn't add ghosts to it anymore, by my understanding. I may be wrong, but my understanding is that the idea of a permastore that would be added to automatically predates there being a larger one packaged with the game. So it would grow up to a certain point (only 10-20 ghosts in some cases!) and I think stop? And later on, this idea was extended to something that could be packaged for offline and given a larger set of files (meaning new ghosts were, I think, never put into them anymore)
10:54:37 <dracoomega> I think a bunch of that code is effectively dead
10:54:47 <dracoomega> (But I didn't really have time to verify and rip it out in 0.33)
10:55:07 <gammafunk> I see
10:55:22 <dracoomega> Like, the minor version of the files in that directory is still old, suggesting they can't have been resaved with newer content
10:56:00 <gammafunk> I guess then ideally to have a better permastore, I could identify and delete those live store files with old versions
10:56:08 <gammafunk> so it would copy the newer ones from the dist
10:56:11 <dracoomega> I think deleting all of them is safe
10:56:19 <dracoomega> All the .store ones, I mean
10:56:22 <dracoomega> In saves/bones
10:56:29 <gammafunk> ok
10:56:38 <dracoomega> (It's only going to cause one file copy later on)
10:56:45 <dracoomega> If you delete a modern one
10:57:32 <gammafunk> I guess I want to temporarily stop webtiles before doing this though. It also doesn't solve the issue for other servers, of course. I suppose the real fix is going to have to wait for 0.34
10:58:31 <gammafunk> just as an experiment, tonight after peak hours are over I can stop webtiles, back up the 0.33 current stores, delete the current stores, then restart webtiles
10:58:59 <gammafunk> and just confirm after a day or so that we have all new store files working as they should
11:00:10 <dracoomega> Sounds reasonable to me
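
The experiment itself is only a couple of filesystem moves. A minimal sketch, in Python rather than shell, with the bones path assumed from the session above; as discussed, it should only run while webtiles and all crawl processes for that version are stopped:

    import shutil
    import time
    from pathlib import Path

    # Path assumed from the shell session earlier in the log.
    BONES_DIR = Path("/home/crawl/DGL/crawl-master/crawl-0.33/saves/bones")
    BACKUP_DIR = BONES_DIR.parent / f"bones-backup-{time.strftime('%Y%m%d')}"

    BACKUP_DIR.mkdir(exist_ok=True)

    # Move every permastore file aside; the game recopies the shipped
    # dist_bones version the next time a ghost is requested for that branch.
    for store in sorted(BONES_DIR.glob("*.store*")):
        print(f"moving {store.name} -> {BACKUP_DIR}")
        shutil.move(str(store), BACKUP_DIR / store.name)
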
15:41:02 dracoomega: if it's still useful, I believe it was Spider:1
15:41:38 sadly I don't know the name of the ghost because I left that Necropolis without attempting to fight it, so it never entered los
15:41:46 hence no note in the dump
15:42:14 Unstable branch on underhound.eu updated to: 0.34-a0-35-g6dd01c98c1 (34)
15:42:55 <dracoomega> Nah, I'm fairly satisfied we found the root cause in the meantime
15:43:01 <dracoomega> But thanks
15:48:20 -!- The topic of #crawl-dev is: Crawl Development | https://github.com/crawl/crawl | Logs: http://s-z.org/crawl-dev/, temporarily http://crawl.akrasiac.org/logs/cheibriados/ | People with +v have commit access, devs on bridged discord as well | General Crawl-related chat to #crawl | Long stuff to a pastebin service, please
15:48:20 -!- The topic of #crawl is: Play Dungeon Crawl Stone Soup online now! Type ??online for instructions, ??lg / !lg for play stats | PM Sequell for long queries | http://crawl.develz.org | FooTV game replays: ??footv for instructions | #crawl-dev for dev discussion, #crawl-offtopic for offtopic
20:49:08 <Cgettys> with the tournament on, maybe a good time for some perf work? https://github.com/crawl/crawl/pull/4465 <- from wizardike, https://github.com/crawl/crawl/pull/4401 <- from me
20:59:24 <Cgettys> CAO seems to be struggling / hitching
21:28:30 <gammafunk> no, not good to merge changes where we don't really know the impact during tournament
21:29:53 <Cgettys> Fair. On the flip side, if it does help, it's immediately obvious 😄
21:29:57 <Cgettys> but I see your point
21:30:09 <gammafunk> well, it won't really be obvious will it?
21:30:21 <gammafunk> if it were obvious, you'd probably have measured the results in e.g. local testing
21:31:12 <Cgettys> I never said load testing was easy
21:31:54 <Cgettys> But fair point, I should revisit it and try to measure better
21:32:25 <gammafunk> I don't want to sound at all sanctimonious about needing strict metrics for how much improvement there will be for something
21:32:45 <Cgettys> But also if it broke things you'd be the one getting the call, so easy for me to say "great time for perf stuff"
21:32:55 <gammafunk> But if you haven't really measured the impact of the change AND it's the single busiest time of the year for servers, where breakage causes the biggest headache possible
21:32:56 <Cgettys> (not a literal call but you get my drift)
21:33:22 <gammafunk> Merging it is basically an unnecessary "hail mary" when you don't need to make one
21:34:02 <Cgettys> Fair, I see your point
21:34:22 <Cgettys> (though CAO has been struggling pretty badly for me tonight, I'm talking like multi-second pauses)
21:34:37 <Cgettys> Obviously I should pick a different, less busy server 😄
21:34:40 <gammafunk> yeah with this many running processes it does tend to get stretched to the max
21:34:53 <gammafunk> load is 1.97
21:35:01 <Cgettys> I do think it might be worth pulling out just 1 line of my PR and trying it - turning down the compression level
21:35:22 <Cgettys> that I did measure
21:35:48 <gammafunk> python3 is a top process constantly
21:36:18 <gammafunk> 49 players
21:36:56 <Cgettys> Now, the thing that would be worth doing right now that I just can't replicate locally would be to profile the webserver under real world conditions
21:39:50 <gammafunk> There's a lot you could replicate locally. Really shouldn't be an issue to run a webtiles server with lots of e.g. qw processes running concurrently; if you set their turn delay to be decently high at like 1-2s you could probably run quite a few. But for CAO you are dealing with a HD for storage
21:40:16 <Cgettys> Sure, I can replicate a lot of things locally
21:40:32 <Cgettys> The tricky part is knowing what to replicate
21:40:38 <gammafunk> I'm not sure what profiling one would do that's not fairly invasive for a live server
21:41:21 <Cgettys> network latency, actual player usage behavior, etc
21:41:24 <Cgettys> But I see your point
21:41:28 <Cgettys> perf is probably too high overhead
21:42:47 <Cgettys> well, depends on settings
21:48:38 <Cgettys> I do think it'd be worth trying replacing zlib.Z_DEFAULT_COMPRESSION with zlib.Z_BEST_SPEED. I did measure that (numbers are about a month back): it's something like ~2x less time spent in compression for not much larger payloads (e.g. you still get an 80% compression ratio, whereas maybe you get 86% or something at Z_DEFAULT_COMPRESSION)
21:48:55 <Cgettys> But your call, at the end of the day
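
That particular tradeoff is easy to sanity-check offline with Python's zlib. The sketch below uses a made-up JSON-ish payload as a stand-in for webtiles frames, so the ~2x and 80% vs. 86% figures quoted above remain Cgettys' measurements rather than anything reproduced here:

    import json
    import time
    import zlib

    # Stand-in payload; real webtiles messages are JSON frames along these lines.
    payload = json.dumps([{"msg": "map", "cells": list(range(500))}] * 50).encode()

    def bench(level, rounds=500):
        """Compress the payload repeatedly at the given level; return time and ratio."""
        start = time.perf_counter()
        for _ in range(rounds):
            comp = zlib.compressobj(level)
            out = comp.compress(payload) + comp.flush()
        elapsed = time.perf_counter() - start
        return elapsed, 1 - len(out) / len(payload)

    for name, level in [("Z_DEFAULT_COMPRESSION", zlib.Z_DEFAULT_COMPRESSION),
                        ("Z_BEST_SPEED", zlib.Z_BEST_SPEED)]:
        elapsed, ratio = bench(level)
        print(f"{name}: {elapsed:.3f}s for 500 rounds, {ratio:.1%} size reduction")
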
21:50:50 <Cgettys> I see your point, it's a risk tradeoff and I should have circled back to that PR last week when it would have been much less risky
21:51:29 <Cgettys> Instead I'll go back to adding more load on the server 😛
21:58:27 <gammafunk> well, I think it'd be nice to consider merging them shortly after the tournament. Things are less critical then
21:59:22 <gammafunk> Cgettys: One question about the change in this PR. Is the server/client thing of removing 00FF before send and adding it back in the client upon receipt still being done?
21:59:45 <gammafunk> Because there are websocket clients (namely ones I've written) that also do this, since it's required to talk to webtiles
22:00:24 <Cgettys> Oh boy, it's been a month or two, and life's been crazy... I think so, but I'll have to double check
22:02:36 <Cgettys> It might be a breaking change for webtiles clients, iirc
22:03:15 <gammafunk> I have a feeling it wouldn't be done any more, since it was something done manually in the "implementation" on both sides
22:03:27 <gammafunk> yeah, that's fine, it's mostly only my clients that would even exist
22:03:42 <Cgettys> Yeah, I'm second-guessing myself, I think you're right and it's gone
22:03:47 <gammafunk> I think some others have made some webtiles clients but they're all sort of one-off research projects
22:03:53 <Cgettys> you're actually able to read the messages in the web browser with it
22:03:55 <Cgettys> it's very nice
22:04:13 <gammafunk> that would be helpful, yeah
22:04:32 <gammafunk> just have to figure out how to make my clients set the websocket extensions to do this
22:04:53 <gammafunk> this is basically only relevant to beem right now, but beem would be broken if this were merged, until I updated it
22:05:19 <gammafunk> but I'm in the process of reworking its database right now anyhow, so I might be able to research this and future-proof it
22:05:37 <Cgettys> it's a pretty standard extension now
22:05:44 <Cgettys> not trying to say it isn't work or anything
22:05:47 <gammafunk> I'm using the python websockets module and not using tornado
22:05:52 <Cgettys> but back when this was written I don't think the RFC was out yet
22:05:55 <Cgettys> now it's old news
22:06:00 <Cgettys> (or like, it was a draft RFC)
22:06:20 <gammafunk> no, I'm sure it's supported in the python websockets module, but I have to actually know how to set this extension to be used by the python module etc
22:06:37 <gammafunk> and of course rip out the code that's doing a manual deflate setup with the 00FF stripping etc
22:06:38 <Cgettys> ah, gotcha. Wouldn't shock me if it auto-negotiates it if you let it
22:06:50 <gammafunk> yeah, true, it might just do that
22:06:58 <Cgettys> https://websockets.readthedocs.io/en/stable/reference/index.html#extensions
22:07:03 <Cgettys> "The Per-Message Deflate extension is built-in. You may also define custom extensions."
22:07:44 <Cgettys> looks like at worst you use a factory method to tell it what parameters to use and are done
22:08:30 <gammafunk> not sure that I'd need any special parameters
22:08:39 <Cgettys> yeah, it's supposed to negotiate
22:08:53 <Cgettys> so providing parameters is more likely to just screw things up 😄
22:10:04 <gammafunk>
    async def connect(self, websocket_url, *args, username=None, password=None,
                      **kwargs):
        """Connect to the given websocket URL with optional credentials.
        Additional arguments are passed to `websockets.connect()`."""

        if username and not password:
            raise WebTilesError("Username given but no password given.")

        self.websocket = await websockets.connect(
            websocket_url, *args,
            # WebTiles servers use a custom ping/pong JSON message instead
            # of using the WebSocket protocol ping/pong control frames.
            # Hence we disable ping_timeout, which uses the control frame
            # protocol, implement the JSON pong response, and let the
            # WebTiles server close the connection if it sees a timeout.
            # Any such closure will lead to an exception being raised.
            ping_timeout=None, open_timeout=None, **kwargs)

        if username:
            await self.send_login(username, password)

    async def read(self):
        """Read a WebSocket message, returning a list of message dictionaries.
        Each dict will have a 'msg' component with the type of message along
        with any other message-specific details. Returns None if we can't
        parse the JSON, since some older game versions send bad messages we
        need to ignore."""

        if not self.websocket:
            raise WebTilesError("Attempt to read when stopped.")

        comp_data = await self.websocket.recv()
        comp_data += bytes([0, 0, 255, 255])
        json_message = self.decomp.decompress(comp_data)
        json_message = json_message.decode('utf-8')
        try:
            message = json.loads(json_message)
22:10:57 <gammafunk> so I'm guessing that connect might not change, and I remove the modification to comp_data and remove the call to .decompress()
22:11:06 <Cgettys> yeah, that's about what I'd expect
22:12:25 <Cgettys> it was mostly removing code that became duplicated (assuming that tornado versions are only somewhat ancient; worst case, I believe any servers that don't have a new enough tornado version just won't compress at all, which would probably be fine too)
22:12:26 <gammafunk> now I'm wondering about that weird ping/pong json message we do according to my comments written years ago there, and whether that's necessary, but that's just a totally different issue
22:13:04 <gammafunk> right, I think the version we require is new enough that this shouldn't be an issue hopefully; cao likewise has this version
22:13:20 <gammafunk> but in any case, if there are problems when rolling this out, they'll be seen quickly
22:13:51 <Cgettys> Quickly as soon as the webserver is restarted, anyway 😄
22:13:59 <Cgettys> Also might need that cache busting PR first
22:14:48 <Cgettys> https://github.com/crawl/crawl/pull/4429 this one
22:15:06 <Cgettys> because I'm pretty sure users might see problems if they don't pick up the new javascript but do have the new webserver
22:15:15 <Cgettys> for the 00FF reason described
22:15:39 <Cgettys> Actually, no, I'm wrong
22:15:51 <Cgettys> I don't know why it wasn't showing in the web browser before
22:15:59 <Cgettys> unless it was messages being marked binary instead of text
22:16:14 <Cgettys> that whole 0x0000FFFF bit? it's from the RFC
22:16:16 <Cgettys> https://www.rfc-editor.org/rfc/rfc7692
22:17:05 <Cgettys> Riiight, that's coming back to me now
22:17:07 <Cgettys> , binary=True
22:17:15 <Cgettys> So I don't think it'll break your clients
22:18:11 <Cgettys> I think your client could just change now and work fine
22:18:14 <Cgettys> I think
22:18:26 <Cgettys> Which would also mean that we don't need the cache busting PR first
22:22:41 <Cgettys> Probably still wise tho
22:25:18 <gammafunk> # TODO: should we change to binary=true? probably not?
22:26:00 <gammafunk> If I understand this change, setting binary=True would not be what we want, right?
22:26:18 <Cgettys> The question was really, "why was it binary=True" before
22:26:51 <gammafunk> Well, it had to be, right? since we were doing the deflate ourselves? But I confess I don't really know how websockets work at that level, in terms of what binary websocket frames imply
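
For reference, that 0x00 0x00 0xFF 0xFF tail is the bookkeeping RFC 7692 specifies and the old code reimplemented by hand: the sender compresses with raw DEFLATE plus a sync flush and strips the four-byte tail before sending, and the receiver appends it back before inflating, which is exactly the `comp_data += bytes([0, 0, 255, 255])` line in the client code above. A minimal round-trip sketch with Python's zlib:

    import zlib

    # Sender side: raw DEFLATE (negative wbits) with a sync flush; per RFC 7692
    # the trailing 0x00 0x00 0xFF 0xFF of each flushed message is stripped
    # before the frame goes on the wire.
    comp = zlib.compressobj(zlib.Z_DEFAULT_COMPRESSION, zlib.DEFLATED, -zlib.MAX_WBITS)
    message = b'{"msg": "ping"}'
    deflated = comp.compress(message) + comp.flush(zlib.Z_SYNC_FLUSH)
    assert deflated.endswith(b"\x00\x00\xff\xff")
    on_wire = deflated[:-4]

    # Receiver side: append the tail back before inflating, mirroring the
    # comp_data += bytes([0, 0, 255, 255]) step in the client code above.
    decomp = zlib.decompressobj(-zlib.MAX_WBITS)
    assert decomp.decompress(on_wire + b"\x00\x00\xff\xff") == message
    print("round trip ok")

Websocket libraries that implement permessage-deflate do this tail handling internally, which is why the simplified client later in the log can call json.loads() on the result of recv() directly.
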
22:26:57 <Cgettys> probably because the compressed payload wouldn't be valid utf-8, right
22:27:39 <gammafunk> there's also the trivial answer to that TODO: we shouldn't change to binary=true because that'd be a syntax error
22:27:51 <Cgettys> yeah, I should rip out the TODO
22:28:09 <Cgettys> And yeah, it meant = True
22:28:50 <Cgettys> The C++ and Rust is leaking out in my comments apparently 😄
22:29:22 <gammafunk> regarding my client, I'd still have to take out the code that tries to decompress the received websocket data though
22:29:37 <gammafunk> and the byte manipulation as well, of course
22:29:40 <Cgettys> I'm not sure you actually do though, that's the thing
22:29:56 <Cgettys> I'm pretty sure what was done is just implementing the extension kinda behind tornado's back 😄
22:30:10 <Cgettys> But I'm not sure, I didn't try testing another client
22:30:42 <gammafunk> hrm, but presumably my python module is already decompressing the data after your PR is merged, right?
22:30:58 <gammafunk> so what it gets from that recv is already the utf-8 encoded json text
22:31:08 <Cgettys> only if your client is enlightened
22:32:03 <Cgettys> maybe. I'm no longer sure of anything
22:32:34 <gammafunk> well, I'm just calling websockets' connect, which is presumably performing the negotiation and seeing that it needs to do the inflating etc, is what I mean
22:32:52 <gammafunk> so I think that recv is just getting me the json as text, but yeah I guess one can try it and actually see for sure
22:33:23 <gammafunk> right now it's not getting me json text because we're doing deflate on our own with a binary websocket connection etc, but I mean after the PR is merged
22:33:37 <Cgettys> If this is the "legacy" client, it may be that you could rip out the code today and just add extensions=['deflate'] or whatever it is
22:33:38 <Cgettys> https://websockets.readthedocs.io/en/stable/reference/legacy/client.html
22:34:06 <Cgettys> that's the thing tho
22:34:14 <gammafunk> no, it's not that
22:34:16 <Cgettys> if you look here https://github.com/crawl/crawl/pull/4401/files#diff-d22c88f12f7df69bb8fd206c1606b2c5e369ed0b6f5ed9313d92aae7ec555641
22:34:20 <gammafunk> it's websockets.connect
22:34:23 <gammafunk> whatever that is
22:34:49 <Cgettys> we're advertising deflate-frame
22:35:07 <Cgettys> so 1 of 2 things: either we're not actually implementing it right but are claiming to
22:35:09 <Cgettys> (very possible)
22:35:23 <Cgettys> or we are, but the client has to understand
22:36:52 <Cgettys> there's literally like 4 or 5 python websockets implementations these days
22:37:11 <Cgettys> makes finding anything messy 😄
22:38:13 <Cgettys> The thing is, from just the function name and signature, it could be almost any of these
22:38:29 <Cgettys> is there a using statement up top?
22:38:39 <Cgettys> or rather an import statement
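
If it is the websockets package's own connect(), the negotiation should indeed just happen: that library enables permessage-deflate by default, and a client can also request it explicitly through the extensions argument. A minimal sketch against a hypothetical local server, assuming the server side negotiates permessage-deflate as it would after the PR above (the URL is illustrative, and the credential handling from the client code above is omitted):

    import asyncio
    import json

    import websockets
    from websockets.extensions.permessage_deflate import ClientPerMessageDeflateFactory

    async def main():
        # permessage-deflate is on by default in this library; requesting it
        # explicitly here just makes the negotiation visible in the code.
        async with websockets.connect(
                "ws://localhost:8080/socket",
                extensions=[ClientPerMessageDeflateFactory()]) as ws:
            raw = await ws.recv()  # text frame, already inflated by the library
            print(json.loads(raw))

    asyncio.run(main())
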
22:55:24 <gammafunk> well, according to that module doc, from version 14 (of the module) onwards it's the new client
22:55:38 <gammafunk> unfortunately locally I'm using 12, and on my ec2 instance it's probably a bit older still
22:56:19 <gammafunk> yeah, 10.2 there it seems
22:57:04 <gammafunk> I don't think it will be hard to adapt this code, that's all that matters in the end
22:59:28 <Cgettys> You've successfully nerd-sniped me, I'm modifying your code into my own client now 😛
23:14:07 Cgettys https://github.com/crawl/crawl/pull/4401 * 0.34-a0-36-gaa8d4f5cb1: feat: Let tornado and the browser handle compression for us (6 weeks ago, 3 files, 19+ 2265-) https://github.com/crawl/crawl/commit/aa8d4f5cb125
23:16:45 <Cgettys> Yeah, it's literally just like 4 lines go away and you just json decode straight away
    async def read(websocket):
        """Read a WebSocket message, returning a list of message dictionaries.
        Each dict will have a 'msg' component with the type of message along
        with any other message-specific details. Returns None if we can't
        parse the JSON, since some older game versions send bad messages we
        need to ignore."""

        if not websocket:
            raise Exception("Attempt to read when stopped.")

        comp_data = await websocket.recv()
        try:
            message = json.loads(comp_data)
            return message
        except:
            pass

    async def main():
        await connect('ws://localhost:8080/socket', username='test', password='test')

    if __name__ == '__main__':
        asyncio.run(main())
23:18:20 Cgettys https://github.com/crawl/crawl/pull/4401 * 0.34-a0-36-g3733b90d49: feat: Let tornado and the browser handle compression for us (6 weeks ago, 3 files, 18+ 2265-) https://github.com/crawl/crawl/commit/3733b90d496e
23:21:12 <Cgettys> I tested with the newer async client, but the older one afaik will do much the same
23:24:20 <gammafunk> nice, thanks
23:24:26 <gammafunk> as I thought it might be
23:24:56 <Cgettys> I figured too, but I was curious, maybe I'll write myself an alternate webtiles client one of these days 😄
23:25:12 <Cgettys> yah know, like running the client locally but talking to the server
23:25:24 <Cgettys> yeah, the webassembly way would be cool too
23:25:26 <gammafunk> I use it for beem and for qw
23:25:56 <gammafunk> I could probably make the webtiles module public again, but I made it all private years ago when certain trolls were trying to run spam and ban evasion bots
23:26:24 <Cgettys> Don't sweat it, if I decide to do it I'd end up doing it from scratch anyway
23:26:28 <Cgettys> in C++ or Rust or something 😄
23:26:41 <gammafunk> its design is not very good anyhow, probably need to use some kind of signals library to handle message types instead of a big if statement, but it does what I need
23:27:08 <Cgettys> Of course, if I really wanted a project, I should probably rewrite the webtiles server a bit
23:27:20 <Cgettys> to support offloading the websockets stuff onto multiple threads
23:27:28 <Cgettys> or processes or whathaveyou
23:29:01 <Cgettys> But easier said than done
23:45:42 Monster database of master branch on crawl.develz.org updated to: 0.34-a0-35-g6dd01c98c1