#ossasepia Logs for 19 Sep 2019



April 20th, 2020 by Diana Coman
diana_coman: http://ossasepia.com/2020/04/20/ossasepia-logs-for-18-Sep-2019#1003056 - all right. [03:30]
ossabot: Logged on 2019-09-18 19:26:25 shrysr: diana_coman: re:V - i'll do it. and no not 'train me for them'... its as you just said - useful for me and tmsr related. useful for me i.e that cover only major skill/proj exp deficits. to illustrate: work with 'large data' on remote sql server and applied text analysis to do XXX. Re: apps now, yes, already sent, and more down the line def. [03:30]
diana_coman: !o uptime [03:57]
ossabot: diana_coman: time since my last reconnect : 0d 0h 32m [03:57]
diana_coman: whaack: what sort of pedal-powered net connection do you have there, lolz [12:00]
shrysr: http://logs.nosuchlabs.com/log/ossasepia/2019-09-18#1002848 << presume this still stands, i.e to be re-written? [12:07]
snsabot: Logged on 2019-09-18 05:58:16 diana_coman: shrysr: and don't you even dare to change or "update" that post! Let it stand there and go and write another one and if it's still not good, you'll write another one and another one and I don't care if it gets to 1000 or 1mn until you finally do it right - the pile will stand there as a monument to your worship of stupidity, as taller as it gets. [12:07]
diana_coman: shrysr: yes. [12:09]
shrysr: ok. [12:19]
shrysr: http://ossasepia.com/2020/04/20/ossasepia-logs-for-18-Sep-2019#1003009 << I thought I should spell this out in more detail than my answers yesterday incl possible options. It would be shaped by the skill deficit list + feedback on apps and would eventually change ofc - but I thought itz still important to sketch out an overall plan now? [12:26]
ossabot: Logged on 2019-09-18 17:47:41 diana_coman: shrysr: that sounds like "latest date - march"; and if it still doesn't happen by march? [12:26]
diana_coman: shrysr: only if you need help with it really; this particular topic&interest is mainly yours so in this case I don't really need to see a plan from you, it's enough if you actually have one. [12:28]
diana_coman: shrysr: when you go through the logs again for the re-write, check please on both logs.ossasepia.com and logs.nosuchlabs.com and let me know if you see anything still missing/strange on either/both, ok? [12:29]
shrysr: diana_coman: okay .... but lol is that a trick to make me read twice :P [12:31]
diana_coman: shrysr: lolz, not a trick, just a very efficient use of all resources available. [12:32]
diana_coman: seriously, I'd just tell you "read it 10 times" if I was specifically after that :P [12:32]
diana_coman: shrysr: but yes, it *helps you too*; that's an important consideration when I choose and give tasks. [12:33]
shrysr: :D okay.. glad to be of serviz massa. fwiw i thought i saw something missing even on yest stuff.... i had nosuchlabs on bookmark and switched to ossasepia ... but will re-check all and revert as directed. [12:34]
diana_coman: shrysr: nosuchlabs got updated only today so please look again, yes. [12:35]
diana_coman: you can keep whichever one you want on bookmark otherwise; I suppose ossasepia might be handier because it lists #o as default chan (while nosuchlabs lists #trilema as default) [12:36]
diana_coman: but other than that, there shouldn't be any difference [12:36]
diana_coman: jfw: weren't you saying something about getting back on Wednesday aka yesterday? [12:37]
shrysr: btw: in terms of checking both - shdnt there be more efficient way? i dloaded the db snapshot initially thinking of finding some way to filter /regex ... i havent dealt with a db 'dump' so went back to manual. I see it is a complete dump of all logs. [12:38]
diana_coman: shrysr: you can use the raw knob [12:38]
shrysr: diana_coman: didnt follow that? raw knob ? [12:39]
diana_coman: shrysr: eg http://logs.nosuchlabs.com/log-raw/ossasepia?istart=999600&iend=999700 [12:40]
diana_coman: vs http://logs.ossasepia.com/log-raw/ossasepia?istart=999600&iend=999700 [12:40]
diana_coman: I guess you can just get them with a curl and compare with a diff [12:40]
diana_coman: for that matter, make yourself a bash script so it does all (if there is some difference it will be in lines with id < 1000000) [12:41]
asciilifeform: diana_coman: dun work w/ conventional line-based unix diff, as the timestamps (at least post-1000000, current era) are ~always different [12:41]
diana_coman: the raw mechanism above works for at most 500 lines at a time so you'll need to do it in batches [12:41]
asciilifeform: ( i use another tool 'meld', for these, character-based differ, but requires x11 ) [12:42]
diana_coman: asciilifeform: well, this is pre-era and copied so they should be the same though, no? [12:42]
asciilifeform: diana_coman: these indeed oughta be 100% same [12:42]
asciilifeform: was speaking of the general case [12:42]
diana_coman: so shrysr see, it's up to you if you make out of this task a read-twice task or a learn-to-automate-with-bash-and-curl task [12:42]
asciilifeform: ( and imho we are gonna need some semi-automatic tool for this job ) [12:42]
diana_coman: in either case...win-win, though. [12:43]
diana_coman: asciilifeform: we certainly need a semi-automatic tool for sync, more generally, absolutely. [12:43]
diana_coman: shrysr: where are you lost in there? [12:44]
shrysr: hmm ...i was actually thinking when the review task was given - to use R to connect to the DB snapshot - get each line into a CSV file and figure out a way to grep through, and was thinking the same for the comparison in fact. [12:44]
diana_coman: shrysr: massively bloated solution [12:45]
asciilifeform: shrysr: db snapshots are the wrong place to do it, as they tend to be 1x/day , and the exact time will differ by what the various boxes think the time of day is [12:45]
diana_coman: is it driven by your lack of knowledge of bash/curl or what? [12:45]
asciilifeform: shrysr: the clocks aint synced and i dun think they ever will be [12:45]
asciilifeform: as for 'r', i dun even have it on any of my machines. [12:47]
shrysr: ah ok..yes, i'm not fluent in bash scripting per se, but i've always wanted to - so ok. wd like to go the bash/curl way. [12:47]
diana_coman: eh, he wants it for data-diddling; not even a bad tool at all *for that*; but this here is a bash+curl+diff job [12:47]
asciilifeform: shrysr: generally folx won't take up 'here let's use my oddball prog lang' w/out a ~very~ good reason [12:47]
asciilifeform: diana_coman: rright but if he does it in cobol or whatever, he'll be the only 1 who can run.. [12:48]
diana_coman: shrysr: man curl is your friend; but seriously, it's easier than you think so go for it; and *ask* if you get stuck, don't waste time. [12:48]
shrysr: yes, i und. its exactly as diana_coman said tbh. [12:48]
shrysr: i.e data-diddling. but i go for it.. and will revert. [12:49]
diana_coman: cool. [12:50]
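
A minimal sketch of the bash+curl+diff approach discussed above, for anyone wanting to repeat the comparison: it walks an id interval in batches of 500 lines (the raw knob's stated limit), pulls the same slice from both loggers, and diffs the two. The id bounds here are placeholders rather than values taken from the log, and iend is assumed inclusive, going by the example URLs above.

    #!/bin/bash
    # Compare the #ossasepia log as served by the two loggers, via the raw knob.
    BASE_A="http://logs.ossasepia.com/log-raw/ossasepia"
    BASE_B="http://logs.nosuchlabs.com/log-raw/ossasepia"
    START=900000   # placeholder lower bound on line id
    END=1000000    # placeholder upper bound; any differences expected at id < 1000000
    STEP=500       # the raw knob serves at most 500 lines per request

    i=$START
    while [ "$i" -le "$END" ]; do
        j=$((i + STEP - 1))
        curl -s "${BASE_A}?istart=${i}&iend=${j}" > a.txt
        curl -s "${BASE_B}?istart=${i}&iend=${j}" > b.txt
        if ! diff -q a.txt b.txt > /dev/null; then
            echo "difference in batch ${i}-${j}:"
            diff a.txt b.txt
        fi
        i=$((j + 1))
    done
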
diana_coman: shrysr: how's it going? [15:56]
shrysr: working on it. This is fun. not stuck yet, but figuring out how to assign values to variables in bash and include in curl for start and end id's [15:57]
diana_coman: heh, enjoy then. [15:58]
shrysr: diana_coman: hmm - i tried the raw log for id > 1000000 - both visual and diff inspection seem to indicate they are still the same, i.e in reference to http://ossasepia.com/2020/04/20/ossasepia-logs-for-19-Sep-2019#1003089 ? [17:31]
ossabot: Logged on 2019-09-19 12:41:37 asciilifeform: diana_coman: dun work w/ conventional line-based unix diff, as the timestamps (at least post-1000000, current era) are ~always different [17:31]
asciilifeform: shrysr: it'll all be same up to the point where diana_coman plugged in ossabot [19:03]
asciilifeform: shrysr: elementarily, errything prior to that is either from diana_coman's era2 dump (so same in both) or from snsabot's db (again same in both, diana_coman imported) [19:03]
asciilifeform: in entries logged by snsabot and ossabot in realtime, the timestamps (which come from local machine, always, not fleanode) will differ. [19:04]
asciilifeform: ( the ordering of lines will also, at certain times, differ. ) [19:04]
shrysr: aye. I saw what you mean while playing ... probably somewhere around 1000900+. [22:46]
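
For the post-cutover entries asciilifeform describes, where both bots logged in real time and the timestamps (and occasionally the line ordering) differ, a plain line diff keeps flagging harmless mismatches. One possible workaround, sketched here, is to drop the timestamp field and sort by line id before comparing. The field layout assumed below (id;timestamp;speaker;text) is a guess at the raw output, not something stated in the log; adjust the cut and sort fields to whatever the raw knob actually emits.

    #!/bin/bash
    # Timestamp-insensitive comparison of two raw log slices, a.txt and b.txt,
    # fetched as in the earlier script.
    # ASSUMPTION: each raw line looks like  id;timestamp;speaker;text
    strip_ts() {
        # keep field 1 (id) and fields 3 onward, drop field 2 (timestamp),
        # then sort numerically by id so ordering differences wash out
        cut -d';' -f1,3- "$1" | sort -t';' -k1,1n
    }
    diff <(strip_ts a.txt) <(strip_ts b.txt)
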
