galenanorth
May 19, 2016

Trying out a program that systematically scrapes a store locator page that requires address input. My approach is to make requests over a grid of latitude/longitude coordinates and use Google to automatically convert each coordinate to an address. The problem is that some of the grid coordinates will be in the ocean and not have addresses, so that scenario will probably require a long workaround.

For now, I'm skipping those coordinates and hoping that won't cause the scan to come up short. The workaround will probably involve automatically finding the nearest ZIP code from this spreadsheet, which has a coordinate associated with each ZIP code.

Edit: I found the workaround. It involves splitting any grid unit centered in the ocean over and over until reaching a grid unit of a specified minimum width, keeping the subunits that are on land, and discarding ocean grid units once they've been split so much that their width falls below that minimum.
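
Roughly, the splitting looks like this. This is just a sketch: is_on_land is a stand-in for whatever land/ocean test actually gets used (an address lookup, a coastline dataset), and the minimum width is an example value.

```python
MIN_WIDTH_DEG = 0.05  # stop splitting below this cell width (example value)

def is_on_land(lat, lon):
    """Placeholder: return True if this point has a usable address."""
    raise NotImplementedError

def land_cells(lat, lon, width):
    """Yield land cells; split ocean-centered cells into quarters until too small."""
    if is_on_land(lat, lon):
        yield (lat, lon, width)
        return
    if width / 2 < MIN_WIDTH_DEG:
        return  # the subunits would fall below the minimum width: discard
    for dlat in (-width / 4, width / 4):
        for dlon in (-width / 4, width / 4):
            yield from land_cells(lat + dlat, lon + dlon, width / 2)
```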

galenanorth fucked around with this message at 11:28 on Oct 23, 2018


galenanorth
May 19, 2016

The worst thing about web scraping is having to wait as long as 20 hours to see if a program collected all the records so that the number of records matches up with what it should be. I'm waiting for it to finish so I can see if my fix for the "no address exists for a latitude-longitude coordinate in the ocean" problem works

I guess I could have told the program to collect all the locations in a boxed area bounding the borders of Maine instead of the entire U.S., but it's been long enough I might as well finish it now

galenanorth fucked around with this message at 23:20 on Oct 23, 2018

galenanorth
May 19, 2016

Yes, I am rate limiting at about one second per request. Nearly all the requests are to Google and they have a 40,000 free requests/month cap so I'm hoping the scan completes before reaching that amount. I realized that I should start saving every Google response in a file, so when I repeat the scan to check whether anything has changed, I won't have to send the requests again. That'll allow the scan to progress 100x faster after the initial scan.
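
The caching plus rate limiting boils down to something like this. A sketch assuming the requests library; the directory name and the one-second delay are placeholders.

```python
import os
import time

import requests

CACHE_DIR = "google_cache"  # placeholder path
os.makedirs(CACHE_DIR, exist_ok=True)

def cached_get(url, cache_key, delay=1.0):
    """Return the response body for url, reusing a saved copy when one exists."""
    path = os.path.join(CACHE_DIR, cache_key)
    if os.path.exists(path):
        with open(path, encoding="utf-8") as f:
            return f.read()          # cache hit: no request, no delay
    time.sleep(delay)                # crude rate limit, about one request per second
    body = requests.get(url, timeout=30).text
    with open(path, "w", encoding="utf-8") as f:
        f.write(body)
    return body
```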

galenanorth
May 19, 2016

bob dobbs is dead posted:

next time, do small samples first, do the caching, do cache clearance. dunno which scraping lib you're using

Well, I decided to cut my losses and reduce the search area from the entire U.S. to a rectangle bounding Maine. There was a problem where the Walmart store locator returns zero results when it can't interpret the address, rather than returning an error, and I expect the other store locators that require addresses will do the same thing. I was able to work around it and detect all Walmart locations in Maine by having the program go a level deeper and subdivide the grid unit into four smaller units, so that at least one of the addresses generated by Google would likely be valid.

I implemented server response caching, though I still have to do some refactoring so the program doesn't pause on cache hits the way it does when it's actually requesting from the server, and that's going well so far

Edit: I couldn't find a scraping lib that handles geospatial scraping from store locators over a grid, aside from this GitHub project. I had to make several months of modifications: it was written as if by someone used to another programming language, in terms of indentation and style, and it tracked a lot of statistics and logging information I didn't need, which made it harder to read. It also had bugs in the core algorithm, like doing each radius search with a circle inscribed in the square grid unit instead of the square inscribed in the circle, so locations in the gaps between circles were missed. It didn't keep track of unique records either, so there were duplicates. That sort of thing.

galenanorth fucked around with this message at 19:48 on Oct 25, 2018

galenanorth
May 19, 2016

I finished writing all my programs for scraping data from McDonald's, Subway, Walmart, Walgreens, CVS Pharmacy, U.S. Bank, and Fifth Third. Essentially, I wrote a lot of scraping classes from scratch which involve calculating finer coordinates over a grid, and also one which spiders over a business directory. It's going to take a few days to run them and collect the latest data, then I'll put them online, since I already finished making the website. After that, I'll register as an official business, sign the business up for a PayPal account, and take out $50 in Google Ads. The Google Ads buy comes with some deal where they throw in an extra $100 in ads if the ad placements do well enough.

I put about 900 hours into the project so far, so I hope it works. I've run into a lot of problems. For example, at first I cached responses using the request querystring as the file name; then I hit a bug the first time a URL exceeded 255 characters, the maximum file name length, and it took eleven hours to work around that and switch to another scheme. If it doesn't work out, at least I can call it a challenging and engaging hobby.
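
One common workaround for the file-name length problem (not necessarily the scheme I switched to) is to hash the URL so every cache file gets a short, fixed-length, filesystem-safe name:

```python
import hashlib
import os

CACHE_DIR = "cache"

def cache_path(url):
    """Map an arbitrarily long URL to a fixed-length, filesystem-safe file name."""
    digest = hashlib.sha256(url.encode("utf-8")).hexdigest()  # always 64 hex chars
    return os.path.join(CACHE_DIR, digest + ".json")
```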

galenanorth
May 19, 2016

NihilCredo posted:

that's great, but uh, what is the service you're selling, again?

I'll be selling CSV files containing scraped location data, competing with https://www.aggdata.com, https://www.redliondata.com, and https://www.scrapehero.com/store/shop/
The third one is the latest entrant to the market other than me, I feel reasonably sure. I'll also take web scraping job suggestions. As long as I can resell the collected data to other people, it'll be cheaper for customers than paying an hourly rate, because I'll be charging a flat $50-$100 fee, but I'll also be willing to work at an hourly rate if the client doesn't want the data shared with anyone else. Aggdata.com makes about $2.1m in annual revenue, so it'd be great if I could get a fraction of that. I think I'll eventually transition to a GIS or custom mapmaking service, something so that I'm not in direct competition with three other businesses

galenanorth fucked around with this message at 16:25 on Feb 4, 2019

galenanorth
May 19, 2016

Is this URL "502 Bad Gateway" for everyone or just me? https://geo.fcc.gov/api/census/block/find?format=json&latitude=36.05891418457031&longitude=-89.39492797851562

galenanorth
May 19, 2016

Okay, good. I was using it last night to fill out the missing "County" field for 50k records for my web scraping project, with a two-second delay. I'd get "502: Bad Gateway" three times in a row, then it'd go back to normal. I figured this sort of error meant I could keep having the program retry, like someone trying to get through on a busy line during a radio station's "sixth caller gets a free T-shirt" contest, and for a moment I was thinking "I didn't kill it, did I? :ohdear:"
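
The retry loop is about this simple. A sketch assuming the requests library, with made-up delay numbers:

```python
import time

import requests

def get_with_retries(url, tries=5, base_delay=2.0):
    """Retry transient gateway errors (502/503/504), waiting a bit longer each time."""
    for attempt in range(tries):
        resp = requests.get(url, timeout=30)
        if resp.status_code not in (502, 503, 504):
            resp.raise_for_status()   # surface non-gateway errors immediately
            return resp
        time.sleep(base_delay * (attempt + 1))
    raise RuntimeError(f"gave up on {url} after {tries} attempts")
```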

Using the API was always a stopgap; I should replace it with client-side GIS soon. Even after the API comes back up, it's probably going to take three days to finish filling in the rest of the County field

galenanorth fucked around with this message at 16:38 on Feb 9, 2019

galenanorth
May 19, 2016

I switched from the FCC Block API to OpenStreetMap for getting the county data, as my placeholder until I get a more permanent solution like client-side GIS in place. It's a lot less reliable and straightforward than the FCC Block API for getting the county: sometimes the 'county' field is missing and I have to check whether the 'town' and 'city' fields contain the word 'County', or whether the location is in Virginia, where independent cities stand in for counties. A few records out of 5,000 didn't have any county data at all, but it's better than nothing since the FCC API is still down.
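
The fallback logic ends up looking something like this, assuming an OpenStreetMap/Nominatim reverse-geocode response whose address dict is already parsed. The exact keys Nominatim returns vary by location, so treat the field names as assumptions.

```python
def county_from_nominatim(address):
    """Best-effort county extraction from a Nominatim reverse-geocode address dict."""
    if "county" in address:
        return address["county"]
    # Sometimes the county name only shows up in the town or city field.
    for key in ("town", "city"):
        value = address.get(key, "")
        if value.endswith(" County"):
            return value
    # Virginia's independent cities are county-equivalents with no county field.
    if address.get("state") == "Virginia" and "city" in address:
        return address["city"]
    return None
```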

galenanorth
May 19, 2016

I'm running the last CSV-creating program, for all the Fifth Third Bank branch locations and ATMs, before I register as an official business. It's got 3 days to go until it finishes running because the FCC's API for adding county data given lat-long data is slow and I've been putting off learning to do client-side GIS with Python

In the meanwhile I'm getting started on scraping data from Kroger, a grocery store chain which has its data available as a JSON object buried in the HTML source code of its store locator page. After I list my business on Google Ads for a month, if it doesn't bring in enough money, I'll get a full-time job and switch to working on the project on weekends. My parents have been sick and my mom is getting colon surgery soon. The project is the only reason I haven't gotten a job yet, but I can always quit the job if I need to in order to take care of my mom when she gets back from surgery.
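
For store locators like Kroger's that bury the data as a JSON object in the page source, the extraction is roughly this. The variable name storeData and the regex are assumptions about how such a page is laid out, not Kroger's actual markup.

```python
import json
import re

import requests

def embedded_json(url, var_name="storeData"):
    """Pull a `var storeData = {...};`-style JSON object out of a page's HTML."""
    html = requests.get(url, timeout=30).text
    match = re.search(rf"{var_name}\s*=\s*(\{{.*?\}});", html, re.DOTALL)
    if match is None:
        raise ValueError("embedded JSON not found")
    return json.loads(match.group(1))  # fragile if the blob isn't strict JSON
```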

galenanorth fucked around with this message at 19:21 on Feb 18, 2019

galenanorth
May 19, 2016

I tried finding out the prospective competition's pricing for annual subscriptions and got a "we only answer inquiries from corporate email addresses, not anonymous ones" reply

I'm just going to offer to price match the annual subscription, then, scaled down in proportion to the smaller amount of data my newcomer business has. I'll also price match custom data scraping work and see if I can keep up

galenanorth
May 19, 2016

A site I was scraping banned my bot three quarters of the way through the 4,000 requests of the final run to get all of its store locations' data. I tried seeing if there was a way to alter my HTTP headers that I hadn't tried yet, and I tried learning how to use proxies, but neither worked. It might be best to put this program on ice, scrape another company's data, and try again later, though I may run into the same problem again. I suspect they've got an anti-scraping tool that blocks all the proxies listed on the most popular free-proxy websites.

galenanorth
May 19, 2016

cinci zoo sniper posted:

yeah, if were an unethical, ruthless capitalist I’d just roll my vpn on digital ocean or similar

This practice is so entrenched in the web scraping industry that there are anti-anti-scraping tools like Crawlera, used against sites like Amazon and Walmart in a "they build a better mousetrap, we build a better mouse" competition. The way I see it, it's legal, the target is a large enough corporation that they might as well have all the dollars, all of my competitors are scraping it, and I haven't even launched yet so I have none of the dollars. It's unethical, but I'll do it anyway, while always being sure to leave large delays (e.g. 10 seconds) between requests. Thank you for the advice, everyone.

galenanorth fucked around with this message at 21:15 on Feb 25, 2019

galenanorth
May 19, 2016

Shaggar posted:

I wonder if its worth using goog cached pages for scraping

I tried Googling the URL of an example "store details" page. Sometimes it loses formatting, but in this case it worked. It's something down the list of techniques to resort to.

For now I'm aiming to split my time evenly between work that only applies to a specific company, work that will help expand the range of sites I can scrape in the long run, and work that improves efficiency. That seems like a good balance between being able to write data-spreadsheet-generating programs faster and still having something to sell in the meantime

galenanorth
May 19, 2016

I tried using the free version of the Hotspot Shield VPN, and that worked to get past the IP ban in-browser. Strangely, though, the server is unresponsive when my program makes the same request using the requests module. I tried randomizing the user agent on top of the IP address change from disconnecting and reconnecting the VPN, but that didn't work either. Tomorrow I'll have to resort to using Google Translate, the Google cache, or the Internet Archive.

galenanorth
May 19, 2016

Not yet. A long while ago I installed it and tried using it for a browser for a bit and then uninstalled it. I'll see if that works with Python's requests module later. Thank you

galenanorth
May 19, 2016

Penisface posted:

the tor network itself is separate, but you can use tor as a proxy as well to access regular websites - this is literally how people in censorship countries get to view sites that are blocked

you can use tor to proxy any request you have, which naturally includes your scraper script

How do I use Tor as a proxy on Windows 7? When I looked for a tutorial, I found https://www.marcus-povey.co.uk/2016/03/24/using-tor-as-a-http-proxy/ but it doesn't translate well to Windows 7. When I opened the torrc file in the Tor Browser installation folder, there wasn't anything to uncomment. I tried downloading the Expert Bundle, but it didn't contain a torrc file. I came across this StackOverflow Q&A that pointed me to Polipo. I think I'll go with scraping Google Translate, Google's cache, or the Internet Archive for now
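
For reference, the simplest no-torrc route on Windows is to point the requests module at Tor Browser's local SOCKS port while the browser is running. This assumes the requests[socks] extra (PySocks) is installed and the default port 9150:

```python
import requests

# Tor Browser listens on 127.0.0.1:9150 while it's running (the standalone tor
# daemon uses 9050). Needs `pip install requests[socks]` for SOCKS support.
# socks5h (not socks5) sends DNS resolution through Tor as well.
TOR_PROXIES = {
    "http": "socks5h://127.0.0.1:9150",
    "https": "socks5h://127.0.0.1:9150",
}

resp = requests.get("https://check.torproject.org/", proxies=TOR_PROXIES, timeout=60)
print("Congratulations" in resp.text)  # True if the request really went through Tor
```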

galenanorth fucked around with this message at 07:17 on Mar 1, 2019

galenanorth
May 19, 2016

The company whose locations I was trying to scrape lifted my ban, and I got all the rest of the possible responses cached. In a way, this isn't how I'd like it to have turned out, because it's like an opponent in a match letting me win. The VPN working in-browser but not with Python's requests module will remain a mystery. I have a HeaderlessScraperMixin for urllib3 for when a website doesn't seem to require any headers, and I switch to HeaderScraperMixin for the requests module otherwise, so my next approach was going to be learning Selenium and writing a mixin for that. I also found a proxy that isn't banned, so I think I'll go ahead and implement proxy juggling using the free website it came from.

galenanorth
May 19, 2016

I implemented proxy juggling in my web scraping project. For the test company, the only one I've come across that seems meticulous about its bot detection, it works as long as I only use a recent version of Chrome or Firefox for the user agent. I tried pairing it with user agent juggling, using the fake-useragent package as a shortcut because it's mentioned so often on StackOverflow. Its database is severely outdated, though, to the point of not including recent versions of Chrome and Firefox, and its statistically weighted randomness only extends to the browser type, so it keeps turning out old Chrome versions in the 20-40 range. The company's server is smart enough to block outdated browsers like Chrome v32, and the request just hangs until a ReadTimeoutError, meaning no packets are being read. It's good to be making progress with this and getting it straightened out.
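
The juggling itself is simple. A sketch with placeholder proxy addresses (TEST-NET IPs) and a hand-picked list of recent Chrome/Firefox user agents instead of fake-useragent:

```python
import itertools
import random

import requests

# Placeholder proxies; in practice these come from whatever proxy list is in use.
PROXIES = [
    "http://203.0.113.10:8080",
    "http://203.0.113.11:3128",
]

# Only recent Chrome/Firefox strings, since outdated versions get blocked.
USER_AGENTS = [
    "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) "
    "Chrome/72.0.3626.121 Safari/537.36",
    "Mozilla/5.0 (Windows NT 10.0; Win64; x64; rv:65.0) Gecko/20100101 Firefox/65.0",
]

proxy_cycle = itertools.cycle(PROXIES)

def rotated_get(url):
    """Fetch url through the next proxy with a randomly chosen recent user agent."""
    proxy = next(proxy_cycle)
    headers = {"User-Agent": random.choice(USER_AGENTS)}
    return requests.get(url, headers=headers,
                        proxies={"http": proxy, "https": proxy}, timeout=30)
```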

galenanorth
May 19, 2016

One of the more annoying parts of scraping an API is all the fields I'm not sure about, like 'curbside': 'Y' in the JSON; maybe I'll figure them out if I save them anyway and look at them again after running the statewide or national scan for store locations against the cached responses. There's a 'mobile': 'Y' which might be 'On-the-Go Mobile Ordering', but there's also an 'adv_ord': 'Y' which might also be 'On-the-Go Mobile Ordering'. There's also loyalty: 'Y', which I'd guess is the same thing as dunkincardenabled: 'Y'. A lot of this stuff doesn't look like it's even used in the website's store locator GUI.
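
One low-effort way to work out what the mystery flags mean is to tally every value each field takes across the cached responses. A sketch, assuming the store records are already loaded as dicts:

```python
from collections import defaultdict

def tally_fields(stores):
    """Map each field name to the set of scalar values seen across all store records."""
    seen = defaultdict(set)
    for store in stores:
        for key, value in store.items():
            if isinstance(value, (str, int, float, bool)) or value is None:
                seen[key].add(value)
    return seen

# e.g. if tally_fields(stores)["adv_ord"] is always {'Y'}, the flag never varies
# for this chain and probably isn't worth keeping as a column.
```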

I'll figure it out eventually. It'll probably take a few more hours, though. Right now my dad is outside in a wheelchair smoking and I'm going to need to wheel him back in, and I'll wait until I can concentrate more

galenanorth
May 19, 2016

Google's geocoding API was doing this thing where it associated 2808 S Church St, Murfreesboro, TN 37128, USA with a lat-long coordinate in Texas, even though it worked when I made the request in my browser by entering the URL directly. I think the server keeps track of user preferences, and that's hard to test for. When I added the component filter "components=administrative_area:TN" to the URL, that fixed the problem, and it stayed fixed even after I made another request with that part of the URL removed. It drove me crazy for about two hours
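
For reference, the Geocoding API's component filter looks like this in code; the documented component name is administrative_area, and the API key is a placeholder:

```python
import requests

GEOCODE_URL = "https://maps.googleapis.com/maps/api/geocode/json"

def geocode(address, state=None, api_key="YOUR_API_KEY"):
    """Geocode an address, optionally pinning results to a state via a component filter."""
    params = {"address": address, "key": api_key}
    if state:
        # Filters/biases results to the given administrative area, which avoids
        # the wrong-state matches described above.
        params["components"] = f"administrative_area:{state}"
    return requests.get(GEOCODE_URL, params=params, timeout=30).json()

# e.g. geocode("2808 S Church St, Murfreesboro, TN 37128, USA", state="TN")
```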

galenanorth
May 19, 2016

I noticed just now that Google's Geocoding API inconsistently labels some locations in Puerto Rico as having Puerto Rico in the "country" field, because U.S. territories each have their own code in the ISO 3166 standard, while other locations have Puerto Rico in the "administrative_area_level_1" field for subnational entities (the same field used for Quebec, Alabama, or Wales). I'm going to have to work around that while I'm working around a dozen other things. I've also run into Google geocoding errors severe enough to return a coordinate in the wrong state (rarer than 1 in 2,000, but still annoying). Little things like changing "United States" to "USA", removing the second address line, or dropping the comma that separates it from the first address line only sometimes prevent the "wrong state" or "zero results" errors
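
A small normalization pass handles the territory inconsistency. The record field names here are the scraper's own flattened ones, not Google's, so treat them as assumptions:

```python
# ISO 3166 gives Puerto Rico and the other U.S. territories their own country
# codes, so Google sometimes reports them as countries and sometimes as
# administrative_area_level_1. Fold both shapes into country="US", state=<code>.
US_TERRITORIES = {
    "Puerto Rico": "PR",
    "Guam": "GU",
    "United States Virgin Islands": "VI",
    "American Samoa": "AS",
    "Northern Mariana Islands": "MP",
}

def normalize_territory(record):
    """Rewrite territory-as-country records into the usual country/state layout."""
    name = record.get("country")
    if name in US_TERRITORIES:
        record["country"] = "US"
        record["state"] = US_TERRITORIES[name]
    return record
```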

galenanorth
May 19, 2016

I was about to register my business with the city clerk when I read that I need a Home Occupation License with permission from my apartment manager, and I'm staying with my parents in a retirement home so I can't get that. I guess I'll get a "real job", save up some money, try again in a year, and hope the #3 competitor ScrapeHero in this type of business hasn't pulled too far ahead of me by then. The website is www.locatr.tk if anyone wants to take a look at what I've done so far

galenanorth
May 19, 2016

How much effort should I put into expanding my programming skill set without getting paid for it? I only have three years of experience working with Python/Django, HTML5, CSS3, and JavaScript in personal projects. Every ad involves a different additional technology, like React or Angular or Celery or Redis or "AWS services like S3, CloudWatch, CloudTrail, Redshift, EC2". These other technologies aren't strictly required, so I've been figuring I'd dig into them on my own time right away if I managed to get the job anyway. I have a geoscience degree instead of a computer science degree, and I haven't had a remote programming job yet.

galenanorth fucked around with this message at 06:05 on Apr 3, 2019

galenanorth
May 19, 2016

welp, I've sent out eight remote Django job applications on Ziprecruiter (edit: and one on LinkedIn). Now all I have to do is wait and fill time with leisure activities

I guess if I don't hear anything within a week, I'll settle for remote call center work or something instead. I could learn NumPy and SciPy, which sounds interesting since I have a math minor, but I don't really want to do that unless I can get a job using them, and the remote listings on Ziprecruiter all seem to require a master's in analytics or statistics, or four years of experience to start with

If I can't find anything, I could launch my business anyway even though I can't get a Home Occupation License and see if I can make enough money to move out

galenanorth fucked around with this message at 23:46 on Apr 3, 2019

galenanorth
May 19, 2016

I'm not 100% sure about why React is useful. This Packt textbook starts by talking about the view, or the template in Django, and reduction of boilerplate code, but Django already has a template inheritance system built in for that. Maybe it's that it binds the inheritance of the HTML, the inheritance of the JavaScript, and the inheritance of the CSS into one inheritable app or widget. If that's true, I can see how that might be useful for something like creating custom form widgets without having the code in three separate documents. I see a lot of job listings with React in combination with Django, though, so I'll probably try it anyway and see where I end up.

galenanorth fucked around with this message at 01:47 on Apr 4, 2019

galenanorth
May 19, 2016

Lutha Mahtin posted:

are you able to get a PO Box or a coworking space?

also did u try "well im technically doing all the business online, so" :v:

No, but the thought of trying the "well im technically doing all the business online, so" line of questioning did occur to me. I couldn't get through to the Zoning Board and they wanted me to leave a message, and I'm hesitant about leaving details about something that could get me in trouble, but I'll call the county clerk's office and ask them instead. One of the nine places where I applied for a Django job wrote me

"There's been a lot of interest in the position making the selection process extremely competitive. Although your experience is impressive, it's not the best match for this specific role. I'll keep your resume and contact you if I come across other opportunities that may be a better match."

I'm going to work on getting an online tutoring job now

galenanorth fucked around with this message at 02:46 on Apr 9, 2019

galenanorth
May 19, 2016

They told me that whether I need a Home Occupation License would depend on how much money the business makes. I said "I have no idea", and they said they needed an estimate, but before I could answer they redirected me to a public works customer service desk that connects to a bunch of permit offices, since I'd need a permit if I'm working from home. In all likelihood, yes, I need a permit for working at home even if I'm only working from a computer, since they didn't react favorably when I told them the business essentially sells data in spreadsheet form. It's really annoying, and I wish I'd known at the outset that I'd need my apartment manager's permission

galenanorth
May 19, 2016

Does this sound like it suffices as an answer to "Personality counts! Take some time to craft the perfect answers to the following questions. Tell me why you would be an amazing fit" for a programming job?

"When I set a goal or need to solve a problem, I don't stop until it is done. I can learn new technologies quickly. I can work well with others. When I need clarification about how to meet specifications for a task, I ask the right questions."

I don't know the latest version of JavaScript (ES6), or React or Angular as listed in the job description, only HTML5, CSS3, and plain JavaScript, but I'm applying anyway because I was invited to apply. What should I ask for the salary? This would be my first programming job and I might want it badly enough to say $30,000.

My plan has been to try online tutoring, but first I need to reorganize my formula sheets and notes; if online tutoring doesn't work out, I'll make a serious effort to learn all these commonly required technologies and try the remote programming job search again. My parents don't want me applying for a job where I have to leave the home, because they think the retirement home manager is only letting me stay because I'm seen consistently taking care of them. So, I'm only applying for remote work.

Honestly, though, my parents keep me busy for about 4 hours a day spread over 12 hours taking care of them, and sometimes I have to lay down a firm "I'm working on something and I need to be uninterrupted, please". They have a caregiver over until 2pm about four days a week, but sometimes it's so chaotic I think it may be a severe impediment to working remotely. I suspect online tutoring would be better suited to cutting the connection when I need to take care of my parents than an online programming job where I have responsibilities and need to be continuously virtually present for meetings and sharing information, but I don't want to be someone who doesn't give things a chance, and this may work out

Edit: I went ahead and submitted it. Part of me is hoping I don't get it so that I can try the online tutoring thing and become better prepared

galenanorth fucked around with this message at 20:00 on Apr 28, 2019

galenanorth
May 19, 2016

I'm going to apply to some AngelList.LA "startup accelerator" with $50k-$200k in seed funds using the pitch

"Over the course of a year, I've created programs which scrape geolocation data and location information, such as store hours, for about a dozen different national U.S. corporations, including restaurants, supermarkets, banks, and pharmacies. For example, for companies that let their customers find locations via a radius search, the program performs a sweep over a grid.

The company then sells the data as CSVs. There are three main competitors: Aggdata.com, Redliondata.com, and Store.scrapehero.com. The first of these makes about $2.5 million per year, according to the market analysis company Owler. My spreadsheets often contain more columns than Aggdata.com's and are better formatted. I live with my parents in an apartment where I can't get the landlord's permission to start a business. Fund the first year or two of operation, and you'll receive access to all of the programs and data, plus a large enough share of the profits to be glad you encountered this opportunity."


I don't really like the idea of going to LA at all, but I really need the money. On my city's website, the notice that I had to have permission from my apartment owner to get a home occupation permit was buried three links deep through the "business advice" section. I didn't see it until I had done a year of work and was ready to start selling CSVs, and not checking for that was really dumb.
:negative:

galenanorth fucked around with this message at 16:28 on Sep 25, 2019

galenanorth
May 19, 2016

Penisface posted:

maybe dont mention that you live with your parents?
also isn’t there a way to start a business with an address in delaware or some poo poo like that?

The state sales tax seems to apply as long as an employee is present in the state, unfortunately.

galenanorth
May 19, 2016

Penisface posted:

no, what i mean is that i find it impossible to believe that you can not open a business in the US unless your landlord lets you - aren't there PO boxes, shared offices or whatever you call the establishment that you pay some money to so they let you have an "office" there?

like isn't the US supposed to be the cradle of entrepreneurship where you can fart out a business in a matter of seconds??!

A shared office would work, but I don't have the money for even that at this time.

http://www.nashvilleclerk.com/business/obtaining-a-business-license/ -->
http://www.nashvilleclerk.com/business/starting-a-new-business-in-davidson-county/ -->

Nashvilleclerk.com posted:

15. You may consider operating your small business from my home, under certain limitations, and with a “Home Occupation Permit”. Permits are required for all home occupations and may be obtained through the Department of Codes & Building Safety at a cost of $50.00. Contact our Zoning Division at 862-6500 to obtain complete information regarding current requirements. The basic limitations include: There can be one employee associated with a home occupation that is not a resident on the property. There can be no signs or advertising of a business at the residence (except in the areas of child care, or tutoring). One vehicle, not exceeding an axle load of 1.5 tons is permitted, but no advertising is permitted on the vehicle. A reliable “rule of thumb” is: if you can see it, hear it, or smell it, the business is not allowed as a home occupation.
https://www.nashville.gov/Codes-Administration.aspx -->
https://www.nashville.gov/Codes-Administration/Land-Use-and-Zoning-Information/Zoning-Examinations/Home-Occupation-Permits.aspx

Nashville.gov posted:

If you rent, you will need a letter from the management of your apartment complex or the landlord of your rental property permitting you to have a home occupation.

galenanorth
May 19, 2016

edit: post removed and transferred to the job hunting thread

galenanorth fucked around with this message at 07:27 on Oct 5, 2019

galenanorth
May 19, 2016

Krankenstyle posted:

fwiw i would prefer github link or similar over google drive

I haven't put anything on GitHub because I had the attitude that I don't want to share something I created for free with everybody online; it's like being asked to get paid in exposure, and other industries wouldn't tolerate that, so programming shouldn't tolerate it either. But I'll try anything at this point. I created a Django add-on that screens file uploads against file specifications, such as maximum file size or media dimensions, and I think a lot of people will find it very useful. It's very specific and can handle video data and photo EXIF data. You might've seen the photos of the cats in the YOSPOS thread that render at a 90 degree angle until you open them in a new browser tab, right? Well, the EXIF-handling code takes care of that.

Edit: I mean, I know a few Django projects that could benefit humanity by using that code if they chose to incorporate it, since I saw an ad for a Django programmer for the international database of exoplanets. I don't want some corporation on the moral scale of Facebook or Twitter incorporating it without paying me, though.
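
The two pieces boil down to something like this. A sketch assuming Django and Pillow, not the add-on's actual API: a maximum-size validator for a FileField, and an EXIF-orientation fix that re-saves photos upright.

```python
from django.core.exceptions import ValidationError
from PIL import Image, ImageOps

MAX_UPLOAD_BYTES = 5 * 1024 * 1024  # example 5 MB cap

def validate_max_size(uploaded_file):
    """Reject uploads over the cap; goes in a FileField's validators list."""
    if uploaded_file.size > MAX_UPLOAD_BYTES:
        raise ValidationError("File is too large.")

def normalize_orientation(path):
    """Apply the EXIF orientation flag and re-save, so photos no longer render
    sideways in viewers that ignore EXIF."""
    with Image.open(path) as img:
        upright = ImageOps.exif_transpose(img)
    upright.save(path)
```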

galenanorth fucked around with this message at 18:06 on Oct 5, 2019

galenanorth
May 19, 2016

I'm still going back through my notes and revising them to include things I missed, like jQuery's deferred objects, since those weren't mentioned in the Ajax section of my jQuery reference book. Monday, Tuesday, and Wednesday, I'll call the local zoning office to see if I can reach somebody, and leave answering machine messages if I can't. If I can't get in contact with anyone, I'll go ahead and launch the website despite not having a home occupation license, and within a few months it'll either do well enough that I can get an office or it'll fail anyway. My friend offered to pay me $30/session to tutor math, starting by the end of the month, so that's where I can get the money for hosting and Google ads. While I'm waiting on that, I'll try submitting my notes to a publisher in case there's a chance of that going anywhere.

None of the six applications I submitted for remote front-end or Django web development jobs have worked out. I kinda want to skip plans A through E, with plan E involving an online tutoring job or remote call center job; if those plans don't work out, it'll probably take three more months to work through them all. I want to skip straight to plan F of working locally at a minimum wage job until I can get hired for a local geology or web development job on the bus line. The manager lets me stay with my parents at the retirement home because I'm my parents' caregiver, although not off-the-books, and my parents don't want me to rock that boat. I'm almost done with everything, but it's been nearly four years since I graduated with a geology degree without having saved up money for afterward, and I don't want to stay here forever.

edit: The chapter was in there; I just didn't see it because it was past the chapter on writing jQuery plugins, so I thought it was something extra I didn't need. I'm using "Bibeault, Bear et al. jQuery in Action. 3rd ed., Manning Publications Co., 2015."

galenanorth fucked around with this message at 15:49 on Oct 14, 2019

galenanorth
May 19, 2016

edit: nm

galenanorth fucked around with this message at 00:36 on Nov 1, 2019

galenanorth
May 19, 2016

I spent an hour and a half looking for jobs this morning. I looked through some employment agency websites and most of their positions, minimum wage type or not, seem out-of-state. I applied to a UX design job, but they want experience working with all these programs and I've only worked in CorelDraw. It's a little depressing. I applied to a Mapco gas station and they wanted me to be available seven days a week, and told me to apply again if my availability changes. I'm not available on Thursdays because I'm tutoring a friend in algebra for $30/session, and I'm not available on Fridays because a family friend takes me grocery shopping for my parents that day. The tutoring commitment ends in mid-December, so I'll start saying I'm available seven days a week after that. I have a geology degree, but all the local listings require having taken the license test ($600) or a driver's license. I'm signed up with an employment agency, but the job listings I received last week weren't on the bus line.

I could get started on learning some new skills in addition to my Sass/React-level knowledge of web development, like SQL databases such as PostgreSQL (I've only worked with the Django ORM so far), C++, and Adobe Photoshop, and hope that then I'll be able to get a job. I'll go ahead and give that a chance. I'm also considering running for office in a race where the incumbent is usually unopposed, and the deadline for getting started is probably a few months away, so I need to decide soon if I'm going to do that. The two things I'd want to do first are make a WordPress website and create presentation software that uses Selenium to automatically visit a different news article and highlight a different passage each time I click, so I don't have to think about it while I'm talking and making videos.

galenanorth fucked around with this message at 16:43 on Nov 4, 2019

galenanorth
May 19, 2016

I started learning how to use Visual Studio Code and completed the installation and configuration after 7 hours

I tried using the CodeLite IDE, but the output pane is blank when I try to debug. Someone suggested it could be because of a missing DLL, and another person said it's because of a space in a directory name somewhere, so I switched from C:\My Name\My Documents\Documents to C:\CodeLite\CodeLite, but that didn't work either. This is all such an ordeal compared to how Python and IDLE are up and running five minutes after downloading and installing them. I don't really need CodeLite anyway; I was just curious because my old university switched from vi and SciTE to it.

galenanorth
May 19, 2016

A year ago, I started an online web scraping business called Locatr and closed it down when I realized I lived somewhere that doesn't meet the zoning requirements unless I get permission from my landlord. I needed the $100 or so in launch money for Google ads for personal expenses anyway. I didn't make a single sale, so I closed down my business tax account and my sales and use tax account with the state government, owing a grand total of $0. Today I got a notice that my taxes are due April 15, so I logged into the website, and they tell me I owe a minimum business tax of $44, $22 for the state and $22 for the city, and I don't have that much in my bank account. They've also got an error under Sales and Use Tax saying I didn't have a location registered, some error in the closing process, so I'll have to call a number to see if there's a separate minimum charge for that. The information that I'd have to pay a minimum tax was on

https://www.tn.gov/revenue/taxes/business-tax.html

and I didn't notice it, or I forgot about it. Luckily I have $1,000 in credit on my credit card in case of emergencies. They added a $1.01 fee for paying that way.

I was planning on re-opening the business once I'd saved up enough money and finished studying more programming skills. The site sells .csv files, built by scraping the companies' sites directly, with more up-to-date store location information than even Google offers. Before then I should also study PostgreSQL and store the records in a proper database, and finish studying C++.

If I had known I'd be charged such a high minimum, I might have doubled down and taken a risk with the $100 in Google ads. I had about 20 CSVs for chains like McDonald's, Fifth Third Bank, and Walmart.

galenanorth fucked around with this message at 15:56 on Mar 19, 2020


galenanorth
May 19, 2016

I'm reading through Learning PostgreSQL 11: A beginner’s guide to building high-performance PostgreSQL database solutions, 3rd Edition by Juba and Volkov, from Packt. It's a little annoying that they use terms explained later in the book and seem to explain things backwards, leaving the definition of a term as the last sentence in a section. I used to think that having used PostgreSQL as a Django backend was probably enough to fulfill a job's PostgreSQL requirement, but at the same time I kind of knew that wasn't true. When I'm done studying PostgreSQL and C++ and have switched from Corel Draw 10 to Adobe Photoshop, I'll apply to more places again.

I'd been using Corel Draw 10 for web development because my professor used it in a class on drafting in geology, but the program doesn't support opacity, and it isn't useful for getting a job web developing for somebody else. It has an annoying UI problem where 100% zoom doesn't mean "1 inch on the screen equals 1 inch on paper"; it refers to whatever ratio the developer considered optimal for vector drafting. Deep in the GUI, like under Tools > Options > Workspaces > Toolbar > Zoom options, there's a "1:1 zoom ratio" checkbox that fixes it, but it really should be checked by default. Even after that it does this weird thing where it multiplies my input by 0.89, so I have to enter 112% to get it to show 100%. 100% zoom only behaves the way I expected in Print Preview mode.

galenanorth fucked around with this message at 19:27 on Apr 2, 2020
