Hello! 👋

  • 0 Posts
  • 50 Comments
Joined 1 year ago
Cake day: June 11th, 2023


  • For me, Ubuntu and Pop OS have just worked with all my hardware. I have tried Manjaro (it broke often after updates, so I gave up on it; this was maybe 5 years ago), Arch (worked great, but I got tired of the tinkering), Mint (it never wanted to work with my wifi and Bluetooth adapter, so I don’t have a ton of experience with it), Debian (too long ago for me to remember why I switched), Ubuntu, and now Pop OS for 3 years, and I won’t switch. I have only had problems with Yakuza 1 (couldn’t save, and the input delays were crazy) and Mafia 1 (all icons for the gamepad were blank). I also own a Steam Deck and play a lot of games on it without any problems. But I don’t mod that much; Skyrim is the only one I heavily modded, and I played it on Windows. I did mod Cyberpunk some on my Steam Deck to fix bugs, but it was below 30 mods.

    Tldr; I am with kaldo! Pop OS is great! And yes, I use an Nvidia GPU and I have never had a problem with the drivers.


  • I am with you. We should use AI as a tool to automate or remove the things that are frustrating or in the way of the actual goal: helping the customers. Plus, I don’t think any model is good enough (yet) to act as tech support (they could just use OpenAI if it were). I still think AI is great as a tool, though. For example, you can use it to go through a lot of documents about products, policies, other tickets and so on, so the tech support person can find the relevant information faster. We can also use AI to create summaries of a call, take notes, and so on. There is a lot of great potential to make everyone happier, but I don’t believe in replacing actual people.
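    A minimal sketch of that “find the relevant information faster” idea: scoring a pile of old tickets against an incoming question with plain keyword overlap. The ticket texts are made up for illustration, and a real setup would likely use embeddings or an LLM rather than word counting.

```python
# Toy retrieval over old support tickets: rank them by how many words they
# share with the incoming question. Purely illustrative data and scoring.
def score(query: str, document: str) -> int:
    """Count how many distinct query words appear in the document."""
    query_words = set(query.lower().split())
    doc_words = set(document.lower().split())
    return len(query_words & doc_words)

tickets = [
    "Customer could not reset their password after the last update",
    "Refund policy question for orders older than 30 days",
    "Printer driver fails to install on Windows 11",
]

question = "how do I reset a password"
ranked = sorted(tickets, key=lambda t: score(question, t), reverse=True)
print(ranked[0])  # the most relevant old ticket comes out on top
```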




  • Not a db, but I just want to share one reason, based on what happened at the startup I was working for.

    The owners were weighing keeping business as usual, which meant paying the employees more, against scaling up, which is very expensive; they only had small to medium-sized companies as customers (but many of them). Then a big company came along from a different country, on a shopping spree buying up a lot of companies (scaling up and taking over the market). The owners of the company I worked at were close to or above 65, so they saw it as an opportunity: if they sold, they would not have to worry about money after retirement. So they did. They did believe the company would be taken care of, but I think they also looked away from the bad stuff, wishing it would turn out great. Almost everyone left the company after a year or two (myself included); it was a sinking ship. Same goes for the other companies that were acquired.

    Tldr; they sold the company to get retirement money while hoping it would be taken care of. It took only a year for people to start leaving because of how bad it got.



  • Same (Proton and uBlock), but I have also found that some web pages care which browser you use if you are on Proton. If I use Chrome, they may just do the 3-second wait verification, but with Firefox and Proton (not without Proton) I get a lot of captchas, sometimes back to back just to make triple sure I am not a bot… or I even get blocked entirely…




  • I am getting scared… That is not normal pay here for an experienced developer. Who gets over 10k a month?! Sign me up! I would say even 100k a year is a lot for someone; 60k to 80k is more normal. But we also get paid vacation days (30 of them), plus all of the paid holidays and half days, paid sick leave (80% of your pay), and a monthly pension contribution (4-6% of the pay). Still, that does not cost a company 120k-140k (rough math below), and that was considered low?..

    Does everyone think this is normal in the US?!
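    A very rough back-of-the-envelope check of that claim, as a minimal Python sketch. The salary and pension figures are the ones from my comment above; employer payroll taxes and fees are not included and vary by country, so this is only an illustrative lower bound, not a real cost calculation.

```python
# Rough check: does an 80k salary plus the benefits mentioned above get
# anywhere near 120k-140k? (Payroll taxes/fees deliberately left out.)
salary = 80_000          # upper end of the "more normal" 60k-80k range
pension_rate = 0.06      # upper end of the 4-6% pension contribution
pension = salary * pension_rate

# Vacation days, holidays and sick leave are paid time off, i.e. they are
# already covered by the salary figure rather than an extra cost on top.
total_direct_cost = salary + pension
print(f"Salary + pension: {total_direct_cost:,.0f}")  # 84,800 -> well below 120k
```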




  • I don’t know about other countries, but in the Nordic countries it was not a day for getting presents until after 1600, so there may have been a time with less visible capitalism. The presents started as gifts to people in need and later became something you give to family and friends. The day wasn’t even a Christian holiday at first, not until we converted; it was called midwinter (no presents, just a celebration). But like the person who responded to you before me said: capitalism has been around a lot longer.



  • I didn’t know that it was seen as a bad thing by some devs. At my company (consulting) we say that we have failed if we have to spin up a full server. We do infrastructure as code very often, and that wouldn’t be as easy, or even possible, with a full server as it is with serverless. It is also easier to monitor what costs money (and what needs more performance) that way. I have seen some people wish they could get into the server, but you don’t have to; that is the whole point. All your configuration is done in a portal like Azure’s. The only times (extremely few) I have gone into a serverless app were to check the configuration of a very old app that may have been deployed manually (I get surprised every time) where I don’t know the values that need to be set, and there have been times when logging was done to disk instead of using Application Insights. Thankfully these are exceptions, not the norm; it is usually an application that was a fire-and-forget project and has always worked, until someone wants new functionality.
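    For anyone who hasn’t seen it, this is roughly what one of those serverless pieces looks like: a minimal Azure Functions HTTP trigger sketch in Python (v1 programming model, bindings in function.json not shown). The point is that there is no server to log into; settings come from the app configuration, and logging ends up in Application Insights rather than in a file on disk, assuming the Function App has Application Insights enabled.

```python
# Minimal Azure Functions HTTP trigger (illustrative sketch).
import logging

import azure.functions as func


def main(req: func.HttpRequest) -> func.HttpResponse:
    # Goes to Application Insights (when enabled), not to a local log file.
    logging.info("Request received")
    name = req.params.get("name", "world")
    return func.HttpResponse(f"Hello, {name}!", status_code=200)
```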



  • Ooof, I am glad you don’t have to do it anymore. I have a customer who is in the same situation. The company behind the site was also OK with it (it was a running joke that “this [bot] is our fastest user”), but it was very sketchy because you had to log in as someone to run the bot. Thankfully they always told us when they made changes, so we never got surprised.
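    For context, the “log in as someone to run the bot” part usually looks something like this sketch with requests: authenticate once, then reuse the session cookies. The URLs, form field names, and credentials here are hypothetical, and a real site may add CSRF tokens or a JavaScript-driven login that this does not handle.

```python
# Hypothetical authenticated-bot sketch; example.com stands in for the real site.
import requests

session = requests.Session()

# Log in once; the session object keeps the cookies for later requests.
session.post(
    "https://example.com/login",
    data={"username": "bot-account", "password": "secret"},
)

# Subsequent requests are made "as" that user.
page = session.get("https://example.com/reports/latest")
print(page.status_code, len(page.text))
```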



  • This kinda reminds me of pirating vs paying. Using an API = you know it will always have the same structure and you will get the data you asked for, and if the structure does change, you will be notified, unless they just version their API. There is usually good documentation, and you can always ask for help.

    Scraping = you need to scout the whole website yourself. You need to keep up to date with the website’s structure and make sure they haven’t added ways to block bots (scrapers). Error handling is a lot more intense on your end: missing content, hidden content, querying for data. The website may not follow the same standards/structure throughout, so you need checks for when to use x to get y. The data may need multiple requests because, for example, they don’t show all the user settings on one page the way an API call would. Or it is an AJAX page and you need to run JavaScript, click buttons whose id, class, or text may change, and handle data that only loads when you interact with the page, so you basically have to emulate the webpage.

    So my guess is that scraping is used most often when you only need to fetch simple data and you are fine with cleaning it up afterwards: all the text/images on a page, checking if a page has been updated, or just saving the whole page like the Wayback Machine does. A small sketch comparing the two approaches follows below.
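    A rough side-by-side of the two approaches, as a sketch. The API call returns structured JSON with a known shape; the scrape has to parse HTML and hope the markup hasn’t changed. The endpoint and the CSS selector are hypothetical examples, not anything from the post above.

```python
# API vs scraping for the "same" piece of data (illustrative URLs/selectors).
import requests
from bs4 import BeautifulSoup

# API: versioned endpoint, predictable structure, documented fields.
api_response = requests.get("https://example.com/api/v1/users/42")
user = api_response.json()               # e.g. {"id": 42, "name": "..."}

# Scraping: fetch the page and dig the same value out of the markup.
html = requests.get("https://example.com/users/42").text
soup = BeautifulSoup(html, "html.parser")
name_tag = soup.select_one(".profile-name")  # breaks if the class gets renamed
name = name_tag.get_text(strip=True) if name_tag else None
print(user, name)
```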