Hi,

My question certainly stems from the imposter syndrome I'm living right now for no good reason, but when looking to resolve some issues for embedded C problems, I come across a lot of posts from people who have a deep understanding of the language and of how an MCU works at the machine code level.

When I read these posts, I do understand what the author is saying, but it really makes me feel like I should know more about what’s happening under the hood.

So my question is this: how do you rate yourself in your most used language? Do you understand the subtleties and nuances of your language?

I know this doesn’t necessarily make me a bad firmware dev, but damn does it make me feel like one when I read these posts.

I get that this is a subjective question without a single right answer, but I’d be interested in hearing about different experiences in the hope of reducing my imposter syndrome.

Thanks

  • lmaydev@lemmy.world
    link
    fedilink
    arrow-up
    2
    ·
    10 days ago

    I’ve been using C# since .NET 2, which came out around the turn of the century (lol)

    I’d happily call myself an expert. I can do anything I need to and easily dive into the standard library source code or even IL when needed.

    But even then there are topics I could easily learn more about, particularly the very performance-focused struct features and intrinsics.

    I’ve found LLMs to be super useful when you have a very specific question about a feature. I use Bing AI at work, so it sources all its answers and you can dive into the articles for more detail.

    Programming is a never-ending learning journey and you just have to keep going. When you get something you don’t fully understand, do a deep dive; there are always resources for everything.

  • Tyfud@lemmy.world
    link
    fedilink
    English
    arrow-up
    8
    ·
    17 days ago

    I’ve been writing code for 25+ years, and in tech for 27+.

    I’m a novice at all languages still. Even though they tell me I’m a Principal Engineer.

    There’s always some new technique, or a better way to do what I want, that I’m learning every day. It never stops. The expectations for what I consider to be good code just continue to climb every day.

    • Rusty Shackleford@programming.dev
      link
      fedilink
      English
      arrow-up
      1
      ·
      16 days ago

      I try to tell this to all young guns getting in.

      The sheer breadth and depth of theory, practice, and abstraction I would need to absorb before I’d be comfortable calling myself an expert would take a lifetime to learn.

      Hence, it’s, “Stay in the dojo, padawan!”

  • dirtySourdough@lemmy.world
    link
    fedilink
    arrow-up
    4
    ·
    17 days ago

    After 6 years of seriously using Python regularly, I’d probably give myself a 6/10. I feel comfortable with best practices and making informed design decisions. I have no problem using linting and testing tools. And I’ve contributed to large open source projects. I could improve a lot by learning more about the standard library and some core computer science concepts that inform the design of the language. I’m pretty weak in web frameworks too, unfortunately.

    • JoshCodes@programming.dev
      link
      fedilink
      English
      arrow-up
      3
      ·
      17 days ago

      After 3-4 years of using Python I’m bumping you up to a 7 so I can fit in at a 5. Congrats on your upgrade. I’ve never contributed to open source, but I’ve fixed issues in publicly archived tools so that they aren’t buggy for my team. I can see errors and know what likely caused them, and my code literacy is decent. That being said, I think I’m far from advanced.

    • ÞlubbaÐubba@lemm.ee
      link
      fedilink
      arrow-up
      4
      ·
      18 days ago

      This is probably the true highest level of expertise you’ll get out of most professional coders.

      It takes a real monk level of confinement to understanding the language to break out of being proficient in looking shit up and start being proficient in being the person that writes the shit people are looking up.

  • JackbyDev@programming.dev
    link
    fedilink
    English
    arrow-up
    2
    ·
    17 days ago

    Being proficient isn’t about getting something right the first time, it’s about how easily you recognize something as wrong and knowing how to get the knowledge to fix it. Under that definition I rate myself 5/5, if I’m not trying to be humble or worry about tiny details.

  • danhab99@programming.dev
    link
    fedilink
    arrow-up
    2
    ·
    15 days ago

    I have no fear of implementing anything I’m asked to in TypeScript, Go, Rust, Java, C#, F#, or Nix… They’re all the same tool, just kinda different in some places.

    • MajorHavoc@programming.dev
      link
      fedilink
      arrow-up
      2
      ·
      edit-2
      16 days ago

      • After almost 12~15 years of programming in C and C++, I would give myself a solid “still don’t know enough” out of 10.

      That resonates so thoroughly.

      And while it can 100% also be the case in any tool or language, it’s somehow 300% true for C and C++.

  • I should know more about what’s happening under the hood.

    You’ve just identified the most important skill of any software developer, IMO.

    The three most valuable topics I learned in college were OS design basics, assembly language, and algorithms. They’re universal, and once you have a grasp on those, a lot of programming-language specifics become fairly transparent.

    An area where those don’t help is paradigm specifics: there’s theory behind functional programming and OO programming which, if you don’t understand it, won’t prevent you from writing in that language, but will almost certainly result in really bad code. And, depending on your focus, it can be necessary to have domain knowledge: financial, networking, graphics.

    But for what you’re talking about, those three topics cover most of what you need to intuit how languages do what they do - especially C, because it’s only slightly higher level than assembly.

    Assembly informs CPU architecture and operations. If you understand that, you mostly understand how CPUs work, as much as you need to as a programmer.

    OS design informs how various hardware components interact, again, enough to understand what higher level languages are doing.

    Algorithms… well, you can derive algorithms from assembly, but a lot of smart people have already done a ton of work in the field, and it’s silly to try to redo that work. And, unless you’re very special, you probably won’t do as good a job as they’ve done.

    Once you have those, all languages are just syntactic sugar. Sure, the JVM has peculiarities in how its garbage collection works; you tend to learn that sort of stuff from experience. But a hash table is a hash table in any language, and they all have to deal with the same fundamental issues of hash tables: hashing, conflict resolution, and space allocation. There are no shortcuts.
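    Those three fundamentals can be made concrete with a toy sketch in C. This is purely illustrative (a fixed-size table, a simple multiplicative hash, chaining for conflicts); all names are invented, and a real implementation would resize and free memory:

```c
#include <stdlib.h>
#include <string.h>

#define NBUCKETS 16 /* fixed space allocation: 16 buckets */

typedef struct entry {
    char key[32];
    int value;
    struct entry *next; /* chaining resolves hash conflicts */
} entry;

typedef struct {
    entry *buckets[NBUCKETS];
} table;

/* Simple multiplicative hash: a loop of multiplies and adds over the bytes. */
static unsigned long hash(const char *s) {
    unsigned long h = 0;
    while (*s)
        h = h * 31 + (unsigned char)*s++;
    return h;
}

static void put(table *t, const char *key, int value) {
    unsigned long b = hash(key) % NBUCKETS;
    for (entry *e = t->buckets[b]; e; e = e->next)
        if (strcmp(e->key, key) == 0) { e->value = value; return; }
    entry *e = calloc(1, sizeof *e);
    strncpy(e->key, key, sizeof e->key - 1);
    e->value = value;
    e->next = t->buckets[b]; /* push onto the bucket's chain */
    t->buckets[b] = e;
}

static int get(const table *t, const char *key, int *out) {
    unsigned long b = hash(key) % NBUCKETS;
    for (entry *e = t->buckets[b]; e; e = e->next)
        if (strcmp(e->key, key) == 0) { *out = e->value; return 1; }
    return 0; /* not found */
}
```

    However it is dressed up by a given language's standard library, this is the shape underneath: hash, pick a bucket, walk the conflicts.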

    • Croquette@sh.itjust.worksOP
      link
      fedilink
      arrow-up
      0
      ·
      18 days ago

      Thanks for the input; it will help me think about how to get the skills I need.

      I’d say I am decent with FreeRTOS which is pretty much just a scheduler with a few bells and whistles.

      I haven’t used assembly in a long while, so I know where to look to understand all the instructions, but I can’t tell right off the bat what a chunk of assembly code does.

      Algorithms, I am terrible at these because I rarely use them. I haven’t worked on a big enough project where an algorithm is needed. I tend to work with finite state machines, which are close to algorithms, but not quite. And a big part of my job is interfacing peripheral chips for others to use.

      • Thanks for the input

        You’re welcome!

        I haven’t used assembly in a long while, so I know where to look to understand all the instructions, but I can’t tell right off the bat what a chunk of assembly code does.

        Oh, me neither. And that’s not what I think is necessary; what’s important is that you can generally imagine the sorts of operations which are going on under the hood for any given line of code. That there’s no magic “generate a hash for a string” CPU operation, and that, ultimately, something is going to be iterating over a series of memory locations and performing several math operations on each to produce a numeric output. I think this awareness is enormously valuable in developers, and helps them think about the code they’re writing in a certain way, and usually in a way that improves their code.
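        To put that “no magic CPU operation” point in code: a typical string hash (djb2 here, one well-known example) is nothing but a loop that reads bytes and does a multiply and an add per byte. Every line of high-level code bottoms out in loads, arithmetic, and branches like these:

```c
/* djb2: for each byte, h = h*33 + byte. No special CPU support needed --
 * this compiles down to loads, multiplies (or shifts and adds), and adds. */
unsigned long djb2(const char *s) {
    unsigned long h = 5381;
    for (; *s; s++)
        h = h * 33 + (unsigned char)*s;
    return h;
}
```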

        Algorithms, I am terrible at these because I rarely use them.

        You use them all the time! Anything longer than a single operation is an algorithm.

        Nobody is going to ask you to write a search function; however, being aware of Big-O notation, and being able to reason about time and space complexity, is important. On the backend, it’s critical. It’s important if you’re a front-end developer too - I blame the whole NodeJS library fiasco on not enough awareness of dependency complexity by a majority of JS developers.
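        That complexity reasoning is easiest to see with the classic pair: a scan is O(n), while a sorted array admits an O(log n) binary search. A small sketch (illustrative, not a library API):

```c
#include <stddef.h>

/* O(n): may touch every element in the worst case. */
int linear_search(const int *a, size_t n, int key) {
    for (size_t i = 0; i < n; i++)
        if (a[i] == key) return (int)i;
    return -1;
}

/* O(log n): halve the sorted range on each step. */
int binary_search(const int *a, size_t n, int key) {
    size_t lo = 0, hi = n;
    while (lo < hi) {
        size_t mid = lo + (hi - lo) / 2; /* avoids (lo+hi) overflow */
        if (a[mid] == key) return (int)mid;
        if (a[mid] < key)  lo = mid + 1;
        else               hi = mid;
    }
    return -1;
}
```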

      • I tend to work with finite state machines, which are close to algorithms, but not quite.

      I’d absolutely call FSM work “algorithms”, and it sounds as if the projects you’re working on are where these fundamentals are most important. Interfaces between hardware components? It’s the most fraught topic in CIS! So. Many. Pitfalls. Shit, you probably have to worry about clock speeds and communication skew; there’s absolutely a huge corpus of material about algorithms for handling the stuff you’re working with, like vector clocks. That’s a fabulous, interesting field. It’s also super tedious, and requires a huge attention to detail which I lack, so in a way I envy you, but am also glad I’m not you.
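      For what it’s worth, the usual embedded-C shape of such an FSM is a pure transition function over (state, event) pairs - which is exactly a textbook algorithm. A hedged sketch, with states and events invented for illustration:

```c
typedef enum { IDLE, RECEIVING, DONE, ERROR } state_t;
typedef enum { EV_START, EV_BYTE, EV_STOP, EV_FAULT } event_t;

/* Pure transition function: the next state depends only on the current
 * (state, event) pair, which is what makes FSMs easy to test and reason
 * about in isolation from the hardware that generates the events. */
state_t step(state_t s, event_t ev) {
    switch (s) {
    case IDLE:
        return ev == EV_START ? RECEIVING : IDLE;
    case RECEIVING:
        if (ev == EV_BYTE) return RECEIVING;
        if (ev == EV_STOP) return DONE;
        return ERROR;       /* unexpected event: fault out */
    case DONE:
        return ev == EV_START ? RECEIVING : DONE;
    default:
        return ERROR;       /* latch in the error state */
    }
}
```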

      • College.

        I’m one of those folks who believes not everyone needs a degree, and we need to do more to normalize and encourage people who have no interest in STEM fields to go to trade schools. However, I do firmly believe computer programming is a STEM field and is best served by getting a degree.

        There are certainly computer programming savants, but most people are not, and the next best thing is a good, solid higher education.

    • MajorHavoc@programming.dev
      link
      fedilink
      arrow-up
      3
      ·
      16 days ago

      But not good enough to get a job as a programmer.

      This is as weird a time for getting hired as a programmer as we have ever had. Hang in there. Once we let AI deployment pipelines start causing production outages and shareholder bankruptcies, we will start falling over ourselves to hire human programmers again.

      • 🇰 🌀 🇱 🇦 🇳 🇦 🇰 ℹ️@yiffit.net
        link
        fedilink
        English
        arrow-up
        2
        ·
        edit-2
        16 days ago

        I mean that it’s quite a leap going from making, like, a text-based adventure in C++ or BASIC, and changing/adding lines of code to someone else’s thing to make mods, to doing actual, professional-level programming of systems I have never even fucked with for fun. Like, I can’t make the screen display an image. I don’t know how to do any sort of networking, at least from a programming standpoint (hardware and shit, no problem; I was Cisco and A+ certified at one point).

        I guess if all they need me to do is make what is essentially a database or calculator, I could do that. 🤷🏻‍♂️

        • MajorHavoc@programming.dev
          link
          fedilink
          arrow-up
          2
          ·
          edit-2
          15 days ago

          That’s the beauty of programming (and lots of skills, really) - once we master the basics, all we tend to notice is what we haven’t learned yet.

          It’s hard on our confidence, but there’s also a perverse beauty to it.

          It is a big leap, but it’s the kind of leap that gets easier when you’re doing the job, with training, for dozens of hours per week.

          And it’s also a very small leap compared to the average computer user who doesn’t know why smoke shouldn’t come out of the computer case during normal operation.

          One of the cool things AI will do is once again lower the barrier to entry for full-time programmers.

          We’re on our way to finding out just how terrible AI is as a pilot, but it makes a damn fine co-pilot much of the time. And it’ll be key in welcoming in and enabling our next batch of brilliant full time programmers.

      • emil_98@lemmy.world
        link
        fedilink
        arrow-up
        2
        ·
        15 days ago

        Feels good to hear this. I’m also struggling to enter the industry, and it’s nice to read something hopeful for a change.

  • solrize@lemmy.world
    link
    fedilink
    arrow-up
    1
    ·
    edit-2
    18 days ago

    In C in particular, you have to be very cognizant of the tricky ways the language can screw you with UB. You might want to try some verification tools like Frama-C, use UB sanitizers, enable all the compiler warnings and traps that you can, etc. Other than that, I think using too many obscure features of a language is an antipattern. Just stick with the idioms that you see in other code. Take reviewer comments on board, and write lots of code so you come to feel fluent.
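    As one concrete instance of the kind of UB meant here: signed integer overflow. Plain `a + b` on `int` is undefined when it overflows, so the guard has to be written in terms the standard does define, by testing the bounds before adding. A sketch (the unguarded version is exactly what `-fsanitize=undefined`, plus `-Wall -Wextra`, will flag at runtime):

```c
#include <limits.h>
#include <stdbool.h>

/* Overflow-safe signed addition: check the bounds *before* adding,
 * because the overflowing add itself is undefined behavior in C. */
bool checked_add(int a, int b, int *out) {
    if ((b > 0 && a > INT_MAX - b) ||
        (b < 0 && a < INT_MIN - b))
        return false;   /* would overflow: refuse */
    *out = a + b;
    return true;
}
```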

    Added: the MISRA C guidelines for embedded C tell you to stay with a relatively safe subset of the language. They are mostly wise, so you might want to use them.

    Added: is your issue with C or with machine code? If you’re programming small MCUs, then yes, you should develop some familiarity with machine code and hardware level programming. That may also help you get more comfortable with C.

    • Croquette@sh.itjust.worksOP
      link
      fedilink
      arrow-up
      0
      ·
      18 days ago

      My issue is with the imposter syndrome, I’d say.

      I don’t have asm at my fingertips, because today’s MCUs are pretty full of features that make it unnecessary most of the time. But if I need to whip up something in asm for whatever reason, I know the basics and how to search the documentation for help.

      I try to follow the MISRA C guidelines because they’re pretty easy to follow and they give you tools to reduce mistakes.

      I have enough experience to avoid many common pitfalls such as overflows, but for whatever reason, it always feels like I don’t know enough when I come across a tutorial or a post with a deep dive into a specific part of an embedded project or of the C language.

      When I read these tutorials/posts, I understand what is being done, but I could not come to these conclusions myself, if that makes sense.

      • solrize@lemmy.world
        link
        fedilink
        arrow-up
        1
        ·
        edit-2
        18 days ago

        What are you working on and what kind of organization? Are you working with someone more senior? You could ask him or her for an assessment of where you should work on strengthening up.

        You are in the right mindset if you are worried. Many C programmers greatly overestimate their ability to write bug-free or even valid (UB-free) code.

        The AVR MCUs are pretty simple compared with 32 bit MCUs, so are good for asm coding.

        Otherwise it’s a matter of coding til it’s reflexive.

        Philip Koopman has written a book on MCU programming that sounds good. I haven’t seen it yet but someday. You might look for it: https://betterembsw.blogspot.com/2021/02/better-embedded-system-software-e-book.html?m=1

        John Regehr’s blog is also good.

        • Croquette@sh.itjust.worksOP
          link
          fedilink
          arrow-up
          1
          ·
          15 days ago

          Thanks for your input.

          I think I would like to follow all these people and their work on C, and their in-depth knowledge. But free time is sparse, and I don’t have the mental energy when I do have some time.

          As for my work, I work in a startup where I am the only one doing what I do. However, I have a lot of leeway in how I code, so I try to stay reasonably well read on best practices. So I can’t really turn to a senior dev, but I can self-teach.

          I think I have coded enough that a lot of what I do is reflex, and I can often approximate a first solution, but I have doubts all the time about how I implement new features. That makes me a slower coder, and I really struggle to do fast prototyping.

          I am aware enough of what I do well and what I struggle with, so there’s that.

          • solrize@lemmy.world
            link
            fedilink
            arrow-up
            2
            ·
            edit-2
            15 days ago

            Fair enough. If your product isn’t safety or security critical then it’s mostly a matter of getting it working and passing reasonable testing. If it’s critical you might look for outside help or review, and maybe revisit the decision to use C.

            The book “Analysable Real-Time Systems: Programmed in Ada” was recommended to me and looks good. I have a copy that has been on my reading pile for ages. I was just thinking about it recently. It could be a source of wisdom about embedded dev in general, plus Ada generally fosters a more serious approach than C does, so it could be worth a look. I also plan to get Koopman’s book that I mentioned earlier.

  • Tolookah@discuss.tchncs.de
    link
    fedilink
    arrow-up
    1
    ·
    edit-2
    18 days ago

    Better than many, mediocre.

    With my coworkers, I’ve got a strange ability to pick up any language that tastes like C and get stuff done. I’m sure I’ve confused our C# guys when I make a change to their code and ask for a code review, because I’ll chase down quality-of-life improvements for myself. (Generally, I will make the change and ask if it has any unintended side effects, because on an MCU I know what all my side effects are; in a multi-threaded application, not at all.)

    Edit: coming from a firmware view, I’ve made enough mistakes to realize when order of operations will stab me, and when a branch is bad because that pipeline hit will hurt. And I still get & vs && wrong more often than I would like to admit.
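    That & vs && slip is worth spelling out: `&` is a bitwise AND of the operand bits, `&&` is a logical truth test (with short-circuiting), and they disagree as soon as two nonzero values share no bits. A sketch with made-up flag values:

```c
#include <stdbool.h>

/* Two status flags, both nonzero ("true" logically), no bits in common. */
enum { FLAG_READY = 0x2, FLAG_ARMED = 0x1 };

/* Logical AND: both operands are nonzero, so this is true. */
bool both_set_logical(int ready, int armed) {
    return ready && armed;
}

/* Bitwise AND: 0b10 & 0b01 == 0, so this is false -- the classic slip
 * when a logical condition was intended. */
bool both_set_bitwise(int ready, int armed) {
    return (ready & armed) != 0;
}
```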

  • nik9000@programming.dev
    link
    fedilink
    arrow-up
    1
    ·
    18 days ago

    I’ve learned a lot by breaking things, by making mistakes, and by watching other people make mistakes. I’ve written some blog posts that make me look real smart.

    But mostly I just bang code together until it works, and run tests and perf stuff until it looks good. Then, when I have the time, I write it up and check back on what was really happening.

    But I still mostly learn by suffering.

    • Croquette@sh.itjust.worksOP
      link
      fedilink
      arrow-up
      1
      ·
      18 days ago

      I really like brain twisters. It can get frustrating at times, but it’s the most fun out of the profession to me.

  • souperk@reddthat.com
    link
    fedilink
    arrow-up
    1
    ·
    18 days ago

    I would give myself a solid 4.2/5 on python.

    • I have in-depth knowledge of more than a few popular libraries, including flask, django, marshmallow, typer, sqlalchemy, pandas, numpy, and many more.
    • I have authored a few libraries.
    • I have been keeping up with PEPs, and sometimes offered my feedback.
    • I have knowledge of the internals of development tooling, including mypy, pylint, black, and a pycharm plugin I have created.

    I wouldn’t give myself a 5/5, since I would consider that an unattainable level of expertise, with maybe a few exceptions around the globe. IMO the fun part of being really good at something is that you understand how much there still is to learn ❤️

  • NigelFrobisher@aussie.zone
    link
    fedilink
    arrow-up
    1
    ·
    18 days ago

    The more I learn about my language, the less I think it matters. Maybe in embedded C you can’t just leave everything to the compiler, though.