How would a formal licensing system work for software engineering? How would they keep up with the rapid evolution in this industry?
I believe in better education in this field, but the standard "engineer" programs from other fields don't translate to software. Having the government codify today's standards would stunt the industry as a whole and kill innovation. Imagine if they had done that in the 90s and said all programming must be waterfall, monolithic, relational DBs, and written in C/Fortran/COBOL.
Maybe I just don't understand how other countries handle it, though. I know my country would absolutely screw it up.
It's not about knowing the current frameworks available, but rather the principle that "if a civil engineer knowingly designs a bridge that fails, there are serious repercussions."
https://www.nspe.org/resources/ethics/code-ethics
As to staying on top of things, every licensed engineer in the US is required by their state's licensing board to have about a week's worth of continuing education every year.
https://njspe.org/2019/08/23/continuing-education-credits-for-professional-engineers-state-by-state/
The thing is that the title of software engineer has been applied to people who lack licensure, which has weakened the title in terms of expected knowledge and professional responsibility.
As to NCEES licensing software engineers - https://www.nspe.org/resources/pe-magazine/may-2018/ncees-ends-software-engineering-pe-exam
They tried doing it for a few years, but few people were interested in taking the test, so it was dropped. I did look into it myself; however, the strong math and physics requirements for the FE exam (the prerequisite for the PE) went beyond what I took in college.
NCEES Director of Exam Services Tim Miller, P.E., says there was a lot of discussion about the exam’s impact, including how many people with software engineering degrees were taking the FE exam. “If they’re not even taking the FE exam, they’re probably not going to take the PE exam,” he says. “In addition, if the boards aren’t regulating the [software engineering profession], it’s tough to get people to take the exam.”
Formal licensing could focus on things that are language-agnostic: how to properly use tests to guard against regressions, and how to handle error states safely.
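As a minimal sketch of what "tests that guard against regressions" means (all names here are hypothetical, not from any real standard): once a bug is fixed, a test pins down the corrected behavior so a future change that reintroduces the bug fails loudly instead of shipping.

```python
# A regression test pins down behavior that once broke, so a future
# refactor that reintroduces the bug fails in CI instead of in the field.

def parse_dose_mg(text: str) -> float:
    """Parse a dose like '2.5 mg'; reject negative or garbage input."""
    value = float(text.strip().removesuffix("mg").strip())
    if value < 0:
        raise ValueError(f"dose must be non-negative, got {value}")
    return value

def test_negative_dose_rejected():
    # Regression guard: a hypothetical earlier version silently
    # accepted '-5 mg', so this case is locked in forever.
    try:
        parse_dose_mg("-5 mg")
    except ValueError:
        pass  # expected: the error state is handled explicitly
    else:
        raise AssertionError("negative dose was accepted")

test_negative_dose_rejected()
print("regression guard holds")
```

The point is not the parsing itself but the discipline: every handled error state gets a test that documents and enforces it.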
How do you design programs for critical systems that CANNOT fail, like pacemakers? How do you guard against crashes? What sort of redundancy do you need in your software?
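One classic answer to the redundancy question is triple modular redundancy (TMR): run three independent units and take the majority, so a single faulty unit cannot silently corrupt the output. A toy sketch (illustrative only, not a real safety-critical implementation):

```python
# Triple modular redundancy sketch: three redundant readings are
# majority-voted; a single faulty unit is outvoted, and if no two
# units agree we fail loudly rather than guess.
from collections import Counter

def vote(readings):
    """Return the majority value among redundant readings.

    Raises RuntimeError if no two readings agree, because in a
    critical system an honest failure beats a silent wrong answer.
    """
    value, count = Counter(readings).most_common(1)[0]
    if count < 2:
        raise RuntimeError(f"no quorum among readings: {readings}")
    return value

print(vote([72, 72, 71]))  # one unit disagrees; the majority wins -> 72
```

Real implementations (e.g., in avionics) add independent hardware, diverse implementations, and watchdog timers, but the design principle is the same.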
How do you best design error messages to tell an operator how to fix the issue? Especially in critical systems like a plane, how do you guard against the operator doing the wrong thing? I'm thinking of the 737 MAX incidents, where the pilots' natural inclination was to grab the yoke and pull up, unknowingly fighting the MCAS automation that was pushing the nose down. My understanding is that the alerts triggered during those crashes were also extremely opaque and added further confusion in a life-and-death situation.
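To make the error-message point concrete, here is a hypothetical illustration (the function, thresholds, and wording are invented for this sketch, not taken from any real avionics system): an actionable alert states what happened, what it means, and what to do next, instead of a bare code.

```python
# Illustration of actionable vs. opaque error messages.
# An opaque alert like "ERR 0x2F" forces the operator to guess under
# stress; an actionable one gives state, consequence, and next action.

def fuel_pressure_alert(psi: float, min_psi: float = 30.0) -> str:
    """Return an operator-facing alert for a (hypothetical) fuel sensor."""
    if psi >= min_psi:
        return "OK"
    return (f"FUEL PRESSURE LOW ({psi:.0f} psi, minimum {min_psi:.0f}). "
            f"Engine may lose power. Switch to backup pump now.")

print(fuel_pressure_alert(12.0))
```

The three-part structure (state, consequence, action) is the design choice being demonstrated; it is language-agnostic and exactly the kind of thing a licensing curriculum could examine.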
When do you have an ethical responsibility not to ship code? Only for physical safety? What about Dark Patterns? How do you recognize them, and do you have an ethical responsibility to refuse to implement them? Should your accreditation as an engineer depend on that refusal, giving you systemic, external support when you do so?
None of that depends on which tech stack you are using. It all comes down to generic logical and ethical reasoning.
Lastly, under certain circumstances, civil engineers can be held personally liable for negligence when their bridge fails and people die. If we are going to call ourselves "engineers", we should bear the same responsibility. Obviously not every software developer needs to meet such high standards, but that's exactly why "software engineer" should mean something.