Streamlined development relies on a consistent, clearly defined process - the Software Development Life Cycle (SDLC). It comprises detailed plans for developing, testing, altering, maintaining and replacing a software system.
Distinct stages - planning, design, building, testing and deployment - help avoid the typical pitfalls of software development projects.
A new system should begin with defining requirements (user stories being a special kind) before implementation proceeds through the stages above, to eliminate redundant rework and after-the-fact fixes.
Several SDLC approaches exist: Waterfall, Agile, Iterative, V-Model, Big Bang and Spiral. Correctly applied, the SDLC gives management a high level of control while developers gain a deep understanding of the product, its benefits and the expected workflow.
The Agile model separates the project into cycles, delivering working software quickly in a succession of releases. Testing each release feeds information back into the next version.
Agility comes from the ability to rearrange requirements after each cycle: you are free to reorder, change, omit or add functionality.
Software development is currently undergoing a significant change in methodology to keep up with the increasing complexity of applications.
C++, the major programming language for embedded development, is gaining more and more high-level features - such as asynchronous functions with futures and promises - that allow concurrency to be implemented at a higher, safer level of abstraction.
CMake is well on its way to becoming the standard meta-build tool. From version 3 onwards it has undergone significant changes, following a modular, target-based design for specifying build instructions.
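A minimal sketch of that target-based style (project and target names are hypothetical): build requirements are attached to targets, and usage requirements propagate automatically to consumers.

```cmake
cmake_minimum_required(VERSION 3.10)
project(sensor_app CXX)

# A library target carrying its own build and usage requirements.
add_library(sensor_lib src/sensor.cpp)
target_include_directories(sensor_lib PUBLIC include)
target_compile_features(sensor_lib PUBLIC cxx_std_17)

# Linking the library propagates its PUBLIC includes and
# compile features to the executable - no global variables needed.
add_executable(sensor_app src/main.cpp)
target_link_libraries(sensor_app PRIVATE sensor_lib)
```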
Git is a state-of-the-art distributed version control system. GitLab adds bug tracking, code review and continuous integration on top of Git.
Jenkins is a convenient continuous integration system, providing a tremendous number of plugins to support building, testing, deploying and various other automation tasks.
Scrum is replacing more traditional development methodologies because of its flexibility. It is based on transparency, inspection and adaptation. Applied correctly, it shortens development and maintenance cycles. However, frequent misunderstandings often impede optimal results: Scrum teams need to be formed with care, a dedicated Scrum Master should accompany them, and in software development special care should be taken to assign roles matching the expertise of the team members.
Software Architecture
Software architecture means the process of creating the high-level design of complex software systems, as opposed to coding a particular piece of software.
It aims at optimizing quality and productivity in software production by taking non-functional requirements and special constraints into account as well.
From IEEE 1471: "Software architecture is the fundamental organization of a system, embodied in its components, their relationships to each other and to the environment, and the principles guiding its design and evolution."
The impact of a good architecture is often underestimated, and architectural methods like Attribute-Driven Design (ADD), the Architecture Tradeoff Analysis Method (ATAM), the Quality Attribute Workshop (QAW) and Decision-Centric Architecture Review (DCAR) are rarely practised.
However, one of the main purposes of a good architecture is to allow for late decisions: the partitioning of the system and the relationships between the resulting components enable easy rearrangement.
So you could say that a good architecture works like a component ecosystem.
High Level Design
A good design not only supports the architecture, it often is the basis for it. First and foremost, a good design reduces complexity and dependencies. An excellent starting point are the S.O.L.I.D. principles popularized by "Uncle Bob", Robert C. Martin.
However, most software combines some hand-crafted code with libraries, so you need to know the programming language - especially its less obvious parts - as well as the concrete library implementation, which can be tricky in some cases.
On the other hand, a good design models a problem from a specific domain, and Domain-Driven Design is a way to create a common understanding between stakeholders and designers by sharing the same vocabulary and expected behaviour.
Test-Driven Development helps integrate testing into development from the beginning by designing tests first, yielding higher test coverage. Following the "red-green-refactor" cycle, you plan a piece of functionality, write a test for it first, then implement the function only until all available tests pass, and refactor afterwards. That is why some people say that TDD is more of a design methodology.
Embedded systems are typically used to operate stand-alone devices, single-board or distributed systems. Special constraints apply, since in many cases they run for long periods with limited resources. They must be extremely reliable and fault tolerant, which implies special care regarding their software.
The systems are usually tailored to specific tasks, using special hardware that needs specific software - like device drivers and protocols.
A couple of de-facto standards have evolved over recent years to ease and standardize development: Embedded Linux, BitBake, Yocto, board support packages and hypervisors, to name a few.
Areas of application are increasing and becoming more and more complex, thus requiring higher-level software such as modern C++ and libraries like the standard library, Boost, HPX and others to manage the complexity.
However, those libraries have been designed to run on a broad spectrum of systems, so special care is needed when using them on embedded systems: you had better know your hardware and what happens under the hood of the software to achieve the best results.
Reliable High Performance
Although C++ is widely recognized as a programming language that gives you fast program execution, this is not quite correct - what C++ gives you is maximum control.
Besides knowledge of performant algorithms and patterns, you need to know your concrete context as well: hardware, libraries and in-depth C++ knowledge are needed to achieve the best results.
With respect to C++, the first step is to know the available features and how they behave in terms of resource consumption.
However, the art of creating reliable and performant systems is to adapt your design to the context: hardware interfaces, the CPU instruction pipeline and caches, available memory, software interfaces to libraries, the concrete implementation of external software, and the usable language features.