Software Analysis Concepts and Principles
The overall role of software in a large system is identified during system engineering. However, it is necessary to take a harder look at software’s role to understand the specific requirements that must be achieved to build high-quality software. That is the job of software requirements analysis. To perform the job properly, you should follow a set of underlying concepts and principles.
1. Requirements Analysis
Requirements analysis is a software engineering task that bridges the gap between system-level requirements engineering and software design. Requirements engineering activities result in the specification of software’s operational characteristics, indicate software’s interface with other system elements, and establish constraints that software must meet. Requirements analysis allows the software engineer to build models of the data, functional, and behavioral domains that will be treated by software.
Software requirements analysis may be divided into five areas of effort:
(1) problem recognition,
(2) evaluation and synthesis,
(3) modeling,
(4) specification, and
(5) review.
The analyst studies the System Specification and the Software Project Plan. It is important to understand software in a system context and to review the software scope that was used to generate planning estimates. Problem evaluation and solution synthesis is the next major area of effort for analysis. The analyst must define all externally observable data objects, evaluate the flow and content of information, define and elaborate all software functions, understand software behavior in the context of events that affect the system, establish system interface characteristics, and uncover additional design constraints.
Throughout evaluation and solution synthesis, the analyst’s primary focus is on “what” not “how”. What data does the system produce and consume, what functions must the system perform, what behavior does the system exhibit, what interfaces are defined and what constraints apply?
2. Requirements Elicitation for Software
Before requirements can be analyzed, modeled, or specified, they must be gathered through an elicitation process.
2.1 Initiating the Process
The first meeting between a software engineer and the customer can be likened to the awkwardness of a first date between two adolescents. Communication must be initiated by asking context-free questions: a set of questions that will lead to a basic understanding of the problem, the people who want a solution, the nature of the solution that is desired, and the effectiveness of the first encounter itself.
ü Who is behind the request for this work?
ü Who will use the solution?
ü What will be the economic benefit of a successful solution?
ü Is there another source for the solution that you need?
The next set of questions enables the analyst to gain a better understanding of the problem and the customer to voice his or her perceptions about a solution:
ü How would you characterize “good” output that would be generated by a successful solution?
ü What problem(s) will this solution address?
ü Can you show me the environment in which the solution will be used?
ü Will special performance issues or constraints affect the way the solution is approached?
2.2 Facilitated Application Specification Techniques
Customers and software engineers often have an unconscious “us and them” mind-set. With these problems in mind, a number of independent investigators have developed a team-oriented approach to requirements gathering that is applied during the early stages of analysis and specification, called facilitated application specification techniques (FAST). Basic guidelines for this technique are:
ü A meeting is conducted at a neutral site and attended by both software engineers and customers.
ü Rules for preparation and participation are established.
ü An agenda is suggested that is formal enough to cover all important points but informal enough to encourage the free flow of ideas.
ü A “facilitator” controls the meeting.
ü A “definition mechanism” (e.g., worksheets, flip charts, or wall stickers) is used.
ü The goal is to identify the problem, propose elements of the solution, negotiate different approaches, and specify a preliminary set of solution requirements in an atmosphere that is conducive to the accomplishment of the goal.
Initial meetings between the developer and customer occur, and basic questions and answers help to establish the scope of the problem and the overall perception of a solution. The product request is distributed to all attendees before the meeting date. The FAST team is composed of representatives from marketing, software and hardware engineering, and manufacturing. As the FAST meeting begins, the first topic of discussion is the need and justification for the new product; everyone should agree that the product is justified. Once agreement has been established, each participant presents his or her list for discussion.
After individual lists are presented in one topic area, a combined list is created by the group. The combined list eliminates redundant entries and adds any new ideas that come up during the discussion, but does not delete anything. Entries may be shortened, lengthened, or reworded to properly reflect the product or system to be developed. The objective is to develop a consensus list in each topic area. Each sub-team presents its mini-specs to all FAST attendees for discussion. After the mini-specs are completed, each FAST attendee makes a list of validation criteria for the product or system and presents his or her list to the team.
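The list-combining step described above can be sketched in a few lines of code. This is an illustrative sketch only; the function name and the sample entries are invented, and a real FAST session would of course merge lists by discussion, not by script:

```python
# Hypothetical sketch: merging individual FAST lists into a combined list.
# Redundant entries are eliminated (case-insensitively), the order of first
# mention is preserved, and nothing is deleted outright.

def combine_lists(individual_lists):
    """Merge per-participant lists into one combined list without duplicates."""
    combined = []
    seen = set()
    for entries in individual_lists:
        for entry in entries:
            key = entry.strip().lower()
            if key not in seen:          # eliminate redundant entries
                seen.add(key)
                combined.append(entry.strip())
    return combined

marketing = ["barcode reader", "touch display"]
engineering = ["Touch display", "diagnostic port"]
print(combine_lists([marketing, engineering]))
# ['barcode reader', 'touch display', 'diagnostic port']
```

Note that the merge only removes literal duplicates; rewording near-duplicates into a single consensus entry is exactly the part that needs the facilitated discussion.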
2.3 Quality Function Deployment
Quality function deployment (QFD) is a quality management technique that translates the needs of the customer into technical requirements for software. QFD identifies three types of requirements:
Normal requirements. The objectives and goals that are stated for a product or system during meetings with the customer. If these requirements are present, the customer is satisfied.
Expected requirements. These requirements are implicit to the product or system and may be so fundamental that the customer does not explicitly state them. Their absence will be a cause for significant dissatisfaction.
Exciting requirements. These features go beyond the customer’s expectations and prove to be very satisfying when present.
Functional deployment is used to determine the value of each function that is required for the system. Information deployment identifies both the data objects and events that the system must consume and produce. These are tied to the functions. Finally, task deployment examines the behavior of the system or product within the context of its environment. Value analysis is conducted to determine the relative priority of requirements determined during each of the three deployments.
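The value analysis mentioned above can be illustrated as a simple ranking of requirements by customer-importance score. The requirement names, the three-way classification, and the numeric weights below are invented for illustration; QFD itself uses customer interviews and deployment matrices, not a script:

```python
# Hypothetical sketch of QFD value analysis: each requirement gathered during
# function, information, and task deployment carries a customer-importance
# score, and requirements are ranked by that score.

requirements = [
    ("print receipt",         "normal",   4),
    ("respond in < 1 second",  "expected", 5),
    ("voice-guided setup",     "exciting", 2),
]

def prioritize(reqs):
    """Return requirements ordered from highest to lowest customer value."""
    return sorted(reqs, key=lambda r: r[2], reverse=True)

for name, kind, value in prioritize(requirements):
    print(f"{value}  {kind:9s}{name}")
```

The point of the sketch is only that the three deployments feed one prioritized list; expected requirements often score highest precisely because their absence causes significant dissatisfaction.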
As requirements are gathered during informal meetings, a software engineer can create a set of scenarios, called use-cases, that identify a thread of usage for the system to be constructed. To create a use-case, the analyst must first identify the different roles that people (or devices) play as the system operates. Defined somewhat more formally, an actor is anything that communicates with the system or product and that is external to the system itself.
It is important to note that an actor and a user are not the same thing. An actor represents a class of external entities that plays just one role. Once actors have been identified, use-cases can be developed. A use-case describes the manner in which an actor interacts with the system. The use-case should answer the following questions:
ü What main tasks or functions are performed by an actor?
ü What system information will the actor acquire, produce, or change?
ü Will the actor have to inform the system about changes in the external environment?
ü What information does the actor desire from the system?
ü Does the actor wish to be informed about unexpected changes?
In general, a use-case is simply a written narrative that describes the role of an actor as interaction with the system occurs.
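One way to see how the questions above shape a use-case is to record one as a structured object whose fields mirror those questions. The actor, tasks, and narrative below are invented for illustration; in practice the narrative text itself is the use-case:

```python
# Hypothetical sketch: a use-case captured as a record whose fields answer
# the questions listed above. All names and values here are invented.

from dataclasses import dataclass

@dataclass
class UseCase:
    actor: str                       # who interacts with the system
    main_tasks: list                 # what tasks the actor performs
    information_used: list           # information acquired, produced, or changed
    desired_information: list        # what the actor wants back from the system
    notifies_system_of_changes: bool = False
    wants_unexpected_change_alerts: bool = False
    narrative: str = ""              # the written description of the interaction

uc = UseCase(
    actor="homeowner",
    main_tasks=["arm the security system"],
    information_used=["password", "sensor status"],
    desired_information=["confirmation that the system is armed"],
    wants_unexpected_change_alerts=True,
    narrative="The homeowner enters a password; the system arms all "
              "sensors and confirms that monitoring is active.",
)
print(uc.actor, "-", uc.main_tasks[0])
```

Keeping the fields aligned with the question list is a cheap completeness check: an empty field signals a question the analyst has not yet asked.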
3. Analysis Principles
Over the past two decades, a large number of analysis modeling methods have been developed. Investigators have identified analysis problems and their causes and have developed a variety of notations and corresponding sets of heuristics to overcome them. Each analysis method has a unique point of view.
ü The information domain of a problem must be represented and understood.
ü The functions that the software is to perform must be defined.
ü The behavior of the software must be represented.
ü The models that depict information, function, and behavior must be partitioned in a manner that uncovers detail in a layered fashion.
ü The analysis process should move from essential information toward implementation detail.
In addition to these operational analysis principles, a number of guiding principles for requirements engineering can be suggested:
ü Understand the problem before you begin to create the analysis model.
ü Develop prototypes that enable a user to understand how human/machine interaction will occur.
ü Record the origin of and the reason for every requirement.
ü Use multiple views of requirements.
ü Rank requirements.
ü Work to eliminate ambiguity.
Architectural design represents the structure of data and program components that are required to build a computer-based system. It considers the architectural style that the system will take, the structure and properties of the components that constitute the system, and the interrelationships that occur among all architectural components of a system. Architectural design begins with data design and then proceeds to the derivation of one or more representations of the architectural structure of the system. Alternative architectural styles or patterns are analyzed to derive the structure that is best suited to customer requirements and quality attributes. Once an alternative has been selected, the architecture is elaborated using an architectural design method.
Definition: Software architecture provides a holistic view of the system to be built. It depicts the structure and organization of software components, their properties, and the connections between them. Software components include program modules and the various data representations that are manipulated by the program. Therefore, data design is an integral part of the derivation of the software architecture. Architecture highlights early design decisions and provides a mechanism for considering the benefits of alternative system structures.
1. Software Architecture
1.1 What is architecture?
The software architecture of a program or computing system is the structure or structures of the system, which comprise software components, the externally visible properties of those components, and the relationships among them. The architecture is not the operational software. Rather, it is a representation that enables a software engineer to
(1) analyze the effectiveness of the design in meeting its stated requirements,
(2) consider architectural alternatives at a stage when making design changes is still relatively easy, and
(3) reduce the risks associated with the construction of the software.
The definition emphasizes the role of “software components” in any architectural representation. In the context of architectural design, a software component can be something as simple as a program module, but it can also be extended to include databases and “middleware” that enable the configuration of a network of clients and servers. The properties of components are those characteristics that are necessary to an understanding of how the components interact with other components. At the architectural level, internal properties can be as simple as a procedure call from one module to another or as complex as a database access protocol.
1.2. Why Is Architecture Important?
ü Representations of software architecture are an enabler for communication between all parties interested in the development of a computer-based system.
ü The architecture highlights early design decisions that will have a profound impact on all software engineering work that follows and, as important, on the ultimate success of the system as an operational entity.
ü Architecture “constitutes a relatively small, intellectually graspable model of how the system is structured and how its components work together.”
The architectural design model and the architectural patterns contained within it are transferable. That is, architectural styles and patterns can be applied to the design of other systems and represent a set of abstractions that enable software engineers to describe architecture in predictable ways.
2. Data Design
Data design creates a model of data and/or information that is represented at a high level of abstraction. This data model is then refined into progressively more implementation-specific representations that can be processed by the computer-based system. The structure of data has always been an important part of software design. The design of data structures and the associated algorithms required to manipulate them is essential to the creation of high-quality applications.
2.1 Data Modeling, Data Structures, Databases, and the Data Warehouse
The data objects defined during software requirements analysis are modeled using entity/relationship diagrams and the data dictionary. The data design activity translates these elements of the requirements model into data structures at the software component level and, when necessary, a database architecture at the application level.
Data mining, also called knowledge discovery in databases (KDD), is a technique that navigates through existing databases in an attempt to extract appropriate business-level information.
A data warehouse is a separate data environment that is not directly integrated with day-to-day applications but encompasses all data used by a business. In a sense, it is a large, independent database that encompasses some, but not all, of the data that are stored in databases that serve the set of applications required by a business.
2.2 Data Design at the Component Level
Data design at the component level focuses on the representation of data structures that are directly accessed by one or more software components. The design of data begins during the creation of the analysis model. Recalling that requirements analysis and design often overlap, we consider the following set of principles for data specification:
ü The systematic analysis principles applied to function and behavior should also be applied to data.
ü All data structures and the operations to be performed on each should be identified.
ü A data dictionary should be established and used to define both data and program design.
ü Low-level data design decisions should be deferred until late in the design process.
ü The representation of data structure should be known only to those modules that must make direct use of the data contained within the structure.
ü A library of useful data structures and the operations that may be applied to them should be developed.
ü Software design and programming languages should support the specification and realization of abstract data types.
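The last three principles can be illustrated with a single small example: a stack as an abstract data type whose representation is hidden behind a fixed set of operations. This is a minimal sketch, not a prescribed design; any class-based language would do equally well:

```python
# Illustrative sketch: an abstract data type whose internal representation
# is known only to the module itself, exposed through a small operation set.

class Stack:
    """A stack ADT: clients use push/pop/peek and never touch the list inside."""

    def __init__(self):
        self._items = []             # representation detail, kept private

    def push(self, item):
        self._items.append(item)

    def pop(self):
        if not self._items:
            raise IndexError("pop from empty stack")
        return self._items.pop()

    def peek(self):
        return self._items[-1]

    def __len__(self):
        return len(self._items)

s = Stack()
s.push(1)
s.push(2)
print(s.pop(), len(s))   # 2 1
```

Because callers see only push, pop, peek, and len, the underlying list could later be replaced (say, by a linked structure) without touching any client code, which is precisely what deferring low-level data decisions buys you.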
3. Architectural Styles
The software that is built for computer-based systems exhibits one of many architectural styles. Each style describes a system category that encompasses
(1) a set of components that perform a function required by a system;
(2) a set of connections that enable “communication, coordination and cooperation” among components;
(3) constraints that define how components can be integrated to form the system; and
(4) semantic models that enable a designer to understand the overall properties of a system by analyzing the known properties of its constituent parts.
3.1 A Brief Taxonomy of Styles and Patterns
Data-centered architectures: A data store resides at the center of this architecture and is accessed frequently by other components that update, add, delete, or otherwise modify data within the store. Data-centered architectures promote integrability; that is, existing components can be changed and new client components can be added to the architecture without concern about other clients.
Data-flow architectures: This architecture is applied when input data are to be transformed through a series of computational or manipulative components into output data. A pipe and filter pattern has a set of components, called filters, connected by pipes that transmit data from one component to the next. Each filter works independently of those components upstream and downstream, is designed to expect data input of a certain form, and produces data output of a specified form.
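A pipe-and-filter structure is easy to sketch with generators, where each generator is an independent filter and function composition plays the role of the pipes. The specific filters (parse, keep_positive, fmt) are invented for illustration:

```python
# A minimal pipe-and-filter sketch: each filter is an independent generator
# that consumes an upstream stream and produces a downstream stream.

def parse(lines):
    for line in lines:               # filter 1: raw text -> internal integers
        yield int(line.strip())

def keep_positive(numbers):
    for n in numbers:                # filter 2: drop values outside the spec
        if n > 0:
            yield n

def fmt(numbers):
    for n in numbers:                # filter 3: internal form -> output form
        yield f"value={n}"

raw = ["3", "-1", "7"]
pipeline = fmt(keep_positive(parse(raw)))   # the pipes connect the filters
print(list(pipeline))
# ['value=3', 'value=7']
```

Each filter knows only the form of its input and output, not its neighbors, so filters can be reordered, replaced, or reused exactly as the style promises.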
Call and return architectures: This architectural style enables a software designer to achieve a program structure that is relatively easy to modify and scale. Two substyles exist within this category: main program/subprogram architectures and remote procedure call architectures.
Object-oriented architectures: The components of a system encapsulate data and the operations that must be applied to manipulate the data. Communication and coordination between components is accomplished via message passing.
Layered architectures: A number of different layers are defined, each accomplishing operations that progressively become closer to the machine instruction set.
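The layered style can be sketched as a stack of functions in which each layer calls only the layer directly beneath it. The layer names and the simulated disk below are invented; real layers would wrap an operating system or hardware interface:

```python
# Hypothetical sketch of a layered style: each layer only calls the layer
# beneath it, and operations get progressively closer to the machine.

def storage_write(path, data):       # innermost layer: raw I/O (simulated)
    fake_disk[path] = data

def record_save(name, record):       # utility layer: encodes records
    storage_write(f"/records/{name}", repr(record))

def register_user(username):         # outer layer: user-level operation
    record_save(username, {"user": username, "active": True})

fake_disk = {}                       # stands in for a device at the bottom layer
register_user("ada")
print(fake_disk)
```

The discipline that outer layers never reach past the layer below is what lets an inner layer (here, the fake disk) be swapped for a real one without disturbing the application code above it.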
These architectural styles are only a small subset of those available to the software designer. Once requirements engineering uncovers the characteristics and constraints of the system to be built, the architectural pattern or combination of patterns that best fits those characteristics and constraints can be chosen.
4. Analyzing Alternative Architectural Design
4.1 An Architecture Trade-off Analysis Method
An architecture trade-off analysis method (ATAM) establishes an iterative evaluation process for software architectures. The design analysis activities that follow are performed iteratively:
ü Collect scenarios. A set of use-cases is developed to represent the system from the user’s point of view.
ü Elicit requirements, constraints, and environment description. This information is required as part of requirements engineering and is used to be certain that all customer, user, and stakeholder concerns have been addressed.
ü Describe the architectural styles/patterns that have been chosen to address the scenarios and requirements.
o Module view for analysis of work assignments to components and the degree to which information hiding has been achieved.
o Process view for analysis of system performance.
o Data flow view for analysis of the degree to which the architecture meets functional requirements.
ü Evaluate quality attributes by considering each attribute in isolation.
ü Identify the sensitivity of quality attributes to various architectural attributes for a specific architectural style.
ü Critique candidate architectures using the sensitivity analysis.
4.2 Quantitative Guidance for Architectural Design
Once an architecture has been derived, it is elaborated and then analyzed against quality criteria.
One of the many problems faced by software engineers during the design process is a general lack of quantitative methods for assessing the quality of proposed designs. Work in the area of quantitative analysis of architectural design is still in its formative stages. Asada and his colleagues suggest a number of pseudo-quantitative techniques that can be used to complement the ATAM approach as a method for the analysis of architectural design quality.
Asada proposes a number of simple models that assist a designer in determining the degree to which a particular architecture meets predefined “goodness” criteria.
The first model, called spectrum analysis, assesses an architectural design on a “goodness” spectrum from the best to the worst possible designs. Once the software architecture has been proposed, it is assessed by assigning a “score” to each of its design dimensions.
Design selection analysis is another model that requires a set of design dimensions to be defined. The proposed architecture is then assessed to determine the number of design dimensions that it achieves when compared to an ideal system.
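The scoring ideas behind spectrum analysis and design selection analysis can be sketched numerically. The design dimensions, scores, and the threshold below are entirely invented; the published techniques define their own scoring schemes:

```python
# Illustrative sketch: scoring an architecture on invented design dimensions
# and comparing it against an ideal system.

dimensions = ["modularity", "portability", "performance", "security"]
ideal      = {d: 10 for d in dimensions}           # best possible design
candidate  = {"modularity": 8, "portability": 9,
              "performance": 5, "security": 7}

def spectrum_score(arch):
    """Spectrum analysis: overall 'goodness' as the sum of dimension scores."""
    return sum(arch[d] for d in dimensions)

def dimensions_achieved(arch, threshold=7):
    """Design selection analysis: dimensions on which the architecture
    comes close enough to the ideal (threshold is an assumption)."""
    return [d for d in dimensions if arch[d] >= threshold]

print(spectrum_score(candidate), "of", spectrum_score(ideal))
print(dimensions_achieved(candidate))
```

The second function hints at contribution analysis as well: the dimensions that fall below the threshold (here, performance) identify where one set of design choices loses points against another.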
Contribution analysis “identifies the reasons that one set of design choices gets a lower score than another.” Recalling our discussion of quality function deployment, value analysis is conducted to determine the relative priority of requirements determined during function deployment.
5. Mapping Requirements into a Software Architecture
Structured design is often characterized as a data flow-oriented design method because it provides a convenient transition from a data flow diagram (DFD) to software architecture. The transition is accomplished as part of a six-step process:
(1) the type of information flow is established;
(2) flow boundaries are indicated;
(3) the DFD is mapped into program structure;
(4) control hierarchy is defined;
(5) the resultant structure is refined using design measures and heuristics; and
(6) the architectural description is refined and elaborated.
5.1 Transform Flow
Recall the fundamental system model: information must enter and exit software in an “external world” form. Information enters the system along paths that transform external data into an internal form; these paths are identified as incoming flow. At the kernel of the software, a transition occurs: incoming data are passed through a transform center and begin to move along paths that now lead “out” of the software. Data moving along these paths are called outgoing flow.
5.2 Transaction Flow
The fundamental system model implies transform flow; therefore, it is possible to characterize all data flow in this category. However, information flow is often characterized by a single data item, called a transaction, that triggers other data flow along one of many paths. Transaction flow is characterized by data moving along an incoming path that converts external world information into a transaction. The transaction is evaluated and, based on its value, flow along one of many action paths is initiated.
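Transaction flow can be sketched as a dispatch table: the transaction center evaluates the incoming transaction and initiates flow along exactly one action path. The transaction names and handlers below are invented for illustration:

```python
# A minimal sketch of transaction flow: a transaction arrives along an
# incoming path, is evaluated at a transaction center, and flow is
# initiated along one of several action paths.

def handle_deposit(amount):
    return f"deposited {amount}"

def handle_withdrawal(amount):
    return f"withdrew {amount}"

ACTION_PATHS = {                     # the transaction center's dispatch table
    "deposit": handle_deposit,
    "withdraw": handle_withdrawal,
}

def transaction_center(transaction, amount):
    """Evaluate the transaction and initiate flow along one action path."""
    handler = ACTION_PATHS.get(transaction)
    if handler is None:
        raise ValueError(f"unknown transaction: {transaction}")
    return handler(amount)

print(transaction_center("deposit", 100))
# deposited 100
```

The contrast with transform flow is visible in the shape of the code: instead of data passing through every stage of a pipeline, a single value selects which one of many paths runs.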
Architectural design encompasses the initial set of design activities that lead to a complete design model of the software.