
Process

Multiple Work Processes

FPoliSolutions uses multiple work processes to complete projects tailored to customer and regulatory requirements. We often stack work processes within a single project: for example, a development process such as IDOV is used to develop a conceptual solution, which is then implemented in a software product by following a separate software process such as Agile. FPoliSolutions offers off-the-shelf processes or can follow customer-specific processes.

Evaluation Model Development and Assessment Process (EMDAP)

A systematic approach to the analysis of complex systems that includes:

  • Identification of scenario, analysis purpose, scope, and figures of merit
  • Hierarchical breakdown of systems and processes
  • Two-tier review of available evaluation models (EM) (top-down and bottom-up)
  • Scaling and applicability analysis of test facilities and models
  • Development of EM with state-of-the-art modeling tools and algorithms
  • Detailed assessment of EM against prototypical experiments
  • Characterization of bias and uncertainty contributors
  • Statistical techniques for propagation of uncertainties and development of probabilistic statements (system performance, risks, tolerances, etc.)
  • Surrogate models (meta-models) for sensitivity studies (see the sketch below)

Consistent with U.S. NRC Regulatory Guide 1.203
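
As an illustration of the last two bullets only, the minimal Python sketch below fits a simple linear surrogate (meta-model) to samples of a hypothetical evaluation model and reads off sensitivity coefficients. The evaluation model, input distributions, and figure of merit are assumptions made for illustration, not an actual EM.

  import numpy as np

  rng = np.random.default_rng(seed=1)

  def evaluation_model(power, flow, temperature):
      """Hypothetical stand-in for an expensive evaluation model (EM) run."""
      return 600.0 + 0.8 * power - 0.5 * flow + 0.3 * temperature

  # Sample the uncertain inputs (assumed nominal values and uncertainty bands).
  n = 200
  X = np.column_stack([
      rng.normal(100.0, 5.0, n),   # power [%]
      rng.normal(80.0, 4.0, n),    # flow [%]
      rng.normal(550.0, 10.0, n),  # temperature [K]
  ])
  y = np.array([evaluation_model(*row) for row in X])

  # Fit a linear meta-model: y ~ b0 + b1*power + b2*flow + b3*temperature.
  A = np.column_stack([np.ones(n), X])
  coeffs, *_ = np.linalg.lstsq(A, y, rcond=None)

  # The fitted coefficients approximate the sensitivity of the figure of merit
  # to each input; the cheap meta-model can then stand in for the EM in large
  # uncertainty-propagation samples.
  for name, c in zip(["intercept", "power", "flow", "temperature"], coeffs):
      print(f"{name:12s} coefficient ~ {c:7.3f}")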


Commercial Grade Dedication (CGD) Process

FPoliSolutions applies its software procedures within a three-phased CGD process covering procurement, software verification, and software validation for nuclear software.

Phase I Software Needs Analysis and Selection

Needs Analysis – The first step in the software procurement process is to state the software needs. This is an iterative process: more detailed needs and applications are defined as the analytical methodologies are developed and mature over time, guiding corresponding revisions to the Software Requirements Specification (SRS).

Software Evaluation and Selection – A software evaluation team of end users and code developers is formed. This team evaluates the needs and development time frames and investigates candidate software. Commercially dedicating an item requires an in-depth understanding of the purpose and function of the item to be dedicated.

Phase II Technical Evaluation

A technical evaluation of the software, the source code, and the procedures/vendor that developed the software is performed at an in-depth technical level. The review focuses on the credibility of the organization and the quality of the software itself. The detailed technical review investigates the physical characteristics of the code (source code, platform); software performance characteristics (code models, features, correlations); and dependability characteristics (quality procedures used in software development, user community base, personnel qualification and training, software testing and update processes, quality of the documentation). Once the software is confirmed to meet the commercial-grade definition criteria, the team determines how the critical characteristics are to be verified and tested in Phase III.

Phase III Code Testing, Assessment and Acceptance

Phase III focuses on the assessment and testing needed to commercially dedicate the software. Four activities are performed in Phase III: Commercial Grade Survey, Receipt Inspection, Software Inspection, and Software Acceptance Testing. Phase III starts with a receipt inspection, in which the as-provided software is logged into the version control system. The software inspection then verifies the quality of the software products, including the computer code and associated documentation, against functional, quality, and regulatory requirements. The software test plan is developed following the Software Verification and Testing Procedures, which identify the test cases used for verification, the figures of merit, and the acceptance criteria. Software acceptance testing is then performed and documented in the acceptance test matrix for CGD.
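
As an illustration of an acceptance test matrix, the minimal Python sketch below compares computed results against reference values for a few hypothetical figures of merit. The test cases, values, and tolerances are illustrative assumptions, not actual CGD records.

  from dataclasses import dataclass

  @dataclass
  class AcceptanceTest:
      case_id: str
      figure_of_merit: str
      expected: float   # reference value from the verification basis
      computed: float   # value produced by the software under dedication
      tolerance: float  # acceptance criterion (absolute)

      def passed(self) -> bool:
          return abs(self.computed - self.expected) <= self.tolerance

  # Illustrative acceptance test matrix.
  test_matrix = [
      AcceptanceTest("TC-01", "peak cladding temperature [K]", 1350.0, 1347.2, 10.0),
      AcceptanceTest("TC-02", "break mass flow rate [kg/s]", 42.0, 41.1, 2.0),
      AcceptanceTest("TC-03", "minimum collapsed level [m]", 1.20, 1.23, 0.05),
  ]

  for t in test_matrix:
      status = "PASS" if t.passed() else "FAIL"
      print(f"{t.case_id}  {t.figure_of_merit:32s} {status}")

  # Every entry in the matrix must pass; any failure is dispositioned before
  # the software is accepted.
  all_pass = all(t.passed() for t in test_matrix)
  print("Acceptance test matrix result:", "ACCEPTED" if all_pass else "NOT ACCEPTED")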

FPoliSolutions' commercial grade dedication process was developed using U.S. regulations and nuclear industry guidance:

  • U.S. Regulations
    • 10 CFR 50, Appendix B, Criterion III, Design Control, and Criterion VII, Control of Purchased Material, Equipment, and Services
    • 10 CFR 21, which requires that a commercial-grade item be “dedicated” – the point in time at which the item becomes subject to reporting requirements
    • RG 1.203, “Transient and Accident Analysis Methods”
    • DG-1305, “Acceptance of Commercial-Grade Design and Analysis Computer Programs for Nuclear Power Plants”
  • Nuclear Industry Guidance
    • ASME NQA-1
    • EPRI NP-5652, “Guideline for the Acceptance of Commercial-Grade Items in Nuclear Safety-Related Applications”
    • EPRI 1025243, “Guideline for the Acceptance of Commercial-Grade Design and Analysis Computer Programs Used in Nuclear Safety-Related Applications”

Design for Six Sigma (DFSS) – I-D-O-V

Design for Six Sigma (DFSS) is a business-process management method related to traditional Six Sigma and is used in many industries to optimize processes. DFSS has several methodologies; FPoliSolutions uses the IDOV methodology when developing application methodologies. IDOV stands for Identify-Design-Optimize-Validate and organizes a project into four phases.

Identify Phase

The Identify phase begins the process by capturing the Voice of the Customer (VOC). This phase focuses on establishing a charter, gathering VOC, performing competitive analysis, and developing the Critical-to requirements, which define the features necessary to make the solution acceptable.

Design Phase

The Design phase focuses on requirements and on designing solutions that meet the Critical to Operations and Critical to Satisfaction requirements. Once the functional requirements are identified, design concepts and alternatives are evaluated against the Critical to Quality requirements.

Optimize Phase

The Optimize phase takes an ideal design solution and applies statistical analysis using tolerances and off-nominal conditions to create the detailed design elements that will meet all conditions.
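
As an illustration only, the minimal Python sketch below propagates assumed tolerances and off-nominal loading through a hypothetical performance model and reports the fraction of builds that meet a requirement. The model, nominal values, tolerances, and requirement are all illustrative assumptions.

  import numpy as np

  rng = np.random.default_rng(seed=7)
  n = 10_000

  # Design parameters: nominal value plus an assumed tolerance, treated here
  # as +/- 3-sigma bounds of a normal distribution.
  length = rng.normal(50.0, 0.2 / 3.0, n)    # mm
  diameter = rng.normal(10.0, 0.1 / 3.0, n)  # mm
  load = rng.normal(1.0, 0.05 / 3.0, n)      # kN, off-nominal loading

  def deflection(length, diameter, load):
      """Hypothetical performance model (cantilever tip deflection, E ~ 200 kN/mm^2)."""
      moment_of_inertia = np.pi * diameter**4 / 64.0
      return load * length**3 / (3.0 * 200.0 * moment_of_inertia)

  samples = deflection(length, diameter, load)
  requirement = 0.45  # mm, maximum allowed deflection (assumed)

  # Fraction of off-nominal builds whose detailed design still meets the requirement.
  yield_fraction = np.mean(samples <= requirement)
  print(f"Fraction of builds meeting the requirement: {yield_fraction:.4f}")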

Validate Phase

The Validate phase uses multiple forms of testing and validation to ensure the design meets the requirements and the Critical to Quality criteria.

The outcome of the project is a prototype together with test and validation criteria. This process is typically followed by an implementation phase handled by FPoliSolutions or the customer.

Scrum Project Management Process

Scrum is an Agile framework for completing complex projects. Scrum was originally formalized for software development projects, but it works well for any complex, innovative scope of work. The Scrum process follows these basic steps:

  • A product owner creates a prioritized wish list called a product backlog.
  • During sprint planning, the team pulls a small chunk from the top of that wish list, a sprint backlog, and decides how to implement those pieces.
  • The team has a certain amount of time — a sprint (usually two to four weeks) — to complete its work, but it meets each day to assess its progress (daily Scrum).
  • Along the way, the ScrumMaster keeps the team focused on its goal.
  • At the end of the sprint, the work should be potentially shippable: ready to hand to a customer, put on a store shelf, or show to a stakeholder.
  • The sprint ends with a sprint review and retrospective.

As the next sprint begins, the team chooses another chunk of the product backlog and begins working again.
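
As an illustration of the backlog mechanics above, the minimal Python sketch below models a prioritized product backlog and a simple sprint-planning step that pulls items from the top until the team's capacity is reached. The story names, point estimates, and capacity are illustrative assumptions.

  from dataclasses import dataclass

  @dataclass
  class BacklogItem:
      title: str
      priority: int  # lower number = higher priority (product owner's ranking)
      points: int    # team's effort estimate

  # Product owner maintains a prioritized wish list: the product backlog.
  product_backlog = [
      BacklogItem("Export results to CSV", priority=3, points=5),
      BacklogItem("User login", priority=1, points=8),
      BacklogItem("Plot comparison view", priority=2, points=5),
      BacklogItem("Dark mode", priority=4, points=3),
  ]

  def plan_sprint(backlog, capacity):
      """Pull items from the top of the backlog until the sprint is full."""
      sprint_backlog, used = [], 0
      for item in sorted(backlog, key=lambda i: i.priority):
          if used + item.points <= capacity:
              sprint_backlog.append(item)
              used += item.points
      return sprint_backlog

  # Sprint planning: the team commits to the chunk that fits its capacity.
  for item in plan_sprint(product_backlog, capacity=15):
      print(f"Sprint backlog: {item.title} ({item.points} pts)")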

Attribute Driven Design (ADD)

The ADD method is an approach to defining a software architecture in which the design process is based on the software’s quality attribute requirements. ADD follows a recursive design process that decomposes a system or system element by applying architectural tactics [Bass 03] and patterns that satisfy its driving requirements. ADD essentially follows a “Plan, Do, and Check” cycle:

  • Plan: Quality attributes and design constraints are considered to select which types of elements will be used in the architecture.
  • Do: Elements are instantiated to satisfy quality attribute requirements as well as functional requirements.
  • Check: The resulting design is analyzed to determine if the requirements are met.

This process is repeated until all architecturally significant requirements are met.
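
As an illustration of the cycle, the minimal Python sketch below walks a small, hypothetical element tree and applies Plan, Do, and Check at each level. The element names, drivers, and tactic catalog are assumptions for illustration, not a complete ADD implementation.

  # Hypothetical catalog mapping quality-attribute drivers to architectural tactics.
  TACTIC_CATALOG = {
      "availability": "active redundancy",
      "performance": "introduce concurrency",
      "modifiability": "use an intermediary",
      "security": "authenticate actors",
  }

  # Hypothetical decomposition: each element and the drivers it must satisfy.
  ELEMENT_TREE = {
      "system": {
          "drivers": ["availability", "performance"],
          "children": {
              "data_layer": {"drivers": ["modifiability"], "children": {}},
              "api_layer": {"drivers": ["security", "performance"], "children": {}},
          },
      }
  }

  def add_cycle(name, node, depth=0):
      """Apply Plan / Do / Check recursively to each element of the architecture."""
      indent = "  " * depth

      # Plan: choose element types and tactics for this element's drivers.
      plan = {d: TACTIC_CATALOG[d] for d in node["drivers"]}
      print(f"{indent}Plan  {name}: {plan}")

      # Do: instantiate the element (here, just record the chosen tactics).
      print(f"{indent}Do    instantiate '{name}' applying {sorted(plan.values())}")

      # Check: confirm every driver has a tactic, then repeat the cycle on the
      # child elements until all architecturally significant requirements are met.
      assert all(d in plan for d in node["drivers"]), f"unmet drivers in {name}"
      for child_name, child in node["children"].items():
          add_cycle(child_name, child, depth + 1)

  for root_name, root in ELEMENT_TREE.items():
      add_cycle(root_name, root)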