GEIA SEB6-A-1990 System Safety Engineering in Software Development (Formerly TechAmerica SEB6-A)
EIA BULLETIN

System Safety Engineering in Software Development

ELECTRONIC INDUSTRIES ASSOCIATION ENGINEERING DEPARTMENT

Copyright Government Electronics

… and provide basic instructions, tools, and supporting data for use in performing such tasks and activities. This Bulletin addresses the system safety involvement, support, and evaluation of software developed for Department of Defense weapon systems in accordance with the process specified by DOD-STD-2167A, "Defense System Software Development." These system safety engineering activities will implement the requirements and intent of MIL-STD-882. Because software is generally unsuited to traditional hardware-oriented design hazard analysis techniques, system safety engineers must first ensure that safety requirements are properly included in the software specification documents.
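Checking that safety requirements are carried from the specification documents into downstream design documents lends itself to simple tooling. The sketch below is not from this Bulletin: the `SR-NNN` requirement-ID convention and the helper names are assumptions made purely for illustration of such a traceability check.

```python
import re

# Hypothetical safety-requirement ID format, e.g. "SR-001" (an assumption).
SR_ID = re.compile(r"\bSR-\d{3}\b")

def extract_ids(text: str) -> set:
    """Collect every safety-requirement ID mentioned in a document."""
    return set(SR_ID.findall(text))

def untraced_requirements(spec_text: str, design_texts: list) -> set:
    """Return the spec's requirement IDs that no design document references."""
    required = extract_ids(spec_text)
    covered = set()
    for doc in design_texts:
        covered |= extract_ids(doc)
    return required - covered

spec = "SR-001 inhibit launch without consent. SR-002 safe the missile test routine."
designs = ["Module A implements the SR-001 interlock logic."]
print(sorted(untraced_requirements(spec, designs)))  # → ['SR-002']
```

A report of untraced IDs gives the safety analyst a concrete list of specification requirements with no visible design coverage; it supplements, and does not replace, manual review.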
During preliminary and detailed design, system safety uses various tools, such as the system software hazardous effects analysis (SSHEA), to identify and document the results of software deficiencies and to ensure that adequate measures are taken to eliminate the deficiencies. The SSHEA, as a documentation and tracking tool, allows the system safety analyst to select the appropriate analysis techniques necessary to adequately identify and evaluate potential mishaps. … background information on system safety tasks and activities (e.g., plans, …)

[List of Figures, fragment: Safety-Critical Path; Missile Test Routine, Modified; RFP Response Flow; Contract Award to SDR Flow; SDR to SSR Flow; SSR to PDR Flow; PDR to CDR Flow; CDR to FCA/PCA Flow]

… AFR 122-10, "Systems Design and Evaluation Criteria for Nuclear Weapon Systems"; MIL-HDBK-255, "Safety, Design and Evaluation Criteria for Nuclear Weapon Systems"; and requirements established by the various national and military service test ranges and bases.

1.1.3 Application

The software safety concerns, considerations, processes, and methods discussed in
this Bulletin apply to all programs/projects which are involved in the development of safety-critical computer software components (SCCSCs), generally referred to as "safety-critical" software in this document, throughout all phases of the software development process described in DOD-STD-2167A (i.e., from System Requirements Analysis/Design through System Integration and Testing). The processes and techniques discussed are … command, control and communications systems; avionics; weapons release systems; space systems; and nuclear power plant systems).

As computers assume a more controlling role in safety-critical systems, they and their software become a source of unique hazards, replacing, and in some instances adding to, the potentially hazardous effects of human operator errors. Computers fail, just as other hardware does, and operator response to computer-generated data can be erroneous; but those similarities to non-computer hardware systems are not the problem. The problem is the way computers work. Computers are intended to operate only as instructed by their software. Software failures in the sense of hardware failures do not occur, but errors
in the design of the software and computer failure-induced perversions of software are possible. Software tells a computer system what to do, how to do it, and when to do it, and then monitors the system's response(s). Information on the system state (input) is received from the system (including other computers) and from system operators. Using stored logic, computers then translate the input into commands to system elements, thereby triggering other changes in the system state. These changes present the software with a different set of conditions to which it must respond with new message outputs. Monitoring of those changed messages by the computer starts the cycle again. This constantly evolving interplay of input, evaluation, response, monitoring, and changing input is called "real time" control.

The burgeoning use of computer controls for safety-critical functions in military and space system applications is a good example of the situation in which a system can simultaneously be both safer and yet less assuredly safe because of scientific advances. The increasing complexity of systems has placed demands upon human operators which even the most advanced use of biotechnology cannot satisfy. The speed required to receive, understand, react to, and monitor complex and rapid changes in state exceeds human capability, but is necessary in order to maintain real-time control. The development of high-speed computers therefore provides an improved capability of safely maintaining control of highly complex safety-critical systems. However, this same technological advance results in diminished credibility of mishap risk assessment in such systems, because the development of analytical and testing techniques to preclude computer-caused system mishaps has lagged the scientific advances in application of computer technology.

The reduced credibility of mishap risk analysis in computer-controlled systems results from the differences between the way systems without computers work, as opposed to those which use computers. In systems without computers, system safety analysts work with known, calculable, or postulated hardware failure rates and modes, failure effects, human errors, safety factors, and hardware interactions within a system and across system and facility interfaces. Although problems in the conduct of system safety analyses in such systems have grown in complexity with the introduction of new technology (e.g., new materials), consensus methods exist which permit high levels of assurance that …

… the one with a double asterisk (**) is from MIL-STD-882B.

* COMPUTER HARDWARE: Devices capable of accepting and storing computer data, executing a sequence of operations on computer data, or producing any output data (including control outputs).

* COMPUTER SOFTWARE (or SOFTWARE): A combination of associated computer instructions and computer data definitions required to enable the computer hardware to perform computational or control functions.

* COMPUTER SOFTWARE COMPONENT (CSC): A distinct part of a computer software configuration item (CSCI). CSCs may be further decomposed into other CSCs and Computer Software Units (CSUs).

* COMPUTER SOFTWARE CONFIGURATION ITEM (CSCI): Software that is designated by the procuring agency for configuration management.

* COMPUTER SOFTWARE UNIT (CSU): An element specified in the design of a Computer Software Component (CSC) that is separately testable.

* CONTRACTING AGENCY: The "contracting office" as defined in Federal Acquisition Regulation Subpart 2.1, or its designated representative. (In this document the term Managing Activity (MA) is used.)

FAULT INJECTION PROCESS: The process of deliberately inserting faults into a system (by manual or automatic methods) to test the ability of the system to safely handle the fault or to fail to a safe state. Usually, fault injection criteria are …
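The fault injection process defined above can be illustrated with a short sketch. The `ValveController` example and its fail-safe rule (command the safe CLOSED state on any faulty sensor input) are assumptions for illustration, not content of this Bulletin.

```python
# Minimal fault-injection sketch: deliberately insert faults (bad sensor
# readings) and verify that the system fails to a safe state.

SAFE_STATE = "CLOSED"

class ValveController:
    """Hypothetical controller: opens the valve only on a valid, in-range reading."""
    def command(self, pressure_reading):
        # Fail-safe rule: any fault (non-numeric or out-of-range input) -> CLOSED.
        if not isinstance(pressure_reading, (int, float)):
            return SAFE_STATE
        if not (0.0 <= pressure_reading <= 100.0):  # also rejects NaN
            return SAFE_STATE
        return "OPEN" if pressure_reading < 80.0 else SAFE_STATE

def inject_faults(controller, faults):
    """Drive the controller with deliberately faulty inputs; report unsafe outputs."""
    return [f for f in faults if controller.command(f) != SAFE_STATE]

faults = [None, "garbage", -5.0, 250.0, float("nan")]
print("unsafe responses:", inject_faults(ValveController(), faults))
```

An empty list of unsafe responses is evidence (for the injected faults only) that the system fails to its safe state; fault injection cannot prove the absence of other hazardous failure modes.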
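The "real time" control cycle described earlier (input, evaluation via stored logic, command output, monitoring of the changed state) can be sketched as a loop. The thermostat rule and function names below are illustrative assumptions, not part of this Bulletin.

```python
# Sketch of the real-time control cycle: receive system state (input),
# evaluate it against stored logic, issue a command (response), then
# monitor the changed state on the next pass.

def stored_logic(state):
    """Translate the current system state into a command (illustrative rule)."""
    return "heat_on" if state["temp"] < 20.0 else "heat_off"

def apply_command(state, command):
    """Stand-in for the physical system: the command changes the system state."""
    delta = 1.5 if command == "heat_on" else -0.5
    return {"temp": state["temp"] + delta}

state = {"temp": 17.0}
for cycle in range(4):
    command = stored_logic(state)          # evaluation -> response
    state = apply_command(state, command)  # system state changes
    print(f"cycle {cycle}: commanded {command}, temp now {state['temp']}")
    # monitoring of the changed state starts the cycle again
```

The hazard discussed in the text lives precisely in this loop: an error in the stored logic, or a corrupted input, propagates into commands that change the real system state before any human can intervene.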