ASTM E3016-2018 Standard Guide for Establishing Confidence in Digital and Multimedia Evidence Forensic Results by Error Mitigation Analysis

Designation: E3016 – 18

Standard Guide for Establishing Confidence in Digital and Multimedia Evidence Forensic Results by Error Mitigation Analysis [1]

This standard is issued under the fixed designation E3016; the number immediately following the designation indicates the year of original adoption or, in the case of revision, the year of last revision. A number in parentheses indicates the year of last reapproval. A superscript epsilon (ε) indicates an editorial change since the last revision or reapproval.

1. Scope

1.1 This guide provides a process for recognizing and describing both errors and limitations associated with tools, techniques, and methods used to support digital and multimedia evidence forensics. This is accomplished by explaining how the concepts of errors and error rates should be addressed in digital and multimedia evidence forensics. It is important for practitioners and stakeholders to understand that digital and multimedia evidence forensic techniques and tools have known limitations, but those limitations differ from the errors and error rates of other forensic disciplines. This guide proposes that confidence in digital and multimedia evidence forensic results is best achieved by using an error mitigation analysis approach that focuses on recognizing potential sources of error and then applying techniques used to mitigate them, including trained and competent personnel using tested and validated methods and practices. Sources of error not directly related to tool usage are beyond the scope of this guide.

1.2 This international standard was developed in accordance with internationally recognized principles on standardization established in the Decision on Principles for the Development of International Standards, Guides and Recommendations issued by the World Trade Organization Technical Barriers to Trade (TBT) Committee.

2. Referenced Documents

2.1 ISO Standard: [2]
ISO/IEC 17025 General Requirements for the Competence of Testing and Calibration Laboratories

2.2 SWGDE Standards: [3]
SWGDE Model Quality Assurance Manual for Digital Evidence
SWGDE Standards and Controls Position Paper
SWGDE/SWGIT Proficiency Test Program Guidelines
SWGDE/SWGIT Guidelines […]

[…] however, they often struggle to establish their confidence on a scientific basis. Some forensic disciplines use an error rate to describe the chance of false positives, false negatives, or otherwise inaccurate results when determining whether two samples actually come from the same source. But in digital and multimedia evidence forensics, there are fundamental differences in the nature of many processes that can make trying to use statistical error rates inappropriate or misleading.

4.2 The key point to keep in mind is the difference between random errors and systematic errors. Random errors are characterized by error rates because they are based in natural processes and the inability to perfectly measure them. Systematic errors, in contrast, are caused by many different factors. In computer software, for example, an imperfect implementation can produce an incorrect result when a particular condition, usually unknown, is met. Because digital forensics is based on computer science, it is far more prone to systematic than to random errors.

4.3 Digital and multimedia forensics includes multiple tasks which, in turn, use multiple types of automated tools.

4.4 For each digital and multimedia evidence forensic tool, there is an underlying algorithm (how the task should be done) and an implementation of the algorithm (how the task is done in software by a tool). There can be different errors and error rates with both the algorithm and the implementation. For example, hash algorithms used to determine if two files are identical have an inherent false positive rate, but the rate is so small as to be essentially zero.
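To make the algorithm/implementation distinction in 4.4 concrete, the sketch below (an editorial illustration, not part of the ASTM text; the file names are placeholders) compares two files by their SHA-256 digests. The false positive rate inherent in the algorithm is negligible; any remaining risk of a wrong answer comes from the surrounding implementation.

    import hashlib
    from pathlib import Path

    def sha256_of(path: Path, chunk_size: int = 1 << 20) -> str:
        """Return the SHA-256 hex digest of a file, read in 1 MiB chunks."""
        digest = hashlib.sha256()
        with path.open("rb") as f:
            while chunk := f.read(chunk_size):
                digest.update(chunk)
        return digest.hexdigest()

    def files_identical(a: Path, b: Path) -> bool:
        """Treat files as identical when their digests match; collision odds are negligible."""
        return sha256_of(a) == sha256_of(b)

    if __name__ == "__main__":
        # Placeholder paths for an acquired image and a working copy.
        print(files_identical(Path("evidence.img"), Path("working_copy.img")))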

4.5 Once an algorithm is implemented in software, in addition to the inherent error rate of the algorithm, the implementation may introduce systematic errors that are not statistical in nature. Software errors manifest when some condition is present either in the data or in the execution environment. It is often misleading to try to characterize software errors in a statistical manner since such errors are not the result of variations in measurement or sampling. For example, the hashing software could be poorly written and may produce the same hash every time an input file starts with the symbol "$".
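The "$" example in 4.5 describes a systematic defect, which is found by targeted testing rather than captured by a rate. Below is a minimal sketch of such a test (editorial illustration; buggy_sha256 is an invented stand-in for a flawed tool, checked against Python's hashlib as the trusted reference).

    import hashlib

    def buggy_sha256(data: bytes) -> str:
        """Invented flawed implementation: mishandles inputs that start with '$'."""
        if data.startswith(b"$"):
            data = b"$"  # bug: everything after the leading '$' is ignored
        return hashlib.sha256(data).hexdigest()

    def test_against_reference() -> None:
        """Compare tool output with a trusted reference, including the triggering condition."""
        for sample in (b"plain data", b"$100", b"$receipt.txt", b"$"):
            expected = hashlib.sha256(sample).hexdigest()
            assert buggy_sha256(sample) == expected, f"hash mismatch for {sample!r}"

    if __name__ == "__main__":
        test_against_reference()  # fails on b"$100", exposing the systematic error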

4.6 The primary types of errors found in digital and multimedia evidence forensic tool implementations are:

4.6.1 Incompleteness – All the relevant information has not been acquired or found by the tool. For example, an acquisition might be incomplete, or not all relevant artifacts might be identified from a search.

4.6.2 Inaccuracy – The tool does not report accurate information. Specifically, the tool should not report things that are not there, should not group together unrelated items, and should not alter data in a way that changes the meaning. Assessment of accuracy in digital and multimedia evidence forensic tool implementations can be categorized as follows:

4.6.2.1 Existence – Are all reported artifacts reported as present actually present? For example, a faulty tool might add data that was not present in the original.

4.6.2.2 Alteration – Does a forensic tool alter data in a way that changes its meaning, such as updating an existing date-time stamp (for example, associated with a file or e-mail message) to the current date?

4.6.2.3 Association – Do all items associated together actually belong together? A faulty tool might incorrectly associate information pertaining to one item with a different, unrelated item. For instance, a tool might parse a web browser history file and incorrectly report that a web search on "how to murder your wife" was executed 75 times when in fact it was only executed once, while "history of Rome" (the next item in the history file) was executed 75 times; the tool erroneously associates the count for the second search with the first search (a minimal sketch of this failure mode follows 4.6.3).

4.6.2.4 Corruption – Does the forensic tool detect and compensate for missing and corrupted data? Missing or corrupt data can arise from many sources, such as bad sectors encountered during acquisition or incomplete deleted file recovery or file carving. For example, a missing piece of data from an incomplete carving of the above web history file could also produce the same incorrect association.

4.6.3 Misinterpretation – The results have been incorrectly understood. Misunderstandings of what certain information means can result from a lack of understanding of the underlying data or from ambiguities in the way digital and multimedia evidence forensic tools present information.
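The association error in 4.6.2.3 (and the corruption-induced variant in 4.6.2.4) can be pictured as an off-by-one defect in a parser. The sketch below is purely illustrative: the record layout and function names are invented, not taken from any real browser history format.

    # Invented record layout: (search_term, visit_count) pairs from a history file.
    records = [("how to murder your wife", 1), ("history of Rome", 75)]

    def parse_correct(recs):
        """Associate each term with its own visit count."""
        return {term: count for term, count in recs}

    def parse_buggy(recs):
        """Off-by-one association: each term receives the NEXT record's count."""
        return {recs[i][0]: recs[i + 1][1] for i in range(len(recs) - 1)}

    print(parse_correct(records))  # {'how to murder your wife': 1, 'history of Rome': 75}
    print(parse_buggy(records))    # {'how to murder your wife': 75} -- the wrong association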

4.7 The basic strategy to develop confidence in the digital and multimedia evidence forensic results is to identify likely sources of error and mitigate them. This is done by applying tool testing and sound quality control measures as described in this guide, including:

4.7.1 Tool Testing:
4.7.1.1 Determine applicable scenarios that have been considered in tool testing.
4.7.1.2 Assess known tool anomalies and how they apply to the current case.
4.7.1.3 Find untested scenarios that introduce uncertainty in tool results.

4.7.2 Sound Quality Control Procedures:
4.7.2.1 Tool performance verification (a minimal sketch follows this list).
4.7.2.2 Personnel training, certification, and regular proficiency testing.
4.7.2.3 Written procedures in accordance with applicable organizational quality assurance procedures.
4.7.2.4 Examinations should be documented utilizing applicable organizational quality procedures.
4.7.2.5 Document deviations/exceptions from standard operating procedures.
4.7.2.6 Laboratory accreditation.
4.7.2.7 Technical/peer review.
4.7.2.8 Technical and management oversight.
4.7.2.9 Use multiple tools and methods.
4.7.2.10 Maintain awareness of past and current problems.
4.7.2.11 Reasonableness and consistency of results for the case context.
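For item 4.7.2.1, one common form of tool performance verification is to run the tool against reference material with known correct answers and compare. The sketch below is an editorial illustration only: the manifest format, paths, and the idea of a JSON list of expected hashes are assumptions, not requirements of this guide.

    import hashlib
    import json
    from pathlib import Path

    def sha256_of(path: Path) -> str:
        return hashlib.sha256(path.read_bytes()).hexdigest()

    def verify_tool_output(output_dir: Path, manifest_path: Path) -> list:
        """Compare files produced by a tool with known-good SHA-256 values.

        The manifest is a hypothetical JSON object mapping file names to the
        digests expected from a reference data set.
        """
        expected = json.loads(manifest_path.read_text())
        problems = []
        for name, known_hash in expected.items():
            recovered = output_dir / name
            if not recovered.exists():
                problems.append(f"missing: {name}")          # incompleteness (4.6.1)
            elif sha256_of(recovered) != known_hash:
                problems.append(f"content differs: {name}")  # inaccuracy (4.6.2)
        return problems

    # Example call with placeholder paths:
    # verify_tool_output(Path("tool_output"), Path("reference_manifest.json"))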

4.8 A more formalized approach to handling potential sources of error in digital and multimedia evidence forensic processes is needed in order to address considerations such as those in Daubert.

4.9 The error mitigation analysis process involves recognizing sources of potential error, taking steps to mitigate any errors, and employing a quality assurance approach of continuous human oversight and improvement. Rather than focusing only on error rates, this more comprehensive approach takes into account all of the careful measures that can be taken to ensure that digital and multimedia evidence forensics processes produce reliable results. When error rates can be calculated, they can and should be included in the overall error mitigation analysis.
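The guide does not prescribe how an error mitigation analysis should be recorded. Purely as an illustration of the elements named in 4.9 (sources of potential error, mitigation steps, and any calculable error rate), one possible record structure might look like the following; every field name here is invented.

    from dataclasses import dataclass
    from typing import Optional

    @dataclass
    class ErrorMitigationEntry:
        """One row of a hypothetical error mitigation analysis record (not defined by the guide)."""
        error_source: str                   # e.g., "deleted file recovery on an SSD"
        error_type: str                     # incompleteness, inaccuracy, or misinterpretation (4.6)
        mitigation: str                     # e.g., "results cross-checked with a second tool"
        error_rate: Optional[float] = None  # included only when a rate can actually be calculated
        residual_risk: str = ""             # uncertainty remaining after mitigation

    example_entry = ErrorMitigationEntry(
        error_source="web history parsing",
        error_type="inaccuracy (association)",
        mitigation="counts verified with a second parser and manual review",
        residual_risk="none identified",
    )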

5. Procedures

5.1 Mitigating errors in a digital forensics process begins by answering the following questions:
5.1.1 Are the techniques (for example, hashing algorithms or string searching) used to process the evidence valid science?
5.1.2 Are the implementations of the techniques (for example, software or hardware tools) correct and appropriate for the environment where they are used?
5.1.3 Are the results of the tools interpreted correctly?

5.2 Considering each of these questions is critical to understanding errors in digital and multimedia evidence forensics. The next three sections explain the types of error associated with each question. In the first section, Techniques (5.3), the basic concept of error rates is addressed along with a discussion of how error rates depend on a stable population. The second section, Implementation of Techniques in Tools (5.4), addresses systematic errors and how tool testing is used to find these errors. The third section, Tool Usage and Interpreting Results (5.5), summarizes how practitioners use the results of digital and multimedia evidence forensic tools. This overall approach to handling errors in digital and multimedia evidence forensics helps address Daubert considerations.

5.3 Techniques – In computer science, the techniques that are the basis for digital processing include copying bits and the use of algorithms to search and manipulate data (for example, recover files). These techniques can sometimes be characterized with an error rate.

5.3.1 Error Rates – The explicit purpose of an error rate is to show how strong the technique is and what its limitations are. Many factors can influence an error rate, including uncertainties associated with physical measurements, algorithm weaknesses, statistical probabilities, and human error.

NOTE 1 – Systematic and Random Errors: Error rates for many procedures can be treated statistically; however, not all types of experimental uncertainty can be assessed by statistical analysis based on repeated measurements. For this reason, uncertainties are classified into two groups: the random uncertainties, which can be treated statistically, and the systematic uncertainties, which cannot. [4] The uncertainty of the results from software tools used in digital and multimedia evidence forensics is similar to the problems of measurement in that there may be both a random component (often from the underlying algorithm) and a systematic component (usually coming from the implementation).

5.3.1.1 Error rates are one of the factors described in Daubert to ascertain the quality of the science in expert testimony. [5] The underlying computer techniques are comparable to the type of science that is described in Daubert. Are the underlying techniques sound science or junk science? Are they used appropriately? In computer science, the types of techniques used are different from DNA analysis or trace chemical analysis. In those sciences, the technique or method is often used to establish an association between samples. These techniques require a measurement of the properties of the samples. Both the measurements of the samples and the associations have random errors and are well described by error rates.

5.3.1.2 Differences between digital and multimedia evidence and other forensic disciplines change how digital and multimedia evidence forensics uses error rates. There are error rates associated with some digital and multimedia evidence forensic techniques. For example, there are false positive rates for cryptographic hashing; however, the rate is so small as to be essentially zero. Similarly, many algorithms, such as copying bits, also have an error rate that is essentially zero. See Appendix X1, X1.2 and X1.3, for a discussion of error rates associated with hashing and copying.
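As a sense of scale for "essentially zero" in 5.3.1.2 and 4.4: for an ideal b-bit hash, the birthday-bound estimate of the chance that any two of k distinct files share a digest is roughly k(k-1)/2^(b+1). This is a general property of hash functions, not a figure from this guide; the short calculation below evaluates it for SHA-256 over one million files.

    from math import comb

    def collision_bound(num_items: int, hash_bits: int) -> float:
        """Birthday-bound estimate of any collision among num_items ideal hash values."""
        return comb(num_items, 2) / 2 ** hash_bits

    # One million distinct files hashed with 256-bit SHA-256:
    print(collision_bound(1_000_000, 256))  # roughly 4e-66, i.e., essentially zero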

5.3.2 Error Rates and Populations – There are other major differences between digital and multimedia evidence forensics and natural sciences-based forensic disciplines. In biology- and chemistry-based disciplines, the natural components of a sample remain fairly static (for example, blood, hair, cocaine). Basic biology and chemistry do not change (although new drugs are developed and new means of processing are created). In contrast, information technology changes constantly. New types of drives (for example, solid-state drives) and applications (for example, Facebook) may radically differ from previous ones. There are a virtually unlimited number of combinations of hardware, firmware, and software.

5.3.2.1 The rapid and significant changes in information technology lead to another significant difference. Error rates, as with other areas of statistics, require a "population." One of the key features of a statistical population is that it is stable, that is, the essential elements of the composition remain constant. This allows predictions to be made. Since IT changes quickly and unpredictably, it is often infeasible to statistically describe a population in a usable way because, while the description may reflect an average over the entire population, it may not be useful for individual situations. See Note 2 for an example of this.

NOTE 2 – Deleted File Recovery Example: File fragmentation is significant to the performance of deleted file recovery algorithms. In general, the more fragmented the files, the harder it is to recover the original files. For conventional (magnetic) hard drives, the amount of fragmentation was governed by the size of the hard drive (which changed rapidly as bigger drives were brought to market) and usage patterns (which also changed rapidly, such as storing large amounts of multimedia files or using new applications). The resulting complexity itself meant that it was very difficult to determine what performance could be expected for a given drive type or user. This then changed completely when solid-state drives (SSDs) were introduced and became popular. They no longer optimize performance by keeping files contiguous, but rather move files to prolong storage cell life. Additionally, the drive may "clean" deleted material. These kinds of paradigm shifts in IT are common and sometimes have unknown effects on forensic tools.

5.3.2.2 In examining these two differences, (1) the virtually infinite number of com…

[4] Taylor, John R., An Introduction to Error Analysis: The Study of Uncertainties in Physical Measurements, University Science Books, Sausalito, CA, 1997, p. 93.
[5] Daubert v. Merrell Dow Pharmaceuticals (92-102), 509 U.S. 579, 1993.

