Static And Dynamic Analysis Of Android Malware - SJSU


Static and Dynamic Analysis of Android Malware

Ankita Kapratwar, Fabio Di Troia and Mark Stamp
Department of Computer Science, San Jose State University, San Jose, U.S.A.

Keywords: Malware, Android, Static Analysis, Dynamic Analysis.

Abstract: Static analysis relies on features extracted without executing code, while dynamic analysis extracts features based on execution (or emulation). In general, static analysis is more efficient, while dynamic analysis can be more informative, particularly in cases where the code is obfuscated. Static analysis of an Android application can, for example, rely on features extracted from the manifest file or the Java bytecode, while dynamic analysis of such applications might deal with features involving dynamic code loading and system calls. In this research, we apply machine learning techniques to analyze the relative effectiveness of particular static and dynamic features for detecting Android malware. We also carefully analyze the robustness of the scoring techniques under consideration.

1 INTRODUCTION

According to a recent report by International Data Corporation, Android dominates the smartphone market, with a market share of 88.2% as of 2015 and more than 1.4 billion active Android phone users. This large market for smartphones has not gone unnoticed by cybercriminals (Spreitzenbarth, 2014). There are many third-party stores for Android applications, and it has become common for cybercriminals to repackage legitimate Android applications to include malicious payloads. Smartphone malware can come in many forms, including Trojans, botnets, and spyware. Such applications are created with malicious intent and can, for example, acquire a user's private data (Saudi, 2015).

Reports estimate that during the 2010 to 2014 timeframe, the number of mobile malware applications grew exponentially, and most of this malware targeted Android systems.
Figure 1 shows the increase in the number of total mobile malware applications and the share of these that are Android malware. According to a report by Kaspersky, there were 291,800 new mobile malware programs that emerged in the second quarter of 2015 alone, which is 2.8 times more than in the first quarter. In addition, there were one million mobile malware installation packages in the second quarter, which is seven times greater than the number in the first quarter.

Figure 1: Growth of Mobile Malware.

Due to this alarming increase in the number of Android malware applications, the analysis and detection of Android malware has become an important research topic. Many Android malware detection and classification techniques have been proposed and analyzed in the literature, some of which we briefly review later in this paper.

Kapratwar, A., Di Troia, F. and Stamp, M. Static and Dynamic Analysis of Android Malware. DOI: 10.5220/0006256706530662. In Proceedings of the 3rd International Conference on Information Systems Security and Privacy (ICISSP 2017), pages 653-662. ISBN: 978-989-758-209-7. Copyright 2017 by SCITEPRESS - Science and Technology Publications, Lda. All rights reserved.

ForSE 2017 - 1st International Workshop on FORmal methods for Security Engineering

To collect the features used to analyze malware, we can rely on static or dynamic analysis, or some combination thereof. Static analysis relies on features that are collected without executing the code. In contrast, for dynamic analysis we execute (or emulate) the code. Static analysis is usually more efficient, since no code execution is required. Dynamic analysis can be more informative, since we only analyze code that actually executes. However, with dynamic analysis we may not see all execution paths, which can limit our overall view of the code.

Static analysis of Android malware can rely on Java bytecode extracted by disassembling an application. The manifest file is also a source of information for static analysis. One specific disadvantage of such static analysis is that it is blind to dynamic code loading; that is, static analysis fails to deal with parts of the code that are downloaded during execution. In contrast, dynamic analysis can examine all code that is actually executed by an application.

In this paper, we consider Android malware detection based on static and dynamic features. The static features we consider are based on permissions extracted from the manifest file, while our dynamic analysis is based on system calls extracted at runtime. We analyze the effectiveness of these techniques individually and in combination. We also perform a robustness analysis, and carefully consider the interplay between the static and dynamic features.

This paper is organized as follows.
In Section 2, we discuss relevant background topics, including a brief overview of the Android operating system, a brief literature survey, and a high-level view of the machine learning techniques used in this research. Section 3 discusses the dataset used and our methodology for extracting static and dynamic features. Section 4 provides our experimental results. Finally, in Section 5 we give our conclusion and suggestions for future work.

2 BACKGROUND

In this section, we discuss relevant background topics. Our focus here is on previous related work, but we also give an overview of the Android OS, take a brief look at different types of Android malware from a high-level perspective, and discuss the various machine learning techniques that are used in our analysis.

2.1 Overview of Android OS

Figure 2 illustrates the Android software stack, where the items in green are written in C/C++, while the blue items are written in Java and executed using the Dalvik VM. The Android Linux kernel is a modified Linux kernel which includes wake locks, binder IPC drivers, and other features that play a critical role in a mobile embedded platform. The libraries play a role in optimizing CPU usage and memory consumption, and also contain the audio and video codecs for the device.

Figure 2: Android architecture (Abah, 2015).

The Android runtime layer consists of the Dalvik virtual machine and core Java libraries. During compilation of an Android application, the Java bytecode is converted into Dalvik bytecode using the dx tool, and the result is executed on the Dalvik virtual machine. The Dalvik virtual machine is more powerful than the Java Virtual Machine in terms of multitasking capabilities.

The application framework is an abstract layer used to develop applications that rely on the underlying reusable libraries and packages. Some major components of this layer include the following.

- The Activity Manager provides an interface for the users to interact with the applications.
- The Intent/Notification Manager deals with messaging objects to facilitate interprocess communication between components.
- The Content Manager provides an interface to connect data in one process with code running in another process.
- The Telephony Manager deals with telephony-related information, such as the International Mobile Station Equipment Identity (IMEI) number.

Applications are built on top of the application framework, which provides for interaction between users and the device. Applications are distributed as Android package (apk) files. An apk file is a signed zip archive that includes a classes.dex file, external libraries, and the AndroidManifest.xml. This manifest file describes the abilities or privileges granted to the application, and also provides information about various application components. For example, the activities, services, intents, and broadcast receivers must be declared in this xml file. For our purposes, the most important aspect of the manifest is that it contains a list of permissions, which allow the application to access certain device components. These permissions are explicitly granted by the user at install time.

2.2 Android Malware

Android malware applications primarily consist of Trojans. A typical Android Trojan might trick the user by using icons or user interfaces that mimic a benign application. Android Trojans often display a service level agreement during installation which obtains permissions to access a user's personal information, such as the phone number. The Trojan can then, for example, send SMSs to premium rate numbers in the background.

Android Trojans are also often used as spyware. Such malicious applications can gain access to a user's private information and send it to a private server.
The main purpose of such spyware is to steal information such as phone location, bank or credit card details, passwords, text messages, contacts, online browsing activity, and so on. A more sophisticated implementation might also include botnet capabilities.

2.3 Related Work

In the research by Feng, et al. (Feng, 2014), the authors develop Appopscopy, a semantic language based signature detection strategy for Android. In this approach, general signatures are created for each malware family. Signature matching is achieved using inter-component call graphs based on control flow properties. Further, the results are enhanced using static taint analysis. However, this approach appears to be fairly weak with respect to code obfuscation and dynamic code loading.

In the research by Fuchs, et al. (Fuchs, 2009), the authors analyze a tool that they call Scandroid. This scheme extracts features based on data flow. Zhou, et al. (Zhou, 2012a), analyze permissions and apply heuristic filtering to detect Android malware.

Abah, et al. (Abah, 2015), propose an approach that relies on a k-Nearest Neighbor classifier. The features collected include incoming and outgoing SMS and calls, device status, running applications and processes, and so on. In the research by Aung, et al. (Aung, 2013), the authors propose a framework that relies on machine learning algorithms based on features obtained from Android events and permissions.

Afonso, et al. (Afonso, 2015), propose a dynamic analysis technique that relies primarily on the frequency of system calls and API calls. The main drawback of this approach is that it can detect malware only in cases where the application meets a certain API level.

Taintdroid (Enck, 2014) is another dynamic analysis system. This approach analyzes network traffic to search for anomalous behavior.
Finally, Maline (Dimjasevic, 2015) is another dynamic detection tool based on Android system call analysis.

2.4 Machine Learning Algorithms

In this section, we briefly describe the categories of machine learning algorithms used in this research. For all of these algorithms, we have used the Weka implementation.

2.4.1 Random Forest

Decision trees are one of the simplest learning techniques. However, a decision tree tends to overfit the training data, since it is a literal interpretation of the data, and provides no generalization of the training set. To partially alleviate this problem, multiple decision trees can be used, where each is trained on a subset of the training data. A random forest takes this idea one step further by also training each tree on a subset of the features (Breiman, 2013). Although much of the inherent simplicity of decision trees is lost in this process, random forests have proved to be a very strong machine learning technique over a wide variety of applications.

2.4.2 J.48

The J.48 algorithm is based on a specific implementation of the decision tree algorithm known as C4.5 (Ruggieri, 2000). In this algorithm, a node for the tree is created by splitting the dataset, where the split with the highest information gain is chosen at each step.

2.4.3 Naïve Bayes

Naïve Bayes is a classic statistical discrimination technique, the key aspect of which is the assumption that all features are independent of each other. Although this is unlikely to be true in reality, it greatly simplifies the computations, and Naïve Bayes has proven highly successful in many applications.

2.4.4 Simple Logistic

Simple Logistic is an ensemble learning algorithm. To evaluate the base learners, this approach utilizes logistic regression (Shalizi, 2016), using simple regression functions. Similar to linear regression, it tries to find a function that will fit the training data well by computing the weights that maximize the log likelihood of the logistic regression function.

2.4.5 Sequential Minimal Optimization

The Sequential Minimal Optimization (SMO) technique is a specific implementation of Support Vector Machines (SVM) used in Weka. In SVM, the classification is determined based on a separator between two classes of labeled training data. In SVM, we maximize the "margin", i.e., the separation between the labeled training sets.
Another feature of SVM is the so-called kernel trick, where data is, in effect, mapped to a higher dimensional space; with more space to work in, it is likely to be much easier to separate the training data. The SMO classifier uses either a Gaussian or a polynomial kernel (Guptil, 2013).

2.4.6 IBk

The IBk algorithm is an example of a lazy learner. This instance-based learner saves all of the training samples and compares each test sample to the members of the training set until it finds the closest match. This algorithm is Weka's version of the well-known k-nearest neighbor classifier. The Weka implementation of IBk uses Euclidean distance as the default distance measure.

3 METHODOLOGY

This section describes the malware and benign datasets used in the project and the methodology used to extract features from the dataset. We also discuss various implementation details of our approach.

3.1 Datasets

Since there does not appear to be a standard Android benign dataset, we generated our own. Our benign dataset application files were collected from the Google Play Store, which is considered relatively unlikely to contain malware applications. Further, each benign application was classified as such using VirusTotal, a service which aggregates information from multiple antivirus engines, website scanners, and URL analyzers.

The malware dataset used in this research was acquired from the authors of Drebin (Arp, 2014). This dataset consists of applications obtained from various secondary Android markets, Android websites, malware forums, security blogs, and the Android Malgenome Project (Zhou, 2012b). Each element of the malware dataset was classified as malware based on results from VirusTotal. Table 1 gives the numbers of applications in our datasets.

Table 1: Datasets.

3.2 Feature Extraction

We extracted static and dynamic features. First, we discuss the static case, and then we turn our attention to the more complex dynamic case.

3.2.1 Static Analysis

As mentioned above, an Android application is in the form of an Android package, or apk, archive, which is a zip bundle. The apk archive includes the manifest, along with various other resources and folders. To extract the features of interest, we first need to reverse engineer the apk files, which we accomplished using the APK tool in VirusTotal.

The file AndroidManifest.xml contains several features that could possibly be used for static analysis. Here, we focus on the permissions requested by the application. The AndroidManifest.xml contains a list of all permissions required by the application. Android uses a proprietary binary xml format, so we designed our own custom parser for these xml files.

There are a total of 135 Android permissions. We construct a binary feature vector from the extracted permissions. We denote this feature vector as R = (r_1, r_2, ..., r_135), where

    r_i = 1 if the ith permission is present, and r_i = 0 otherwise.   (1)

Given an Android application, the following steps describe the process we use to extract the permission features.

1. Reverse engineer the Android application. This reverse engineering is achieved using the APK tool in VirusTotal.
2. Extract the permissions requested from the AndroidManifest.xml file using our custom xml parser.
3. Generate a binary feature vector, as in (1).
4. Finally, build a permission vector for each application in our dataset and store the results in an ARFF file.

Table 2: Permissions and Entropy Scores. The permissions listed are MOUNT UNMOUNT FILESYSTEMS, MANAGE DOCUMENTS, READ PHONE STATE, INSTALL LOCATION PROVIDER, SET WALLPAPER, VIBRATE, WRITE CALL LOG, WAKE LOCK, SET PREFERRED APPLICATIONS, and REQUEST IGNORE BATTERY OPTIMIZATIONS.

The information gain of each permission is calculated as

    gain(c, r_i) = entropy(c) - entropy(c | r_i)

where c is the label (i.e., either malware or benign) and r_i is the ith permission feature. Here, entropy(c) is the information entropy.
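To make the feature-selection step concrete, the gain(c, r_i) computation above can be sketched in Python. The toy labels and permission bits below are illustrative only, not drawn from the paper's dataset.

```python
import math

def entropy(labels):
    """Shannon entropy of a list of class labels (e.g. 'malware'/'benign')."""
    n = len(labels)
    counts = {}
    for y in labels:
        counts[y] = counts.get(y, 0) + 1
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

def info_gain(labels, feature):
    """gain(c, r_i) = entropy(c) - entropy(c | r_i) for one binary permission bit."""
    n = len(labels)
    conditional = 0.0
    for v in (0, 1):
        subset = [y for y, x in zip(labels, feature) if x == v]
        if subset:
            conditional += (len(subset) / n) * entropy(subset)
    return entropy(labels) - conditional

# Toy example: 4 apps, one permission bit each.
labels  = ["malware", "malware", "benign", "benign"]
feature = [1, 1, 0, 0]             # this permission perfectly separates the classes
print(info_gain(labels, feature))  # 1.0 bit
```

Ranking all 135 permission bits by this gain value, and keeping the top scorers, yields the reduced feature vector described below.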
Table 2 shows the list of the top ten permissions (with respect to information gain) and their corresponding information gain. Note that higher values indicate that more information is gained from the given attribute.

Of the 135 possible permissions, many were never requested in any of the Android applications in our datasets. These permissions were removed from consideration, since they contribute nothing to the analysis. Furthermore, some features (i.e., permissions) provide little or no useful information. Thus, to further reduce the length of our feature vectors, we have used feature selection based on a straightforward information gain calculation.

After eliminating permissions that never appeared and those that resulted in no information gain, we obtained a subset of 99 permissions. Further experiments enabled us to reduce these 99 non-redundant permissions. We found that using the top 87 permissions (with respect to information gain) we obtained the best results (based on the AUC, as discussed in Section 4.1, below), and hence we use a feature vector of length 87 in all experiments reported below.

For example, a reduced permissions vector from one of the files in our malware dataset is given by

0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0

As another example, a reduced permissions vector from our benign dataset is given by

0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 1, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 1, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0
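The binary vector construction of steps 1-4 can be sketched as follows, assuming the binary manifest has already been decoded to plain XML (e.g. by a tool such as apktool; the paper itself used the APK tool in VirusTotal and a custom parser). The short PERMISSIONS list is a hypothetical stand-in for the paper's 87 selected permissions.

```python
import xml.etree.ElementTree as ET

# Hypothetical reduced permission list (the paper uses the top 87 by information gain).
PERMISSIONS = ["android.permission.INTERNET",
               "android.permission.READ_PHONE_STATE",
               "android.permission.SEND_SMS"]

def permission_vector(manifest_xml):
    """Build the binary feature vector R = (r_1, ..., r_n) from a decoded
    AndroidManifest.xml string: r_i = 1 iff the i-th permission is requested."""
    root = ET.fromstring(manifest_xml)
    ns = "{http://schemas.android.com/apk/res/android}"
    requested = {e.get(ns + "name") for e in root.iter("uses-permission")}
    return [1 if p in requested else 0 for p in PERMISSIONS]

sample = """<manifest xmlns:android="http://schemas.android.com/apk/res/android">
  <uses-permission android:name="android.permission.INTERNET"/>
  <uses-permission android:name="android.permission.SEND_SMS"/>
</manifest>"""
print(permission_vector(sample))  # [1, 0, 1]
```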

3.2.2 Dynamic Analysis

As expected, an Android application interacts with the operating system through system calls. We have extracted system calls using dynamic analysis. To achieve this, we have made use of the Android emulator that is included with Android Studio. Each Android application in our dataset has been executed in a separate emulator, with the frequency of each system call recorded.

We connect to the emulator instance using the Android Debug Bridge (ADB), which is a command line tool found in the Android SDK. The ADB comes with a so-called Monkey Runner, which can be used to emulate random UI interactions. These events include clicks, volume interactions, touches, and so on, which trigger system calls. We record the resulting system calls using the monitoring tool Strace.

In detail, the emulation and data collection consist of the following steps.

1. Open the AVD Manager in Android Studio and click on Create New Device. This creates an emulator instance and runs it.
2. After the emulator is running, we open the terminal and navigate to the platform-tools folder of the Android SDK. There we enter adb help to verify that the ADB is working as expected.
3. Next, we issue the command adb devices, which lists the ID of the running emulator.
4. Assuming the Android application is named ApplicationName.apk, we give the command adb install ApplicationName.apk (via a batch file). At this point, we can verify that the application file has been installed in the emulator.
5. Next, we enter the emulator shell by typing adb -s emulator-5646 shell at the terminal.
6. We launch the application and check the process ID using the command ps package_name.
7. The command strace -p ProcessID -c -o path_in_emulator/Filename.csv begins the recording of system calls.
8. We start Monkey Runner using the command adb shell monkey -p package_name -v 500 -s 42. As mentioned above, this generates random events through the user interface. Simultaneously, Strace will record the frequency count of the system calls that are generated.
9. After the Monkey Runner instance stops, we extract the log file using the command adb pull path_in_emulator path_in_destination.

Of course, the precise sequence of system calls generated will vary, depending on the random selections made by the Monkey Runner. However, the frequency of the various system calls is relatively stable for a given application.

The frequency representation of system calls carries information about the behavior of the application (Burguera, 2011). A particular system call may be utilized more in a malicious application than in a benign application, and the system call frequency representation is intended to capture such information.

Let C = (c_1, c_2, ..., c_n) be the set of possible system calls available in the Android OS. Then element i in our system call feature vector contains the count of the number of occurrences of system call c_i. For example, such a system call vector extracted from one instantiation of one of our benign applications includes the entries ..., 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 426, 0, 0, 65.
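The conversion from the Strace log to a frequency vector can be sketched as below. The column layout of the `strace -c` summary table is an assumption based on common strace versions, and the short SYSCALLS list is a hypothetical stand-in for the full set C.

```python
def syscall_frequencies(strace_summary):
    """Parse the summary table produced by `strace -c` into a
    {syscall: call_count} map. Column layout assumed from common
    strace versions; adjust the field indices if yours differs."""
    freqs = {}
    for line in strace_summary.splitlines():
        parts = line.split()
        # Data rows look like: % time, seconds, usecs/call, calls, [errors,] syscall
        if (len(parts) >= 5 and parts[0].replace(".", "").isdigit()
                and parts[-1] != "total"):
            freqs[parts[-1]] = freqs.get(parts[-1], 0) + int(parts[3])
    return freqs

SYSCALLS = ["read", "write", "ioctl"]   # hypothetical ordering C = (c_1, ..., c_n)

def feature_vector(freqs):
    """Fixed-length frequency vector in the order of SYSCALLS."""
    return [freqs.get(c, 0) for c in SYSCALLS]

summary = """% time     seconds  usecs/call     calls    errors syscall
------ ----------- ----------- --------- --------- ----------------
 60.00    0.000600           5       120           read
 40.00    0.000400           4       100         2 write
------ ----------- ----------- --------- --------- ----------------
100.00    0.001000                   220         2 total
"""
print(feature_vector(syscall_frequencies(summary)))  # [120, 100, 0]
```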

Table 3: System Configurations.

4 EXPERIMENTS

We conducted several sets of experiments. First, we carried out experiments to compare the effectiveness of various machine learning algorithms in the Android malware detection context. Second, the effectiveness of classification based on the dynamic system call frequency data was analyzed. Third, the effectiveness of classification based on the static analysis of permissions data was evaluated. Finally, experiments were carried out based on combined permission and system call data. Furthermore, in each of the latter three cases, we carefully quantify the robustness of the scoring technique.

All experimental results given in this paper are based on 10-fold cross validation. That is, our malware set is randomly partitioned into 10 subsets, say, S_1, S_2, ..., S_10. Then subsets S_2 through S_10 are used for training, with subset S_1 and the benign set reserved for testing. This training and scoring process is repeated nine more times, with a different subset reserved for testing in each iteration. The scoring results from all 10 "folds" are accumulated and considered together as one experiment. Cross validation serves to reduce the effect of any bias in the data, and it also maximizes the number of scores obtained from a given dataset.

The system configuration used for all of the experiments reported in this paper is given in Table 3.

4.1 Evaluation Metric

To evaluate the success of our experiments, we rely on the area under the ROC curve (AUC). Given a scatterplot of scores for benign and malware cases, an ROC curve is a graph of the true positive rate (TPR) versus the false positive rate (FPR) as the threshold varies through the range of scores. An AUC of 1.0 indicates the ideal case, where there exists a threshold that completely separates the benign and malware scores, while an AUC of 0.5 indicates that the binary classifier is no better than flipping a coin.

Table 4: Comparison of Machine Learning Algorithms.
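The AUC can be computed directly from the two score populations without choosing any threshold, by counting correctly ranked (malware, benign) score pairs. A minimal sketch with made-up scores:

```python
def auc(benign_scores, malware_scores):
    """Area under the ROC curve, computed as the fraction of
    (malware, benign) score pairs ranked correctly (ties count half)."""
    wins = 0.0
    for m in malware_scores:
        for b in benign_scores:
            if m > b:
                wins += 1.0
            elif m == b:
                wins += 0.5
    return wins / (len(malware_scores) * len(benign_scores))

# Made-up classifier scores (higher = more malware-like).
benign  = [0.1, 0.2, 0.3, 0.4]
malware = [0.35, 0.6, 0.8, 0.9]
print(auc(benign, malware))  # 0.9375 -- one (0.35, 0.4) pair is misranked
```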
In general, the AUC can be interpreted as the probability that a randomly selected positive instance scores better than a randomly selected negative instance (Hand, 2001). One advantage of the AUC as compared to measuring accuracy is that no explicit thresholding is required when computing the AUC. In fact, the AUC takes all possible thresholds into account.

4.2 Results

In this section, we first compare various machine learning algorithms. Then we turn our attention to detailed analyses of detection based on static, dynamic, and combined feature sets.

4.2.1 Comparison of Machine Learning Algorithms

Table 4 shows the AUC values of different algorithms available in Weka, based on (dynamic) system calls and (static) permissions. This same information is given in the form of a bar graph in Figure 3.

Figure 3: AUC Comparison of Machine Learning Algorithms.

From these results, we see that a Random Forest with 100 trees gives the best results. Consequently, we use this algorithm in the remainder of the experiments reported in this paper.

4.2.2 System Calls and Permissions Analysis

To analyze system calls, we train on the dynamically extracted feature vector containing system call frequencies. The feature extraction process is described above in Section 3.2.2. For this experiment, we obtain an AUC of 0.884, which implies that the system call feature alone does not yield a particularly strong detection result.

We also evaluated our (static) permission feature in a similar manner. Recall that this feature extraction process is described in Section 3.2.1. In this case, we obtain an AUC of 0.972. This result is quite strong and shows that a fairly simple static feature can be used to detect Android malware with high accuracy.

4.2.3 Robustness Analysis

Next, we want to analyze the robustness of each of these scoring techniques, individually and in combination. Here, we mimic the effect of a malware developer who tries to make the permissions and system calls of Android malware look more similar to those of a benign application. Since the number of permissions and system calls tends to be much larger in malware applications, we analyze the robustness of our scoring techniques when these numbers are reduced in the malware applications.

The results in Figure 4(a) show the effect of reducing the number of permissions. The analogous results for system calls are given in Figure 4(b). As can be seen from Figure 4, reducing the number of system calls has a limited effect, while even a slight reduction in the number of permissions can have a large effect.

The static and dynamic features considered here can easily be combined, and hence it is important to analyze their robustness in combination. This experiment has been conducted, with the results given in the form of a 3-dimensional graph in Figure 5. From the results in Figure 5, we can clearly see that the interplay between permissions and system calls is somewhat more complex than might be expected from merely viewing the permissions and system calls independently, as in Figure 4.
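The robustness experiment above can be sketched as follows. This is a minimal illustration, not the paper's actual procedure: the hypothetical score here is simply the permission request count (standing in for the trained Random Forest), and the vector is made up.

```python
import random

def zero_out(vector, k, rng):
    """Clear k of the requested (1) permission bits, mimicking a malware
    author who drops permissions to look more benign."""
    v = list(vector)
    ones = [i for i, x in enumerate(v) if x == 1]
    for i in rng.sample(ones, min(k, len(ones))):
        v[i] = 0
    return v

# Toy score: malware tends to request more permissions, so use the raw
# request count as a stand-in for a trained classifier's score.
score = sum

rng = random.Random(42)
malware_vec = [1, 1, 0, 1, 1, 0, 1, 0]
for k in range(5):
    reduced = zero_out(malware_vec, k, rng)
    print(k, score(reduced))   # score drops as permissions are removed
```

Re-scoring the full malware set for each reduction level k, and recomputing the AUC against the unchanged benign set, produces curves like those in Figure 4(a).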
While it is necessary that the malware writer reduce the number of permissions, unless this is accompanied by a significant reduction in the number of system calls, fairly strong detection results can still be obtained in the combined case.

5 CONCLUSION AND FUTURE WORK

For Android malware detection, we have observed that a simple static feature based on permissions is significantly more informative than a dynamic feature based on system calls. This is, perhaps, somewhat surprising, since in much of the malware detection literature, system calls are treated as essentially the

Figure 4: Robustness of Permissions and System Calls Separately.

