Program for 2019 International Conference on Platform Technology and Service (PlatCon)
Sunday, January 27
Sunday, January 27 11:00 - 12:00
Local Arrangement Meeting
Sunday, January 27 13:30 - 15:30
ICRP Steering Meeting
Sunday, January 27 16:00 - 18:00
Conference Committees' Meeting (Steering Committee / Organizing Committee / Program Committee)
Monday, January 28
Monday, January 28 9:00 - 10:00
Monday, January 28 10:00 - 11:20
1-A: Convergence & Human Media Platform
- Formal Specification Technique in Smart Contract Verification
- Blockchain technology is changing rapidly. A blockchain guarantees the integrity of the ledger through a specific consensus among participants. In the past, blockchain technology had a limited range of applications; however, its use is gradually expanding as smart contracts, which can express general business logic, attract attention. Previous studies have examined the components of smart contracts, proposed the possibility of extending them on the basis of ontology, and carried out research on securing the traceability of ontology-based smart contracts. However, research on the various transactions that constitute smart contracts is lacking. In this paper, the constituent elements of a smart contract are analyzed and expressed as an ontology, and the process of negotiating those components is represented by individual transactions. Finally, we express the components represented by the ontology in XML, including the state information in each transaction. In this way, the smart contract is represented in a formal language that contains state information, laying the foundation for smart contracts that can be reused and verified.
- A Study on the Needs for Enhancement of Personal Information Protection in Cloud Computing Security Certification System
- Recently, the development of cloud computing technology has provided efficient access to information, but issues with personal information security are emerging and hindering the spread of the cloud industry. In response, a cloud security certification system has been implemented in Korea to solve various cloud security problems. However, it is difficult to say that the technical protection measures of the cloud security certification system satisfy all of the technical protection measures required by personal information protection laws and regulations. Accordingly, it has been proposed that security products closely related to the personal information protection laws and regulations should be in line with the CC (Common Criteria) certified product group. In this paper, we propose items that the cloud security certification system should supplement from the viewpoint of security requirements that comply with the CC certified product line.
- TAPS: Trust-based Access Control and Protect System
- Current network access control solutions focus only on setting up connection control at each gateway or VPN device and on monitoring the status of each device. In addition, because each secure connection between network access nodes must be configured individually, the target node to be protected is likely to be exposed to malicious attacks. Therefore, in this paper, we propose a trust-based network access control and protect system (TAPS) that enables the software-driven definition and control of a network tunnel for protection between devices and services. The proposed system grants access to specified services only to mutually authorized devices by predefining and mapping a list of devices and services. In addition, data traffic between each device and service is encrypted according to its purpose, making it resistant to a variety of cyber attacks.
- Multi-Label Bias-Based Predictor
- A recommender system predicts a user's future preference for a set of items by computing the similarity between users or items, and recommends the top items based on the prediction. Collaborative filtering, the most popular approach to building recommender systems, has been successfully employed in many commercial and non-commercial applications but has reached the limit of its performance growth. A novel way to predict user preferences is needed to overcome this problem. The bias-based predictor (BBP) is a prediction model for recommender systems that assumes that bias and preference are highly correlated. In predictive models, bias generally means the intercept where a fitted line crosses the y-axis, offsetting all predictions we make. Users' preference ratings are likely to be influenced by biases in many different aspects. This implies that bias can play a more significant role in preference prediction than simply offsetting the y-intercept. This paper proposes a prediction model for recommender systems that takes bias in multiple aspects into account. The proposed model, called the multi-label bias-based predictor (MLBBP), extends the conventional BBP to allow a more in-depth analysis of bias. Through experiments with movie data, it was demonstrated that MLBBP performs better than BBP. The trained MLBBP model produces numerical data that can explain why a specific item is recommended to a user.
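The abstract does not give the MLBBP formulas, but the core idea of predicting from several bias terms can be sketched as follows. The baseline-style decomposition (global mean plus user bias plus label bias), the helper names, and the use of movie genres as the "labels" are all assumptions for illustration, not the authors' actual model.

```python
from collections import defaultdict

def fit_multi_bias(ratings, item_labels):
    """Fit a simple multi-aspect bias model (hypothetical sketch).

    ratings: list of (user, item, rating) triples
    item_labels: dict mapping item -> list of labels (e.g. movie genres)
    """
    mu = sum(r for _, _, r in ratings) / len(ratings)  # global mean rating

    # Collect residuals from the global mean, per user and per label.
    by_user, by_label = defaultdict(list), defaultdict(list)
    for u, i, r in ratings:
        by_user[u].append(r - mu)
        for lab in item_labels.get(i, []):
            by_label[lab].append(r - mu)

    b_user = {u: sum(v) / len(v) for u, v in by_user.items()}
    b_label = {l: sum(v) / len(v) for l, v in by_label.items()}
    return mu, b_user, b_label

def predict(mu, b_user, b_label, user, item, item_labels):
    """Predict a rating as global mean + user bias + mean label bias."""
    labels = item_labels.get(item, [])
    lab = sum(b_label.get(l, 0.0) for l in labels) / len(labels) if labels else 0.0
    return mu + b_user.get(user, 0.0) + lab
```

Because each bias term is a named, inspectable number, such a model can also explain a recommendation, as the abstract claims for MLBBP.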
1-B: Computing Platform
- ICMPv6SD: A Compact Service Discovery Protocol Supporting Plug-and-Play in Home Networks
- UPnP (Universal Plug and Play) is popular in home networks because it provides general service management schemes for heterogeneous hardware and software based on open standards. Meanwhile, the rise of the Internet of Things (IoT) has caused rapid growth in the number of small or embedded devices, which makes IPv4 address exhaustion even worse. Hence, IPv6 has been widely adopted in recent years. It is worth noting that IPv6 also improves the design of the family of IP-based management protocols, namely ICMPv6, so that transmission efficiency is significantly improved. Following this trend, UPnP also supports IPv6, but this extension concentrates on address compatibility and neglects to utilize the improved design in IPv6. In light of this, this paper proposes a new service discovery protocol based on ICMPv6 that makes the service discovery of UPnP more compact and efficient. We also develop a monitoring tool for developers to troubleshoot problems with the proposed protocol. We hope this enhancement can make the deployment of IoT in the smart home network more scalable and robust.
- Communication-Aware Scheduling for Malleable Tasks
- Task scheduling is an important process in the design of multicore computing systems. This paper presents methods for scheduling malleable tasks. The scheduling methods simultaneously decide both the execution order of the tasks and the number of cores assigned to each task. Unlike previous work on malleable task scheduling, this paper takes the inter-task communication overhead into account during scheduling. Communication is necessary if the main thread of a predecessor task and that of a successor task are assigned to different cores. This paper proposes two methods for malleable task scheduling with communication overhead. One is a two-step method that schedules tasks first and then assigns the tasks' threads onto cores. The other is an integrated method that performs task scheduling and thread assignment simultaneously. Both methods are based on integer linear programming (ILP). The proposed methods are evaluated through experiments, and encouraging results are obtained.
- Fine-Grained Plant Identification Using Wide and Deep Learning Model
- In recent years, with the evolution of deep learning technology, the performance of plant image recognition has improved remarkably. In this paper, we propose a model to address the fine-grained plant image classification task using the wide and deep learning framework, which combines a linear model and a deep learning model. The proposed method sums the results of the wide and deep models through a logistic function so that discrete features can be considered simultaneously with continuous image content. Our work uses metadata such as the date of flowering and location information for the wide model. Our experiments show that the proposed method gives better performance than a baseline method.
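The combination step described above (summing the wide and deep outputs and passing them through a logistic function) can be sketched as below. The function names and example logit values are illustrative assumptions, not the paper's implementation.

```python
import math

def sigmoid(x):
    """Logistic function mapping a real-valued score to a probability."""
    return 1.0 / (1.0 + math.exp(-x))

def wide_and_deep_prob(wide_logit, deep_logit):
    """Combine the wide model's logit (from discrete metadata such as
    flowering date and location) with the deep model's logit (from image
    content) by summation, then squash through the logistic function."""
    return sigmoid(wide_logit + deep_logit)
```

In a multi-class setting the same idea applies per class with a softmax in place of the sigmoid; the sketch uses the binary case for brevity.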
- System Design for Automatic Laundry Organization
- In this paper, a system for automated laundry organization was designed to make the organization of laundry, which occupies the largest portion of household work, more efficient and convenient. This automation system for folding dry laundry is designed to automatically handle dry laundry according to each folding method and to manage laundry through a laundry management function. This system is expected to relieve the mental and physical stress of housewives, dual-income couples, and single-person households regarding household duties and organization, and through it, economic development of the smart appliance market is expected.
Monday, January 28 11:20 - 13:30
Coffee Break & Lunch
Monday, January 28 13:30 - 14:20
Opening Remarks & Keynote Speech 1 (Emerald B (B2))
We are blessed with sophisticated technological artifacts that are enriching our daily lives and society. It is believed that the future Internet will provide us with a framework to integrate, control, or operate virtually any device, appliance, monitoring system, or infrastructure. Industry 4.0 is the current trend of automation and data exchange in manufacturing technologies, which also includes a close integration of cyber-physical systems, the Internet of Things, and cloud computing. In this talk, the concept of Industry 4.0 will be presented, and various research challenges from several application perspectives will be illustrated. Some real-world applications involving the analysis of complex data will be the key focus.
Monday, January 28 14:20 - 14:40
Monday, January 28 14:40 - 16:00
2-A: SCA 2019
- A Study on the Application of Singularity Container in HTCondor Scheduler
- We have studied the use of HTCondor's built-in Singularity container support to verify that the work pipeline is useful for computing-centric job processing. Results using the HPCG benchmark program did not show significant performance degradation when processing computing-centric tasks with the HTCondor-Singularity combination. Therefore, using Singularity has proven very useful when processing CPU-intensive tasks.
- A Study of Host Access Control System for IPv6-only Large-scale Data Center
- In recent years, large-scale data centers have been adopting Internet Protocol version 6 (IPv6) due to exhaustion of the IPv4 address space. The Internet Engineering Task Force (IETF) developed IPv6 in 1998 to overcome the limitations of IPv4. IPv6 not only has a large address space but also has advantages over IPv4, such as Stateless Address Autoconfiguration (SLAAC), extension headers, IPsec, Quality of Service, and mobility. In particular, SLAAC allows a host to generate and configure its own IPv6 address without any manual setting. However, SLAAC has drawbacks with respect to host access control in large-scale data centers. Due to these drawbacks, an unauthorized host can connect to the IPv6 network in a large-scale data center. To solve this issue, this paper describes host access control vulnerabilities and a solution to overcome them. Our proposed solution provides a system for collecting host information and blocking unauthorized hosts using ICMPv6. Analysis shows that the proposed system can correctly manage access control of IPv6 hosts in large-scale data centers.
- Multi-experiment Support Through HTCondor Scheduling Policy Integrated Pool Configuration
- High-performance computing power is required to process large amounts of data. Cluster technology is used for high-performance computing, and batch programs are used for task scheduling. We build an integrated pool that supports multiple experiments using the batch program HTCondor. Through HTCondor's scheduling policy, group quotas are specified, and a preemption option is added so that resources are allocated flexibly while each group is still guaranteed its minimum allocation. Pools using 400 cores and 1656 cores, respectively, were combined into an integrated pool of 2136 cores in which idle resources are shared, and the utilization of computing resources was increased through flexible resource allocation.
- Enabling Traceability on Grid Job Payload Based on Messaging System for Local Compute Cluster
- At KISTI, we operate a WLCG Tier-1 data center for the ALICE experiment. Grid computing activities are monitored centrally by the experiment. Although this provides useful information such as running/done/error jobs, storage usage, and incoming/outgoing network traffic, no information related to the owner and payload of each grid job is presented. We conducted a study on how to enable traceability by processing grid job payload logs distributed across hundreds of worker nodes in our facility. To capture, publish, and consume logs with a size of a few terabytes in real time, we considered an efficient messaging system in the design. We present a design for messaging-system-based grid job monitoring and discuss its possible future usage.
2-B: SESIS 2019
- Deception Tree Model for Cyber Operation
- Modern cyber operations are evolving from direct attacks and defense to complex cyber operations that involve deception. As deception is included in cyber attacks and defenses, deception elements should be identified in order to respond to cyber operations. If appropriate countermeasures can be taken against identified deception elements, a strategic advantage can be gained in cyberspace. Related cyber deception research includes developing response tools against attackers from a defensive standpoint and developing attack techniques that exploit human cognitive vulnerabilities. Other research has classified deception tools according to their purposes and has studied procedures for carrying out deception effectively. However, existing studies neither consider specific deception objectives nor classify deception in complex cyber operations. Classifying deception in cyber operations requires dividing cyberspace into physical, logical, and persona layers; identifying the targets of cyber operations, from machines to humans; and identifying deception procedures, from TTPs to objectives. In response, this paper proposes a "deception tree model" that categorizes deception from the perspective of deceitful cyber TTPs. The deception tree model can distinguish human and machine targets in terms of attack and defense and systematically establish the effects, tactics, techniques, and procedures for selected targets. Three cases were applied and analyzed to verify the performance of the deception tree model: the cyber incident that occurred at KHNP in 2014, in which a deceitful attack was conducted on humans; the use of honeynet technology to deceive an attacker; and the use of anti-ransomware technology to deceive malware.
- Analysis of IoT Platform Security: A Survey
- As the IoT (Internet of Things) emerges as the next-generation growth engine leading the IT (information technology) industry, many developed countries and organizations are developing IoT-related technologies to preoccupy the IoT market. Among the core technologies that make up the IoT, the platform is attracting attention as one of the most promising because it can have a huge impact on future devices; at the same time, concerns about platform security are also increasing. In response, Korean (domestic) and overseas IoT platforms adopt diverse security technologies according to their purpose and development environment. Most domestic IoT platforms are developed on top of platforms undergoing standardization, such as oneM2M and OCF, whereas overseas IoT platforms were developed in proprietary ways and likewise adopted their own security methods. However, even though the security methods of domestic and overseas IoT platforms differ distinctly, research comparing and analyzing their security elements is insufficient. Therefore, in this paper, we analyze and compare the security elements of domestic and overseas IoT platforms so that more secure domestic IoT growth can be achieved. Finally, we propose a direction for the development of future IoT platform security.
- Interoperable OAuth 2.0 Framework
- The Internet of Things (IoT) is becoming a key paradigm in our everyday life. However, interoperability, which is essential for sharing services and resources across heterogeneous IoT domains, is difficult to achieve since some technologies are designed without considering it. The open authorization (OAuth) 2.0 framework, used in IoT as well as conventional web environments, also did not consider interoperability in spite of its wide usage. In other words, systems cannot interoperate without other alternatives even though they implement the same OAuth 2.0 framework standard. We therefore propose the interoperable OAuth 2.0 framework (IOAF). IOAF provides an additional authorization layer to support common authorization server capabilities, and supports four extended authorization grant types used to obtain an interoperable access token (IAT), which has global scope across multiple domains. This paper presents the main requirements and benefits of IOAF and describes the detailed flow of each authorization grant type.
- Efficient Vulnerability Management Process in the Military
- Reducing vulnerabilities is one of the most effective ways to minimize the risks that information systems face. Given the characteristics of the military environment, particularly the operation of a wide variety of information systems and the handling of information critical to national security, clear and concise management procedures are needed that enable more realistic and direct action to identify and address vulnerabilities. This paper suggests five phases for the vulnerability management process in the military: (i) Framing; (ii) Identification; (iii) Assessment; (iv) Remediation; and (v) Verification. In addition, the three-tiered concept was applied to manage vulnerabilities efficiently, taking into account the characteristics of an organization with clear hierarchical relationships. By presenting specific procedures for vulnerability management in each hierarchical organization, this work will contribute to reducing cyber risk in the defense domain.
- Security Evaluation Methodology for Software Defined Network Solutions
- Software Defined Networking (SDN) has introduced both innovative opportunities and additional risks in computer networking. Among the disadvantages of SDNs is their susceptibility to vulnerabilities associated with both virtualization and traditional networking. Selecting a proper controller for an organization may not be a trivial task, as there is a variety of SDN controllers on the market, and each may come with its own pros and cons from a security point of view. This research proposes a comprehensive methodology for organizations to evaluate the security-related features available in SDN controllers. The methodology can serve as a guideline for decisions related to SDN choice. The proposed security assessment follows a structured approach to evaluate each layer of the SDN architecture, and each metric defined in this research has been matched to the security controls defined in NIST SP 800-53. Through tests on actual controllers, the paper provides an example of how the proposed methodology can be used to evaluate existing SDN solutions.
Monday, January 28 16:00 - 16:20
Monday, January 28 16:20 - 17:40
3-A: SCA 2019
- Design and Implementation of an Efficient File Transmission Protocol
- In this work, we propose a file transmission protocol for large-scale data storage systems. Our scheme improves resource utilization, such as network bandwidth and storage space, by streaming files. With the proposed protocol, we packetize the metadata and data of multiple files into a single stream. We implement our protocol on a system consisting of two different nodes. The experimental results show that our protocol reduces transfer time while increasing network bandwidth utilization.
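The abstract does not specify the on-wire format, but packetizing the metadata and data of multiple files into one stream might be sketched as below. The length-prefixed framing (2-byte name length, 4-byte data length) is an assumption for illustration, not the authors' protocol.

```python
import struct

def pack_files(files):
    """Pack multiple (name, data) pairs into one byte stream.

    Hypothetical framing per file:
      [2-byte name length][name bytes][4-byte data length][data bytes]
    """
    stream = bytearray()
    for name, data in files:
        nb = name.encode("utf-8")
        stream += struct.pack(">H", len(nb)) + nb
        stream += struct.pack(">I", len(data)) + data
    return bytes(stream)

def unpack_files(stream):
    """Recover the (name, data) pairs from a packed stream."""
    files, off = [], 0
    while off < len(stream):
        (nlen,) = struct.unpack_from(">H", stream, off); off += 2
        name = stream[off:off + nlen].decode("utf-8"); off += nlen
        (dlen,) = struct.unpack_from(">I", stream, off); off += 4
        files.append((name, stream[off:off + dlen])); off += dlen
    return files
```

Sending many small files as one contiguous stream like this avoids per-file connection setup, which is one plausible source of the bandwidth gains the abstract reports.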
- Multi Population Memetic Search for Effective Multi-label Feature Selection
- Population-based multi-label feature selection methods can significantly improve the accuracy of a multi-label classifier. However, unexpected bias with regard to the number of features may occur when a population-based evolutionary algorithm is employed to search for promising feature subsets or solutions. In this study, we present a multi-population memetic search that is able to circumvent this unexpected bias, resulting in improved search capability. The proposed method divides the solution space into a series of sub-spaces according to the number of features and assigns a sub-population to search each sub-space. In addition, we introduce a local refinement process to guide populations that are composed of very few members. Experimental results show the effectiveness of our strategy in improving classification accuracy compared to other memetic-based or multi-population-based algorithms, which confirms the ability of our algorithm to search feature subsets.
- Computer-Assisted Pronunciation Training for Correcting Vague Pronunciation
- A computer-assisted pronunciation training system is a useful application for improving the pronunciation skills of beginner-level users. In this paper, we propose a new computer-assisted training system that aims at improving the pronunciation of phonemes that users pronounce vaguely. The proposed method is implemented in a system that recommends a set of words based on the phonemes that were pronounced vaguely. To detect vague pronunciation, the system employs the concept of entropy. Experimental results showed that the proposed method improves users' pronunciation skills.
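The abstract says vague pronunciation is detected with entropy. One plausible reading, sketched below as an assumption rather than the authors' method, is that a flat (high-entropy) recognizer posterior over candidate phones signals a vaguely pronounced phoneme.

```python
import math

def phoneme_entropy(posteriors):
    """Shannon entropy (bits) of a posterior distribution over candidate
    phones. A flat, high-entropy posterior suggests vague pronunciation."""
    return -sum(p * math.log2(p) for p in posteriors if p > 0)

def vague_phonemes(phoneme_posteriors, threshold=1.0):
    """Return phonemes whose posterior entropy exceeds the threshold.

    phoneme_posteriors: dict mapping phoneme -> posterior distribution
    (hypothetical input shape; the threshold value is illustrative).
    """
    return [ph for ph, dist in phoneme_posteriors.items()
            if phoneme_entropy(dist) > threshold]
```

The flagged phonemes could then drive the word recommendation step the abstract describes.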
- Improving Hybrid Memory Usages Through Bandwidth-aware Data Migration Methodology
- Hybrid memory is a promising memory technology that contains different types of memory devices with different characteristics regarding access time, retention time, and capacity. However, employing hybrid memories to increase performance also induces more complexity. In this paper, we propose a data migration methodology called BDM (Bandwidth-aware Data Migration) to use hybrid memories effectively, targeting the Intel Knights Landing (KNL) processor. BDM monitors the status of applications running on a system and migrates the pages of selected applications to the High Bandwidth Memory (HBM). BDM selects applications whose bandwidth usage is evenly distributed among threads. Experimental results show that BDM improves execution time over the baseline by 20% on average.
3-B: FSP 2019
- Simulation System of a Reservation-Based 3D Intersection Crossing Scheme for UAVs
- This paper presents the simulation system of a three-dimensional intersection crossing model. In this model, multiple UAVs must access the intersection without colliding with each other. The system uses reservation-based scheduling to manage the entrance of UAVs into the intersection so that there will be no collision among UAVs travelling in the intersection at the same time. It also finds the path through which the UAV will reach the exit of the intersection in the shortest time. The simulation system is implemented using object-oriented programming. UML diagrams are used to describe the simulation system's structure. Along with the UML diagrams, we discuss the classes, attributes, methods, and relationships of the classes in the simulation.
- Dynamic Frame Length Adaptation for Anti-Collision Management in Wi-Fi Backscatter System
- This paper presents a dynamic frame length adaptation (DFLA) scheme for anti-collision management in a Wi-Fi backscatter system, which reduces the collisions and access delay of Wi-Fi backscatter tags. The Q-algorithm defined in the EPCglobal Gen2 specification adjusts the frame length (i.e., the total number of slots in a frame) at the beginning of each frame, so that Wi-Fi backscatter tags have different access delays depending on the number of collisions. Thus, it is generally used for anti-collision management in Wi-Fi backscatter systems. However, the existing Q-algorithm may cause frequent collisions and long access delays, since it uses an initial frame of fixed length and adjusts the frame length without considering the number of Wi-Fi backscatter tags. To address this problem, DFLA dynamically determines the frame length at the beginning of each frame in consideration of the number of Wi-Fi backscatter tags. Specifically, DFLA estimates the throughput for every slot-count parameter (Q) in the range [0, 15], taking the number of backscatter tags into account. It then determines the frame length by choosing the Q that maximizes the throughput of the Wi-Fi backscatter system. The simulation results show that DFLA obtains better performance than the existing Q-algorithm.
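The Q-selection step can be illustrated with a standard slotted-ALOHA throughput model: with n tags each picking one of L = 2^Q slots uniformly at random, the expected fraction of slots carrying exactly one reply is (n/L)(1 - 1/L)^(n-1). This simplified model is an assumption for illustration, not necessarily DFLA's exact throughput estimator.

```python
def frame_throughput(n_tags, q):
    """Expected fraction of successful (single-reply) slots for a frame of
    2**q slots, under a slotted-ALOHA model where each tag transmits in
    one uniformly random slot."""
    L = 2 ** q
    return (n_tags / L) * (1 - 1 / L) ** (n_tags - 1)

def best_q(n_tags, q_range=range(16)):
    """Pick the slot-count parameter Q in [0, 15] maximizing throughput."""
    return max(q_range, key=lambda q: frame_throughput(n_tags, q))
```

As expected from ALOHA theory, the chosen frame length tracks the tag count: the optimum sits near L ≈ n.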
- Cyber Attack Scenarios on Smart City and Their Ripple Effects
- In recent years, nations have been establishing policies to construct smart cities and promote their operation at the national level, competing with one another through the support of the required resources. Smart cities are futuristic, state-of-the-art cities wherein all components of the urban infrastructure are inter-operated through networks using ICT (Information and Communication Technology) and the core technologies of the Fourth Industrial Revolution. Various kinds of on-site state information are collected and monitored by installing sensors, which are IoT (Internet of Things) devices, in various services, and on-site control devices are controlled remotely if needed. Smart cities face increasing cyber security threats compared to existing cities as advanced technologies are utilized, so the importance of cyber security and user privacy has increased. In this study, security threats against the smart city architecture are analyzed, and possible cyber attack scenarios in the construction of services and the operating environment are identified and analyzed. The ripple effects of each attack are also analyzed. The analysis results of this study can be utilized in future security technology research, development, and applications.
- A Study on IoT Device Authentication Protocol for High Speed and Lightweight
- Authentication of IoT devices is a very important step in providing secure IoT services. Accordingly, studies on authentication protocols suitable for low-power, low-performance IoT devices are being actively conducted. In this paper, we select the LEA-128-CTR and Chaskey algorithms to increase the high-speed SIMD-based parallelization effect of an IoT device authentication protocol, and propose a protocol that is both lightweight and fast.
- KSI Based Sensitive Data Integrity Validation Method for Precision Medicine System
- Interest in precision medicine in the medical field has been increasing since the United States' 'Precision Medicine Initiative' announced in 2015. Accordingly, methods for the secure and efficient provision of precision medicine services have been continuously studied. Recently, a survey by a consulting company showed that verifying data integrity during decision making is becoming increasingly important in the analysis of big-data-scale sensitive data, a key component of precision medicine. Therefore, in this paper, we study an integrity validation method for sensitive data by applying a signature technology called KSI in a precision medicine system.
- Optimized Power Consumption Model for Multiplication in Galois Field of AES
- Side-channel analysis (SCA) is an analysis method that extracts secret information using the power consumption, electromagnetic waves, sound, etc., that occur when an encryption algorithm operates on a device. In previous work, Lee et al. conducted SCA on the 8-bit AES-128 algorithm operating on the ChipWhisperer-Lite board. As a result, it was found that Correlation Power Analysis (CPA) using only the most significant bit (MSB) of an 8-bit intermediate value is better than CPA using the 8-bit Hamming weight of the value. In this paper, we demonstrate that CPA using only a specific bit has superior performance because of the modulo operation in the Galois field of the 8-bit AES-128 encryption algorithm. We performed CPA on the xtime operation part. As a result, we show that power consumption related to the MSB occurs while performing the modulo operation in the Galois field.
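The MSB-only CPA described above boils down to correlating a single predicted bit against measured power samples. A minimal sketch, with synthetic data rather than the ChipWhisperer setup, might look like this:

```python
def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

def cpa_msb(power_samples, intermediates):
    """Correlate the MSB of each 8-bit intermediate value against the
    corresponding power sample; a high |r| supports the key hypothesis
    that produced the intermediates."""
    msb = [(v >> 7) & 1 for v in intermediates]
    return pearson(msb, power_samples)
```

In a full attack this correlation is computed for every key hypothesis and every sample point, and the hypothesis with the largest peak is taken as the key byte.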
Monday, January 28 19:00 - 20:30
Reception Party & Service Award Ceremony [Sapphire]
Tuesday, January 29
Tuesday, January 29 9:00 - 10:00
Tuesday, January 29 10:00 - 11:20
4-A: Computing Platform & CRET 2019
- A Study on Electrocardiogram Based Biometrics Using Embedded Module
- Biometrics technology uses bio-signal data, which are unique to each person, as features for identification. Among bio-signals, electrocardiogram (ECG) signals, which are related to the heartbeat, can be used for personal identification as well as disease diagnosis, and also make it easier to miniaturize measuring devices compared to other bio-signals. In this paper, an ECG-based personal identification system using an embedded module is proposed. When an ECG signal is entered, the computer removes noise and segments the signal, after which the signals are transmitted to the embedded module. The embedded module extracts the morphological features of the ECG signal and classifies the ECG data. Experimental results showed that the segmented drive and the single drive exhibited equal results, and the equal error rate (EER) was lowest, at an average of 0.74%, when the number of verification data was 6. To shorten the operating time of the implemented personal identification system, three embedded module optimization methods were used, reducing the operating time by 66.1% and thereby confirming the potential of an identification system using ECG signals on small devices.
- Simple Shadow Detection Using Chromaticity and Minimum Cross Entropy
- Shadows are a common phenomenon in natural images and a negative factor in digital image analysis. Therefore, shadow detection is a very important problem in many kinds of image analysis, and it must be performed in the preprocessing stage. In this paper, we propose shadow candidate detection using chromaticity and entropy images. The chromaticity image can be calculated based on the RGB color model, and shadows are characterized by dark pixels in the image. The entropy image, based on minimum cross entropy, can be obtained using the optimal threshold value at the grayscale level. Experimental results show that the proposed method using the chromaticity and entropy images can effectively detect shadow candidates.
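The minimum cross entropy thresholding step can be sketched with Li's criterion: choose the threshold t minimizing -s1·log(mu1) - s2·log(mu2), where s1, s2 are the first moments of the histogram below and above t and mu1, mu2 are the corresponding mean gray levels. This is a generic sketch of the technique, not the authors' code.

```python
import math

def min_cross_entropy_threshold(hist):
    """Li's minimum cross-entropy threshold over a grayscale histogram.

    hist[g] is the pixel count at gray level g; meaningful gray levels are
    assumed to start above 0 so that the logs of the class means exist.
    """
    L = len(hist)
    best_t, best_eta = None, float("inf")
    for t in range(1, L):
        n1, n2 = sum(hist[:t]), sum(hist[t:])
        if n1 == 0 or n2 == 0:
            continue  # both classes must be non-empty
        s1 = sum(g * hist[g] for g in range(t))      # first moment below t
        s2 = sum(g * hist[g] for g in range(t, L))   # first moment above t
        mu1, mu2 = s1 / n1, s2 / n2
        if mu1 <= 0 or mu2 <= 0:
            continue
        eta = -s1 * math.log(mu1) - s2 * math.log(mu2)
        if eta < best_eta:
            best_eta, best_t = eta, t
    return best_t
```

Pixels darker than the returned threshold would then be kept as shadow candidates, to be refined with the chromaticity cue.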
- A Proposal of Iterative Consensus Process for Group Decision Making
- Consensus has always occupied a crucial position in human society. In particular, the meaning and importance of consensus in a democratic society is even more special, because such a society operates based on consensus among people. Until now, human society has reached consensus by majority vote, but it is clear that this has several problems. Meanwhile, much research has been carried out to develop appropriate methodologies and tools for reaching consensus. However, this research suggests that measurement to derive a result is difficult to generalize because the evaluation is one-time, and that the trust problem arising from delegating the consensus process to a third party, such as an information system, is not solved. In this paper, we propose an iterative consensus process that can solve the problems above, and provide a general description of this process. The proposed process draws consensus iteratively, visualizes the information generated from each iteration, presents it to the participants, and supports the participants' consensus through structured discussion. In addition, the trust problem that occurs when a consensus process is delegated to an information system can be solved by utilizing blockchain technology to convey and ensure the reliability of the result, and the proposed process can adopt it.
4-B: SCA 2019
- e-Mentoring-based Intelligent Healthcare Monitoring Platform
- In this paper, we propose a healthcare monitoring platform based on keyword analysis using software engineering techniques. We designed a system that presents intelligent biometric information to the clinical system and provides consultation between doctor and patient (mentor and mentee) as well as clinical learning between doctor and intern (mentor and mentee).
- A Case Study for Block Chain in Smart Grid
- Distributed and decentralized management systems are emerging innovative technologies that could change production, distribution, and consumption processes. Smart grid technology in particular is important for sustainable energy management. This paper studies the important issues for blockchain-based energy management.
- Augmented Reality Based on the Job Training Through Object Detection and Text Mining
- The purpose of this study is to propose a methodology for establishing an augmented reality-based business analysis model for on-the-job training through object detection and text analysis. The proposed methodology consists of text mining and object detection. In the text mining step, articles on hotel management published over the last decade are collected to extract causal relationships that are useful for on-the-job training and to save them in a rule base. For object detection, we detect various objects in real time through YOLO (You Only Look Once). The matched set of causalities is then displayed to trainees wearing AR devices. To the best of our knowledge, this method is the first to identify causal knowledge from text and link it to AR technology for on-the-job training. In addition, the proposed methodology reduces the development and maintenance costs an organization incurs in operating an on-the-job training program. The method allows trainees to become immersed in the training environment, which improves the effectiveness of on-the-job training.
Tuesday, January 29 11:20 - 13:30
Coffee Break & Lunch
Tuesday, January 29 13:30 - 14:20
Keynote Speech 2 (Emerald B (B2))
Recent advances in medical imaging technology have made possible medical simulations that are helpful for diagnosis, operation planning, and education. Improvements in medical imaging have led to the availability of high-definition images and three-dimensional (3D) visualization, allowing a better understanding in the medical and educational fields.
The real human field of view is stereoscopic. With only 2D images, a stereoscopic reconstruction in the surgeon's head is therefore necessary. To reduce this burden, 3D images have been used. 3D images enhance 3D visualization and allow surgeons to reach judgments significantly faster in complex situations.
Based on 3D image data sets, virtual medical simulations such as virtual endoscopy, medical planning, and real-time interaction have become possible. This talk describes principles and recent applications of newer medical imaging techniques, with special attention directed toward medical 3D reconstruction techniques.
Tuesday, January 29 14:20 - 14:40
Tuesday, January 29 14:40 - 16:00
5-A: Convergence Platform
- Using Blockchain to Enhance and Optimize IoT-based Intelligent Traffic System
- The smart city blueprint is being constructed through the increasing deployment of various Internet of Things devices. In this paper, we focus on data transmission and requests for lane property rights in the domain of an intelligent traffic system. IoT devices are usually deployed in a heterogeneous environment and are naturally suited to decentralization. In contrast to a cloud-based centralized architecture, we introduce blockchain as a decentralized technology that lets vehicles collaborate without going through a central authority. Data transmission between vehicles uses a peer-to-peer network in which every node communicates directly with every other node and is verified by the relevant end-point nodes; a lane's acquisition right is approved by all relevant vehicle nodes, with consensus agreed upon via smart contract.
- The Design of Graph-Based Privacy Protection Mechanisms for Mobile Systems
- In the domain of mobile privacy, there are many attack methods that can reveal a user's private information. An attacker can use communication between applications to violate permissions and access private information without the user's authorization. Therefore, many researchers focus on privilege escalation. However, an attacker can also increase their knowledge about the user without achieving privilege escalation, through various inference techniques. For this reason, we extend the concept of the privilege escalation attack to a more general information escalation attack, and propose a privacy protection mechanism based on a graph-based inference model.
- The Impact of PCA-Scale Improving GRU Performance for Intrusion Detection
- An Intrusion Detection System (IDS) is a device or software application that monitors a network or systems for malicious activity. Conventional IDSs do not detect elaborate cyber-attacks such as low-rate DoS attacks, nor unknown attacks. To overcome these deficiencies, advanced IDSs based on machine learning have attracted growing interest in recent years. In this paper, we propose a novel method to improve the intrusion detection accuracy of the Gated Recurrent Unit (GRU) by embedding the proposed PCA-Scale, with two options, PCA-Standardized and PCA-MinMax, into the GRU layer. Both options explicitly shape the learned feature maps by aligning them with the direction of maximum variance with positive covariance. The proposed method can be universally applied to GRU models with negligible additional computation cost. Experimental results on two real-world datasets, KDD Cup 99 and NSL-KDD, demonstrate that a GRU model trained with the PCA-Scale method achieves remarkable performance improvements.
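The abstract does not give the exact formulation of PCA-Scale, so the following is only an assumption about what PCA-Standardized and PCA-MinMax might look like as a preprocessing step ahead of the GRU (function name and interface are hypothetical):

```python
import numpy as np

def pca_scale(X, n_components, mode="standardized"):
    """Scale features, then project onto the top principal components.

    mode="standardized": z-score each feature first (PCA-Standardized).
    mode="minmax":       min-max scale each feature first (PCA-MinMax).
    """
    if mode == "standardized":
        Xs = (X - X.mean(axis=0)) / (X.std(axis=0) + 1e-12)
    elif mode == "minmax":
        span = X.max(axis=0) - X.min(axis=0)
        Xs = (X - X.min(axis=0)) / (span + 1e-12)
    else:
        raise ValueError(mode)
    # SVD of the centered data yields the directions of maximum variance
    Xc = Xs - Xs.mean(axis=0)
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:n_components].T  # PCA scores fed to the GRU layer
```

The projected scores would then replace the raw traffic features as input to the recurrent model.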
- Source Code Analysis for Static Prediction of Dynamic Memory Usage
- We study source code analysis techniques to statically predict how real programs work and use memory. If we can recognize memory-usage problems in the source code, we can prevent them and improve security during the software development phase. Existing problem-detection techniques can detect whether a program includes weak code listed in databases such as Common Vulnerabilities and Exposures (CVE) or the Common Weakness Enumeration (CWE). However, these methods are not useful for finding problems in programs that do not include open-source code, because they rely on hash values or patterns of weak code contained in open-source software. Therefore, we propose a static prediction technique for dynamic memory usage based on source code analysis, without relying on techniques such as similarity detection. We also present how to calculate the values used for static prediction from the source code.
5-B: CIA 2019
- Combining Multiple Load Forecasting Models Using MLR
- In this paper, we propose a two-stage combination scheme of multiple very short-term load forecasting (VSTLF) models for day-ahead energy scheduling of smart grids. To construct our forecasting model, we collected three years of 15-minute-interval electric load data from a science and engineering complex at a university campus and split them into a training set and a test set. In the first stage, we build four VSTLF models based on Support Vector Regression (SVR), Gradient Boosting Machine (GBM), Random Forest (RF), and Multilayer Perceptron (MLP) using the training set. In the second stage, we construct a Multiple Linear Regression-based forecasting model using the test set to compute the final value from the forecasts of the four models. To show the performance of our approach, we carried out several experiments on the actual electric load data and report some of the results.
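The second-stage combination step can be sketched with ordinary least squares (a minimal illustration; the paper's actual fitting procedure may differ, and the function names are assumptions):

```python
import numpy as np

def fit_combiner(base_preds, y):
    """Fit MLR weights (with intercept) that map the base-model
    forecasts (columns of base_preds) to the observed load y."""
    A = np.column_stack([np.ones(len(y)), base_preds])
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    return coef

def combine(base_preds, coef):
    """Final forecast: intercept plus weighted sum of base forecasts."""
    A = np.column_stack([np.ones(len(base_preds)), base_preds])
    return A @ coef
```

In the paper's setting, `base_preds` would hold the SVR, GBM, RF, and MLP forecasts on the held-out set, and the fitted weights would then combine the four models' day-ahead forecasts.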
- The Design of AR Emotional Messenger Using Smart Glasses
- We propose an emotional messenger based on smart glasses for augmented reality to overcome the limitations of existing instant messaging applications. Implementing AR instant messaging on smart glasses requires understanding how it differs from instant messaging on conventional smartphones. We propose input/output methods suitable for smart glasses and generate 3D virtual augmented reality environments so that users can express their personalities in 3D AR virtual space. In addition, ways to express emotion were constructed through emotional analysis so that users can effectively communicate their personality and intentions.
- Gait Correction System Based on Gait Cycle and Gait Angle
- Diseases with recently increasing incidences such as lumbago, scoliosis, and degenerative arthritis are affected by abnormal gait patterns. In this paper, we propose a gait correction system based on gait cycle and gait angle to improve abnormal gait patterns. The proposed system analyzes the gait pattern using the force sensing resistor (FSR) and inertial measurement unit (IMU) and provides corrective feedback on that basis. The FSR is used to estimate the gait cycle. The IMU is used to evaluate the gait pattern by measuring the gait angle on the stance phase.
- A Study on Ethical and Legal Discussion on Artificial Intelligence
- Nowadays, both legal and ethical issues concerning artificial intelligence are complicated and controversial, given the growing use of artificial intelligence products and services in everyday life. It is necessary to deal with legal and ethical issues such as data privacy, liability, intellectual property, information security, inequality, the end of labor, and human rights at a fundamental level. How should we judge who is liable for accidents caused by artificial intelligence technology? How can we manage a world in which robotics could infringe human rights? Some countries, such as the United States and EU member states, have started reviewing appropriate normative responses to future risk factors caused by AI; however, they are cautious about a legal approach toward AI (Park & Kim, 2017). This paper analyzes what legal and ethical approach we should take to settle these issues. To do this, the paper first discusses artificial intelligence, identifies some legal and ethical issues related to AI, and comparatively analyzes legal and ethical approaches toward AI in foreign countries. It then introduces current legislation and pending bills in Korea on robotics and argues that Korea needs to amend current legislation rather than enact new legislation.
Tuesday, January 29 16:00 - 16:20
Tuesday, January 29 16:20 - 17:40
6-A: Convergence Platform
- Big Data with Integrated Cloud Computing for Prediction of Health Conditions
- The increase in population will be accompanied by an increase of chronic diseases and medical conditions. This growing problem of disease will require efficient systems to reduce the burden on healthcare providers. In order to improve the life expectancy of patients with a chronic disease, this research will propose a big data analytics system using cloud-based technologies for patients suffering from chronic diseases. Continued innovation in the area of health IT will help providers to reduce inefficiencies and improve the health system's capabilities. By investing in technology tools like big data analytics, health providers have the opportunity to improve diagnosis and preventative care, save lives and lower costs by monitoring and managing patients more effectively in their homes. The proposed system works as a guide for lifestyle, and as a tool to support the decision-making. It is envisaged that this system will be able to gather large volumes of patient data and report on a patient's health status in real time.
- Android Application Risk Indicator Based on Feature Analysis Utilizing Machine Learning
- As the penetration rate of smart mobile devices has increased, threats targeting the Android platform, which accounts for the majority of mobile operating systems, have increased. At the end of 2017, a fake Korea Financial Supervisory Service application appeared; when users installed this application and called the Financial Supervisory Service, fake loan consultations resulted in financial loss and leakage of personal information. There have been a variety of malicious applications targeting mobile devices. As a result, it became necessary to detect the risks of such malicious applications and to support decisions about them. In this paper, we create a model to evaluate the risk of Android applications and define the characteristics of each element. In addition, the risk values from the model are used to build a risk map for decision making using unsupervised algorithms. To do this, we ran experiments with a total of 2,970 applications.
- Two-Factor Blockchain Using Watermarking as a Proof of Work
- This research presents a watermarking algorithm as proof of work for making blockchain processes more secure. Commonly, one blockchain would be sufficient to ensure the integrity of the sequence of media files generated, but it would not be enough to ensure the integrity of the media itself, so a second blockchain is needed to ensure the reliability of the watermarking process. In this paper, a secure watermarking method is proposed that can verify the process at each stage by utilizing a unique QR code logo as a watermark, without any extensive calculation. With the proposed reading-verification approach, the integrity of each block can be verified, thus also improving the reliability of the whole process in the blockchain.
- Study on Sony Smartphone Backup
- Smartphones contain a great deal of user privacy data that can serve as potential evidence. In order to obtain such evidence, forensic investigators must extract user data from the smartphone. Here, the backup function provided by each manufacturer is useful for acquiring important information such as users' private messages, call logs, system settings, application data, and other sensitive information. However, since the backup data is generally encrypted, it cannot be used as evidence immediately. Therefore, to utilize backup data as evidence, studies on the backup processes and backup data storage methods of each smartphone manufacturer are needed first. In this paper, we analyze the backup process of Sony smartphones.
6-B: SESIS 2019
- A Design of a Tax Prediction System Based on Artificial Neural Networks
- It is not easy to predict tax income accurately. In most cases, an expert manually predicts tax revenues for the next month based on heuristics. Although this method is simple and easy to use, it does not take into account the economic situation, the real estate market, GDP, and other factors; therefore, the prediction error is very large, making it difficult to use in practice. To solve this problem, this paper presents an auxiliary tax prediction system based on artificial neural networks. The system can help experts predict tax revenues efficiently.
- A Study on Smart City Security Policy Based on Blockchain in 5G Age
- The smart city aims for an intelligently connected society beyond the U-City concept of simple ICT. In particular, the rapid development of information and communication technologies such as 5G and blockchain has led to a paradigm shift toward a new digital urban environment. As smart energy and smart transportation become integrated or interlocked, technical and institutional arrangements must be in place from the time of user authentication to provide reliability, transparency, and efficiency. To this end, this study investigates technical and institutional plans for the implementation of safe urban life utilizing blockchain technology.
- Security Requirements for Cloud-based C4I Security Architecture
- With the development of cloud computing technology, developed countries including the U.S. are pushing for efficiency in national defense and the public sector, national innovation, and the construction of infrastructure for cloud computing environments through cloud adoption policies. The Korean military is also considering adopting cloud computing technology in its national defense command and control system. However, existing security requirements cannot address the security vulnerabilities of cloud computing. To solve this problem, it is necessary to design a secure security architecture for the national defense command and control system that considers cloud computing security requirements. In this paper, we analyze and propose the security requirements for a cloud-based national defense command and control system.
- Design Methodology of Web Services Choreography Language
- The rise of the Internet has made it a viable way to manage information resources, and the need to integrate services further requires companies to focus on B2Bi. In particular, web technologies that enable electronic document exchange among different information platforms are replacing existing educational infrastructure. This study aims to develop a design methodology for web services choreography languages based on web standards. Along with the methodology, we report practical usage scenarios and demonstrate how it can be used to develop a product prototype for domestic firms interested in educational models. The proposed methodology is expected to provide basic guidelines for developing web-based integration systems.
Tuesday, January 29 18:30 - 20:00
BQ: Conference Banquet (with Best Paper Award Ceremony) [Sapphire]
- Official Ceremony for PlatCon-19
- Buffet Dinner
- Best Paper Award Ceremony
- Raffle prizes
- Information on PlatCon-20
Wednesday, January 30
Wednesday, January 30 9:30 - 10:50
7-A: Interdisciplinary Session
- 3D Facial Modeling Using FaceWarehouse Database
- Facial modeling, a long-standing topic in computer graphics and computer vision, is the most basic implementation step in a variety of visual computing applications for facial image processing. This paper proposes a method for quickly generating an average facial model of neutral expressions using the FaceWarehouse database and reconstructing the neutral expression model of an arbitrary face by the Singular Value Decomposition (SVD) method. The FaceWarehouse database is a 3D facial expression database for visual computing applications. It contains facial data from 150 people of different ethnic backgrounds and ages, including expression data and blendshape data. Simulation results show that an arbitrary facial model can be obtained by changing the model parameters.
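The SVD-based reconstruction described above can be sketched as follows, under the assumption that each face scan is a flattened vertex vector (the function names and interface are hypothetical, not the paper's implementation):

```python
import numpy as np

def build_face_model(faces, k):
    """faces: (n_faces, n_coords) matrix of flattened neutral-face scans.
    Returns the average face and the top-k principal directions."""
    mean = faces.mean(axis=0)
    _, _, Vt = np.linalg.svd(faces - mean, full_matrices=False)
    return mean, Vt[:k]

def reconstruct(face, mean, basis):
    """Project an arbitrary face onto the model and rebuild it as
    mean + weighted sum of principal directions (the model parameters)."""
    params = basis @ (face - mean)
    return mean + basis.T @ params
```

Varying `params` directly corresponds to "changing the model parameters" in the abstract: each coefficient moves the reconstruction along one principal direction of facial variation.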
- Trends on Cyber Attacks and Threats of North Korea
- Amid the competition and cooperation of the international society surrounding cyberspace, each country has systematically established and promoted cybersecurity policies to cope effectively with cyber threats. Recently, cyber attacks and threats from North Korea have been increasing. This paper presents an analysis of the latest types of cyber threats from North Korea and, based on it, analyzes North Korea's cyber power. As North Korea becomes a major actor in cyber attacks and threats worldwide, analyzing the cyber capabilities it currently holds is very important in establishing cybersecurity policy in South Korea. Analysis of various cases shows that the latest North Korean cyber attacks exhibit several trends: attack and threat techniques are diversifying, cyber psychological warfare based on cyber attacks is intensifying, and the roles of systematic hacking organizations related to North Korea have become segmented and differentiated.
- An Intelligent Electricity Usage Management System for Energy Efficiency in Smart Home Environment
- Recently, a variety of electrical accidents have been caused by insufficient safety in electricity use and a lack of information. In addition, the variety of home appliances continues to increase in the smart home environment. Thus, a safe electricity usage management system applicable to such environments is needed. In this paper, we propose a novel system that can measure electricity usage in real time by attaching an adapter to a multi-tap (power strip). Users can conveniently monitor the entire home situation: the system measures dynamically changing electricity usage from various sensors connected to the multi-tap and transmits the information to the smartphone in real time. The system has the advantages of low-cost implementation and reliable data transmission to the smartphone.
- A Study on the Internalization of Sensor Technology
- This study analyzes the derivation of a national strategy for R&D of sensor technology and examines how technology internalization efforts through strategic R&D activities affect technical performance and, further, the national economy. Considering in-house R&D investment and joint research and development as technology internalization strategies, we examine the impact of each factor on patents, focusing on causality and ripple effects. It is necessary to establish systems that enable reinvestment of patent achievements in research and development activities and that enable the technology itself to enter the market smoothly, so that technical achievements can be linked to economic performance.
7-B: Interdisciplinary Session
- Digital Forensic Artifact Collection Technique Using Application Decompilation
- Nowadays, many applications tend to collect user profile data, such as location and usage traces, even when they are not malicious. This information can provide important clues in a criminal investigation. Thus, a technique is needed for extracting artifacts from applications using decompilation. We describe a method for selecting and analyzing forensic artifacts from Android applications, Android holding a share of over 80% of mobile devices. Based on static analysis, we propose a method for automatically collecting forensic artifacts. The effectiveness of the proposed idea is demonstrated by simulation.
- Digital Forensic Analysis Using Android Application Cache Data
- When we use an application on Android, cache data is generated. This data can be an important clue for an investigation and needs to be analyzed. However, current analysis is performed without understanding the generation principles of the various cache artifacts, such as their purpose and structure, and the data obtained through such simple analysis is used only indirectly in investigations because of this lack of information. Therefore, in this paper, we aim to improve the usability of caches by analyzing the type, meaning, and structure of the caches stored when an application is used, and by suggesting an analysis method for each cache.
- Digital Forensic Readiness for Financial Network
- Major bank hacking cases such as Carbanak and the Bangladesh Bank robbery are reported constantly. As a result of a global joint investigation, Carbanak's leader was caught in March 2018, but in the case of the Bangladesh Bank robbery, the damage has not been recovered to this day due to a lack of digital forensic evidence. As the Bangladesh Bank robbery shows, digital forensic evidence is essential for incident response. Therefore, in this paper we propose IP (Internet Protocol) traceback and visualization techniques for better digital forensics.
- Semantic Analysis of NIH Stroke Scale Using Machine Learning Techniques
- Stroke is a major disease leading to death and disability in adults and the elderly. Rapid detection of stroke is very difficult because the cause and course of onset differ for each individual. In this paper, we design and implement a system for semantic analysis of early detection of stroke and of stroke recurrence in Koreans over 65 years old, based on the National Institutes of Health (NIH) Stroke Scale. Using C4.5, a representative decision tree algorithm in machine learning, we conduct a semantic interpretation that analyzes and extracts the semantic rules of the execution mechanism additionally provided by C4.5. The C4.5 algorithm is used to construct a classification and prediction model using the information gain of the NIH Stroke Scale features, and to obtain additional feature reduction effects on the NIH Stroke Scale.
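The information-gain criterion that C4.5 uses to rank features such as the NIH Stroke Scale items can be sketched as follows (a minimal illustration for categorical features, not the full C4.5 algorithm):

```python
import math
from collections import Counter

def entropy(labels):
    """Shannon entropy (bits) of a class-label sequence."""
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

def information_gain(feature, labels):
    """Reduction in class entropy after partitioning on one feature."""
    n = len(labels)
    remainder = 0.0
    for v in set(feature):
        subset = [y for f, y in zip(feature, labels) if f == v]
        remainder += len(subset) / n * entropy(subset)
    return entropy(labels) - remainder
```

C4.5 actually ranks splits by the gain ratio (gain divided by the split's own entropy) to penalize many-valued features, and it grows the tree by choosing the best-scoring feature at each node; the gain computation above is the core of both.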