Accepted Papers

  • GPU-Based Acceleration For Fergus' Image Deblurring Algorithm
    Geethan Karunaratne, Udaranga Wickramasinghe and Jayathu Samarawickrama, University of Moratuwa, Sri Lanka.
    ABSTRACT
    Image deblurring algorithms have been evolving over many decades. Even though many algorithms available today produce reasonably good results, their speed of execution makes them unappealing for many real-time applications. We address this issue by improving the execution speed of an existing algorithm through parallelization and other optimizations. The algorithm we selected was published by Fergus et al. in 2006, for which a 10x speed-up was achieved. Many other authors have since followed this approach and produced different variants, to which our implementation could be readily extended to improve the run-time.
  • Feature-based Image Authentication using Symmetric Surround Saliency Mapping in Image Forensics
    Meenakshi Sundaram A, Reva Institute of Technology and Management, India
    ABSTRACT
    For efficient image security, image hashing is one solution for image authentication. A robust image hashing mechanism must withstand common image processing operations as well as geometric distortions. A good hashing technique must also ensure efficient detection of image forgery, such as insertion, deletion, or replacement of objects and malicious color tampering, and must locate the exact forged areas. This paper describes a novel image hash function generated from both global and local features of an image. The global features are Zernike moment representations of the luminance and chrominance components of the image as a whole. The local features include texture information as well as the positions of significant regions of the image. Secret keys can be introduced into the system, at stages such as feature extraction and hash formulation, to encrypt the hash. The resulting hash is very sensitive to abnormal image modifications and hence robust against splicing and copy-move image tampering, and can therefore be applied to image authentication. As in a generic system, the hashes of the reference and test images are compared by computing the Hamming (hash) distance; by applying thresholds to this distance, the received image can be declared authentic or non-authentic. Finally, the locations of forged regions and the type of forgery are found by decomposing the hashes. Compared to the most recent work in this area, our algorithm is simple and cost effective with better security.
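    The threshold-based comparison step described above can be sketched as follows. This is a minimal illustration of Hamming-distance authentication, not the paper's implementation; the function names and the threshold value are hypothetical.

```python
def hamming_distance(hash_a: bytes, hash_b: bytes) -> int:
    """Count the number of differing bits between two equal-length binary hashes."""
    return sum(bin(a ^ b).count("1") for a, b in zip(hash_a, hash_b))

def is_authentic(ref_hash: bytes, test_hash: bytes, threshold: int = 10) -> bool:
    """Declare the test image authentic if its hash lies within the distance threshold."""
    return hamming_distance(ref_hash, test_hash) <= threshold
```

    A small distance indicates benign processing (compression, mild filtering), while a distance above the threshold flags tampering; the forged regions would then be located by decomposing the hashes, as the abstract notes.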
  • Study of Factors Affecting Customer Behaviour Using Big Data Technology
    Prabin Sahoo, Narsee Monjee Institute of Management Studies, India
    ABSTRACT
    Big data technology has been gaining momentum recently: numerous articles, books, blogs and discussions address its various facets. The study in this paper focuses on big data as a concept, gives insights into the three Vs (Volume, Velocity and Variety), and demonstrates their significance with respect to the factors that can be processed using big data to study the behaviour of online customers.
  • Measuring The Effectiveness of Test Case Prioritization Techniques Based On Weight Factors
    Thillaikarasi Muthusamy and Seetharaman K, Annamalai University, India.
    ABSTRACT
    Test case prioritization schedules test cases in an order that increases the likelihood of achieving some performance target. The most important target is the rate at which faults are detected: test cases should run in an order that increases the chance of fault exposure and detects the most severe faults earliest in the testing life cycle. Test case prioritization techniques have proved advantageous for improving regression testing activities. While code-coverage-based prioritization has been studied by most scholars, cost-effective test case prioritization based on requirements has not yet been analyzed. Here we put forth a model for system-level test case prioritization from the software requirement specification, aiming to deliver quality software that satisfies users while remaining cost effective, thereby improving the rate of severe fault detection. The proposed model prioritizes the system test cases based on six factors: customer-assigned priority, developer-observed code implementation complexity, changes in requirements, fault impact of requirements, completeness, and traceability. The proposed prioritization technique is evaluated on two sets of industrial projects. The results show that it improves the rate of fault detection.
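    A weight-factor prioritization of this kind can be sketched as a weighted sum over the six factors named in the abstract. The factor values and weights below are purely illustrative assumptions, not taken from the paper.

```python
# The six factors from the abstract; values and weights are hypothetical.
FACTORS = ["customer_priority", "implementation_complexity",
           "requirement_changes", "fault_impact", "completeness", "traceability"]

def priority_score(factor_values: dict, weights: dict) -> float:
    """Weighted sum of a test case's factor values."""
    return sum(weights[f] * factor_values[f] for f in FACTORS)

def prioritize(test_cases: dict, weights: dict) -> list:
    """Return test case names ordered from highest to lowest priority score."""
    return sorted(test_cases,
                  key=lambda tc: priority_score(test_cases[tc], weights),
                  reverse=True)
```

    Running the highest-scoring cases first is what raises the rate of severe-fault detection early in the regression cycle.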
  • Feed Optimization System Based on Quality Filtering
    Nevatia, Alladi, Kanade and Panchamia, Sardar Patel Institute of Technology, India.
    ABSTRACT
    The phenomenal rate at which data is being generated on social networking websites demands effective organization and curation methods. The current approach leads to quality content being lost among popular content. This paper presents an algorithm that ranks content in an ecosystem according to quality and relevance, generating an organized and improved feed that evolves with changes in the data. To achieve content relevance, the algorithm considers parameters such as user interests, article category, the domain expertise of a user, the quality rating of an article, its future popularity, and user activities in the ecosystem. These attributes are normalized according to the extent of
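    The normalize-then-combine ranking described above can be sketched as follows. This is a hedged illustration only: the parameter names, min-max normalization, and equal averaging are assumptions, not the paper's actual formula.

```python
def normalize(values):
    """Min-max normalize a list of raw parameter values to [0, 1]."""
    lo, hi = min(values), max(values)
    if hi == lo:
        return [0.0] * len(values)
    return [(v - lo) / (hi - lo) for v in values]

def rank_feed(articles):
    """articles maps a name to [interest_match, author_expertise,
    quality_rating, predicted_popularity] (hypothetical parameters).
    Each parameter is normalized across the feed, then averaged into a score."""
    names = list(articles)
    cols = list(zip(*(articles[n] for n in names)))       # one column per parameter
    norm_cols = [normalize(list(c)) for c in cols]
    scores = {n: sum(col[i] for col in norm_cols) / len(norm_cols)
              for i, n in enumerate(names)}
    return sorted(names, key=scores.get, reverse=True)
```

    Re-running the ranking as new activity data arrives is what lets the feed evolve with changes in the ecosystem.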
  • Structuring and Functioning of an Entry-Level Artificial Intelligence System
    Anirudh Khanna, Chitkara University, India.
    ABSTRACT
    Artificial Intelligence is a very promising field of work and study these days. It involves the creation of "intelligent machines" that can simulate human (and sometimes animal) behaviour based on a few factors, such as their environment, the text/voice inputs given to them and, of course, the knowledge base fed to them by a programmer. This paper focuses on the working and structure of a simple beginner-level A.I. chatbot named F.U.T.U.R.E. (Feed-Utilising Talking, Understanding and Responding Entity). It is a very simply programmed chatbot, still in a raw, beginner-level form. The paper covers its design, the logic behind it, the programming concepts used, and finally the working program.
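    A beginner-level chatbot of this kind is often built as keyword rules over a programmer-supplied knowledge base. The sketch below is a toy illustration in that spirit; the rules are invented for the example and are not F.U.T.U.R.E.'s actual feed.

```python
# Hypothetical knowledge base: keyword -> canned reply.
RULES = {
    "hello": "Hi there! How can I help you?",
    "name": "I am a simple rule-based chatbot.",
    "bye": "Goodbye!",
}

def respond(user_input: str) -> str:
    """Return the reply for the first keyword found in the (lowercased) input."""
    text = user_input.lower()
    for keyword, reply in RULES.items():
        if keyword in text:
            return reply
    return "Sorry, I don't understand that yet."
```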
  • A Survey on Improving Performance of Real-Time Scheduling for Cloud Systems
    Rekha Kulkarni, PICT, Pune and Prof. Dr. Suhas H. Patil, Bharati Vidyapeeth University, India
    ABSTRACT
    Cloud computing provides a distributed computing environment with a pool of virtual, dynamically scalable, managed heterogeneous computing power and storage platforms. The computing power and storage are provided as services on demand to external users over the internet. Cloud computing aims to make computing a service whereby shared resources, software, and information are provided as a utility over a network to the users requesting them. One of the key technologies in a Cloud data center is resource scheduling, and one of its challenging problems is the allocation and migration of re-configurable virtual machines (VMs) while integrating the features of the hosting physical machines.
Copyright © ADCO 2014