SECURE IMAGE STORING AND SHARING IN CLOUD THROUGH DOUBLE STEP AUTHENTICATION

Dr. M. KANNAN 1, TAMILALAGAN. P 2
1 Professor & Head of the Department, 2 PG Student
Computer Science and Engineering
Mahendra Engineering College (Autonomous), Tamilnadu, India

ABSTRACT

Millions of private images are generated by various digital devices every day. The consequent massive computational workload makes people turn to cloud computing platforms for their economical computation resources. Meanwhile, privacy concerns arise over the sensitive information contained in outsourced image data. In fact, once an image is uploaded to the cloud, the security and privacy of its content rest entirely on the reliability of the cloud service provider. The lack of assured security and privacy guarantees has become the main barrier to further deployment of cloud-based image processing systems. Existing work has implemented various image processing tasks, including image feature detection, digital watermarking and content-based image search, and has investigated state-of-the-art techniques such as secure multiparty computation and homomorphic encryption. In the proposed system, data is encrypted and stored in the cloud with a watermarking technique, which is used to ensure the originality of the image, and a repository key is generated to give maximum security to user data. In addition, to utilize cloud storage efficiently, a compressor is used to compress the image before storing it in the cloud. Evaluation shows that the proposed system attains better performance than the existing system.

Keywords: Security and privacy, Cloud System & Processing System

1. INTRODUCTION

Motivated by the rapid growth of image processing and data mining techniques, more and more image processing based applications are deployed on various end-users' devices, for example content-based image search, digital watermark verification and so on. The consequent massive image processing tasks bring enormous computation overhead to data owners.
To solve this problem, more and more users are outsourcing these "expensive" tasks to cloud computing platforms. On such platforms, the Cloud Service Provider (CSP) offers a pay-per-use business model, which enables an individual user to use robust computation power in the cloud while saving the time and cost of setting up the corresponding infrastructure. In fact, not only individual or small-business data owners turn to the cloud; Internet giants like Microsoft and Yahoo are also attracted by the benefits of cloud computing and authorize some services to third-party cloud computing platforms. For example, several types of data searching tasks in Microsoft Bing have been outsourced to Wolfram. However, the participation of a third-party cloud computing platform also increases the vulnerability of private data, e.g., potential data breach and loss. Under current cloud architecture, the content of outsourced image data will inevitably be leaked to CSPs. In this case, the leaked content might be sensitive information like the data owner's personal identity, home address, or even financial records. Moreover, even if we assume CSPs are completely honest and can be trusted with data owners' private information, such privacy leakage still happens. In fact, a cloud server is usually considered a low-qualified locker rather than a strong bank deposit box. Compared with a traditional network server, the cloud computing platform suffers from more security threats. For instance, a severe vulnerability in cloud servers is the sharing of computing resources: flaws in System Virtual Machine (SVM) software have frequently been discovered and exploited to attack cloud servers in recent years. Furthermore, private data leakage in the public cloud happens very often due to improper configuration and maintenance by CSPs.
In a nutshell, the privacy concern over outsourced data has become the main barrier to the further development of cloud computing platforms. In recent years, secure image data processing has become a rapidly growing research field and has attracted attention from both academia and industry. In practice, many sophisticated image processing applications require computational power beyond the limits of a mobile device. For example, 3D structure reconstruction needs massive computational power for image feature detection and matching. In this area, the main research direction lies in the detection of image features over the ciphertext domain, where many encryption techniques are applied or adjusted to protect image data privacy while still enabling visual feature extraction. A global image feature detection mechanism for color-histogram-based descriptor detection has been proposed, in which the authors utilize a Somewhat Homomorphic Encryption (SHE) scheme to enable the computation of diverse color descriptors in the MPEG-7 standard over the ciphertext domain. These features are further utilized as basic building blocks for services such as image matching and semantic tag generation. A local feature detection mechanism for the Scale-Invariant Feature Transform (SIFT) has also been proposed, which utilizes the Paillier encryption scheme to enable the computation of SIFT features over the ciphertext domain; the authors additionally analyze different scaling ratios by adjusting fixed-point numbers in the proposed scheme. However, all these works suffer from the high computational complexity brought by homomorphic operations, especially those performing relatively complicated algorithms like SIFT. One line of work addresses this problem by utilizing a multi-server structure to enable the SIFT algorithm over encrypted data. In addition, another thriving research direction is secure digital watermarking, which enables outsourcing the time-consuming task of generating digital watermarks without compromising the privacy of the image content.
Two types of approaches have been proposed: asymmetric watermarking and zero-knowledge watermark detection. However, most existing works still suffer from high computational complexity on both the user and cloud side. Moreover, as an orthogonal research direction, secure image retrieval mechanisms have been proposed, which enable applications such as location-based detection and offer flexible approaches to manage private image datasets online. The features extracted from images are encrypted in a distance-preserving scheme to enable direct comparisons for similarity evaluation, and the image search indices are encrypted while still achieving efficient search functionality. However, in a practical privacy-preserving computation scenario, it is very difficult for existing works to achieve the security requirements and practical efficiency at the same time. The contributions of this paper are summarized as follows: we introduce and formulate diverse image processing tasks in a general image computation outsourcing model, including image feature detection, digital watermarking and content-based image search; we discuss state-of-the-art techniques, including secure multiparty computation and homomorphic encryption; and we provide a detailed taxonomy of the problem statement and the corresponding solutions.

2. OVERVIEW OF DATA STORAGE IN CLOUD COMPUTING:

Cloud computing is a functional paradigm that is evolving and making IT utilization easier by the day for consumers. Cloud computing offers standardized applications to users online, in a manner that can be accessed regularly. Such applications can be accessed by as many persons as permitted within an organisation without worrying about the maintenance of the application. The Cloud also provides a channel to design and deploy user applications, including their storage space and database, without worrying about the underlying operating system.
The application can run without consideration for on-premise infrastructure. The Cloud also makes massive storage available for both data and databases. Storage of data on the Cloud is one of the core activities in Cloud computing. Storage utilizes infrastructure spread across several geographical locations, and makes use of the Internet, virtualization, encryption and other technologies to ensure the security of data. This paper presents the state of the art from the literature available on Cloud storage. The study was executed by means of a review of the literature available on Cloud storage. It examines present trends in the area of Cloud storage and provides a guide for future research. The objective of this paper is to answer the question: what are the current trends and developments in Cloud storage? The expected result of this review is the identification of trends in Cloud storage, which can be beneficial to prospective Cloud researchers, users and even providers. Cloud computing is defined by [1] as a parallel and distributed computing system consisting of a pool of interconnected and virtualized computers that are dynamically provisioned and presented as a single computing resource to users based on pre-agreed Service Level Agreements (SLAs). It enables users to remotely run their applications as well as store data, with the benefit of an on-demand and highly available service, without the burden of local hardware and software management. With Cloud storage, data is stored on multiple third-party servers rather than on the dedicated servers used in traditional networked data storage. Third-party service providers are entrusted with users' data, and for security purposes the exact storage locations of these data are unknown to most people. Cloud computing is positively impacting the IT landscape through the Internet, as it enables users to pay on a per-service-usage basis.
User concerns are thus shifted from acquisition and maintenance to utilization of the facilities made available by Cloud service providers. Cloud computing is about moving services, computation or data offsite, for cost and business advantages, to an internal or external, location-transparent, centralized facility or contractor [3]. Cloud computing has characteristics that include resource pooling and multi-tenancy [2]. There are three basic service types in Cloud computing: Software-as-a-Service (SaaS), where applications are made available by Cloud Service Providers (CSPs) over the Internet to Cloud users; Platform-as-a-Service (PaaS), where the CSP offers Cloud users platforms for the development and deployment of their own applications; and Infrastructure-as-a-Service (IaaS), where the CSP offers compute, storage, network and other computing resources to Cloud users. IaaS users have control over the operating system and the applications running on it, while the provider manages the hardware infrastructure. These services are all made available to users anytime and from any location via the web. Cloud computing also has four modes of deployment: the private Cloud, public Cloud, community Cloud and hybrid Cloud. The private Cloud is owned and controlled by an individual organization; its facilities can be on-premise or off-premise. A private Cloud allows for a more secure environment due to internal staff utilization. The public Cloud is owned and managed by major CSPs. These providers own large data centres, sometimes spread across different geographical locations, and provide various services that free the customer from expensive infrastructure procurement. Community Clouds belong to several organizations that come together based on a shared common interest; a community Cloud may be managed by the community or by a third party. A hybrid Cloud is a combination of private, public or community Clouds.
The Clouds in a hybrid deployment share the same infrastructure, but the organizations remain distinct. A major component of Cloud computing is storage. Storage could be for an enterprise database or for simple storage of data, similar to storing information on a local hard drive. In Cloud storage, data is stored with multiple third-party services rather than on the dedicated servers used in traditional networked data storage [4]. When storing data, the customer "sees" a virtual server, so it appears that the data is stored in a particular place with a specific name, but such a place does not exist in reality; it is just a pseudonym used to reference a virtual space carved out of the Cloud. The users' data could be stored on any computer in any data centre across several geographical locations, and the data's actual storage location may differ from time to time as the Cloud dynamically manages the available storage locations around its data centres. Although the data location is virtual, the user sees a static location for the data and can actually manage the storage space as if on a personal computer [4]. A typical Cloud storage architecture includes a master control server and several storage servers, as depicted in Fig. 1. At the most basic level, a Cloud storage system needs just one data server connected to the Internet: a user sends copies of files over the Internet to the data server, which then records the information. In a Cloud data storage system, users store their data in the Cloud and no longer have complete control over it, as they would if the data resided on a local computer. Hence, the correctness, security and availability of data stored on the Cloud server must always be guaranteed. The purpose of this paper, therefore, is to examine Cloud computing data storage. The paper discusses the Cloud storage architecture and the various challenges facing Cloud storage. Thereafter, the events relating to data storage on the Cloud are highlighted.

3. CLOUD COMPUTING SERVICES:

Fig 1: Cloud Computing Services

Infrastructure-As-A-Service:
Infrastructure as a Service (IaaS) is a provision model in which an organization outsources the equipment used to support operations, including storage, hardware, servers and networking components. The service provider owns the equipment and is responsible for housing, running and maintaining it; the client typically pays on a per-use basis. Characteristics and components of IaaS include: 1. Utility computing service and billing model. 2. Automation of administrative tasks. 3. Dynamic scaling. 4. Desktop virtualization. 5. Policy-based services. 6. Internet connectivity. Infrastructure-as-a-Service offerings like Amazon Web Services provide virtual server instances with unique IP addresses and blocks of storage on demand. Customers use the provider's application program interface (API) to start, stop, access and configure their virtual servers and storage. In the enterprise, cloud computing allows a company to pay for only as much capacity as is needed, and to bring more online as soon as it is required. Because this pay-for-what-you-use model resembles the way electricity, fuel and water are consumed, it is sometimes referred to as utility computing. Infrastructure as a Service is also sometimes referred to as Hardware as a Service (HaaS).

Platform-As-A-Service:
Platform as a Service (PaaS) is a way to rent hardware, operating systems, storage and network capacity over the Internet. The service delivery model allows the customer to rent virtualized servers and associated services for running existing applications or for developing and testing new ones. Platform as a Service is an outgrowth of Software as a Service (SaaS), a software distribution model in which hosted software applications are made available to customers over the Internet. PaaS has several advantages for developers. With PaaS, operating system features can be changed and upgraded frequently.
Geographically distributed development teams can work together on software development projects. Services can be obtained from diverse sources that cross international boundaries. Initial and ongoing costs can be reduced by using infrastructure services from a single vendor, rather than maintaining multiple hardware facilities that often perform duplicate functions or suffer from incompatibility problems. Overall expenses can also be minimized by unification of programming development efforts. On the downside, PaaS involves some risk of "lock-in" if offerings require proprietary service interfaces or development languages. Another potential pitfall is that the flexibility of offerings may not meet the needs of users whose requirements rapidly evolve.

Software-As-A-Service:
Software as a Service, sometimes referred to as "software on demand," is software that is deployed over the Internet and/or deployed to run behind a firewall on a local area network or personal computer. With SaaS, a provider licenses an application to customers either as a service on demand, through a subscription, in a "pay-as-you-go" model, or at no charge. This approach to application delivery is part of the utility computing model, where all of the technology is in the "cloud", accessed over the Internet as a service. SaaS was initially widely deployed for sales force automation and Customer Relationship Management (CRM). It has now become commonplace for many business tasks, including computerized billing, invoicing, human resource management, financials, content management, collaboration, document management, and service desk management.

4. CLOUD COMPUTING SECURITY ISSUES:

In the last few years, cloud computing has grown from a promising business concept to one of the fastest growing segments of the IT industry.
Now, recession-hit companies are increasingly realizing that simply by tapping into the cloud they can gain fast access to best-of-breed business applications or drastically boost their infrastructure resources, all at negligible cost. But as more and more information on individuals and companies is placed in the cloud, concerns are beginning to grow about just how safe an environment it is.

Security: Where is your data more secure, on your local hard drive or on high-security servers in the cloud? Some argue that customer data is more secure when managed internally, while others argue that cloud providers have a strong incentive to maintain trust and as such employ a higher level of security. However, in the cloud, your data will be distributed over many individual computers regardless of where your base repository of data is ultimately stored. Industrious hackers can invade virtually any server, and statistics show that one-third of breaches result from stolen or lost laptops and other devices and from employees accidentally exposing data on the Internet, with nearly 16 percent due to insider theft.

Privacy: Unlike the traditional computing model, cloud computing utilizes virtual computing technology, so users' personal data may be scattered across various virtual data centers rather than staying in the same physical location, even across national borders; as a result, data privacy protection faces the controversy of different legal systems. On the other hand, users may leak hidden information when accessing cloud computing services, and attackers can analyze critical information based on the computing tasks submitted by the users.

Reliability: Servers in the cloud have the same problems as your own resident servers. Cloud servers also experience downtimes and slowdowns; the difference is that users have a higher dependence on the cloud service provider (CSP) in the cloud computing model.
There are also big differences between CSPs' service models; once you select a particular CSP, you may be locked in, which brings a potential business security risk.

Legal Issues: Regardless of efforts to harmonize the legal situation, as of 2009 providers such as Amazon Web Services serve major markets by deploying localized infrastructure and allowing users to choose "availability zones". On the other hand, concerns about security and confidentiality persist, from the individual level all the way up to the legislative level.

Open Standards: Open standards are critical to the growth of cloud computing. Most cloud providers expose APIs that are typically well-documented but also unique to their implementation and thus not interoperable. Some vendors have adopted others' APIs, and there are a number of open standards under development, including the OGF's Open Cloud Computing Interface. The Open Cloud Consortium (OCC) is working to develop consensus on early cloud computing standards and practices.

Compliance: Numerous regulations pertaining to the storage and use of data require regular reporting and audit trails, and cloud providers must enable their customers to comply with these regulations appropriately. Managing Compliance and Security for Cloud Computing provides insight into how a top-down view of all IT resources within a cloud-based location can deliver stronger management and enforcement of compliance policies. In addition to the requirements to which customers are subject, the data centres maintained by cloud providers may also be subject to compliance requirements.

Freedom: Cloud computing does not allow users to physically possess the storage of their data, leaving data storage and control in the hands of cloud providers.
Customers will contend that this is pretty fundamental: retaining their own copies of data, in a form that preserves their freedom of choice, protects them against certain issues outside their control whilst still realizing the tremendous benefits cloud computing can bring.

Long-term Viability: You should be sure that the data you put into the cloud will never become invalid, even if your cloud computing provider goes broke or gets acquired and swallowed up by a larger company. Ask potential providers how you would get your data back, and whether it would be in a format that you could import into a replacement application.

IES-CBIR Design and Implementation:
The main component on the users' side leverages a novel cryptographic scheme specifically designed for images and privacy-preserving CBIR, dubbed IES-CBIR. Before describing IES-CBIR in detail, we give a definition of image privacy that underlies our work. An Image Encryption Scheme with CBIR properties is a tuple (GENRK, GENIK, ENC, DEC, TRPGEN) of five polynomial-time algorithms run by a user, where:
GENRK(sprk): a probabilistic algorithm that takes as input the security parameter sprk ∈ N and generates a repository key rk;
GENIK(spik): a probabilistic algorithm that takes as input the security parameter spik ∈ N and generates an image key ik;
ENC(I, rk, ik): takes as input an image I and the cryptographic keys {rk, ik}, returning an encrypted image CI;
DEC(CI, rk, ik): takes as input an encrypted image CI and the keys {rk, ik}, returning the decrypted image I;
TRPGEN(Q, rk): takes as input a query image Q and the repository key rk, returning a searching trapdoor CQ.

Key Generation:
IES-CBIR works with two distinct kinds of cryptographic keys, repository keys (rk) and image keys (ik), which are generated by the GENRK and GENIK algorithms respectively. Repository keys deterministically map a pixel's color value in a color channel to some new random value.
To prevent images from increasing in size after encryption (i.e., to prevent ciphertext expansion), encrypted pixels should be in the same range of values as their original plaintexts (usually 8 bits per color channel). As such, we build repository keys in IES-CBIR by performing random permutations of all possible pixel color values in each color channel. Leveraging the HSV color space ((H) hue, (S) saturation, (V) value/brightness), we perform three independent random permutations of the values in the range [0..100]. This range represents all possible color values in the HSV color space, and each permutation is used for a different color channel, resulting in 3 repository sub-keys: rkH, rkS, rkV. Permutations are performed by a parameterized Pseudo-Random Generator (PRG) G [21].

5. LITERATURE REVIEW

5.1 Homomorphic encryption-based secure SIFT for privacy-preserving feature extraction:
Privacy has received much attention but is still largely ignored in the multimedia community. Consider a cloud computing scenario where the server is resource-abundant and capable of finishing the designated tasks: it is envisioned that secure media retrieval and search with privacy preservation will be treated seriously. In view of the fact that the scale-invariant feature transform (SIFT) has been widely adopted in various fields, this paper is the first to address the problem of secure SIFT feature extraction and representation in the encrypted domain. Since all the operations in SIFT must be moved to the encrypted domain, we propose a homomorphic encryption-based secure SIFT method for privacy-preserving feature extraction and representation based on the Paillier cryptosystem. In particular, homomorphic comparison is a must for SIFT feature detection but is still a challenging issue for homomorphic encryption methods. To conquer this problem, we investigate a quantization-like secure comparison strategy in this paper.
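The additive homomorphism that such Paillier-based approaches rely on can be illustrated with a minimal textbook sketch (this is an illustrative toy, not the cited paper's implementation; the class name PaillierDemo, the 256-bit primes, and the choice g = n + 1 are all assumptions made for brevity). Multiplying two ciphertexts yields a ciphertext of the sum of the two plaintexts:

```java
import java.math.BigInteger;
import java.security.SecureRandom;

public class PaillierDemo {
    // Textbook Paillier: n = p*q, g = n+1, Enc(m) = g^m * r^n mod n^2.
    static final SecureRandom RND = new SecureRandom();

    public static BigInteger demo() {
        BigInteger p = BigInteger.probablePrime(256, RND);
        BigInteger q = BigInteger.probablePrime(256, RND);
        BigInteger n = p.multiply(q);
        BigInteger nsq = n.multiply(n);
        BigInteger g = n.add(BigInteger.ONE);
        BigInteger pm1 = p.subtract(BigInteger.ONE);
        BigInteger qm1 = q.subtract(BigInteger.ONE);
        BigInteger lambda = pm1.multiply(qm1).divide(pm1.gcd(qm1)); // lcm(p-1, q-1)
        BigInteger mu = lambda.modInverse(n);                       // valid because g = n+1

        BigInteger c1 = encrypt(BigInteger.valueOf(120), n, nsq, g);
        BigInteger c2 = encrypt(BigInteger.valueOf(37), n, nsq, g);
        // Additive homomorphism: multiplying ciphertexts adds the plaintexts.
        BigInteger cSum = c1.multiply(c2).mod(nsq);
        return decrypt(cSum, n, nsq, lambda, mu); // decrypts to 120 + 37 = 157
    }

    static BigInteger encrypt(BigInteger m, BigInteger n, BigInteger nsq, BigInteger g) {
        BigInteger r; // random blinding factor in Z*_n
        do {
            r = new BigInteger(n.bitLength(), RND).mod(n);
        } while (r.signum() == 0 || !r.gcd(n).equals(BigInteger.ONE));
        return g.modPow(m, nsq).multiply(r.modPow(n, nsq)).mod(nsq);
    }

    static BigInteger decrypt(BigInteger c, BigInteger n, BigInteger nsq, BigInteger lambda, BigInteger mu) {
        BigInteger u = c.modPow(lambda, nsq); // L(u) = (u - 1) / n
        return u.subtract(BigInteger.ONE).divide(n).multiply(mu).mod(n);
    }

    public static void main(String[] args) {
        System.out.println(demo()); // prints 157
    }
}
```

Note that the cloud never sees 120, 37 or 157, only the ciphertexts it multiplies; this is exactly the property that lets additions in SIFT-style pipelines be delegated, and also why comparisons (which are not homomorphic here) need the quantization-like strategy described above.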
Experimental results demonstrate that the proposed homomorphic encryption-based SIFT performs comparably to the original SIFT on image benchmarks, while additionally preserving privacy. We believe that this work is an important step toward privacy-preserving multimedia retrieval in environments where privacy is a major concern. SIFT is an algorithm for detecting and describing local features in images and has been widely used [8-11] due to its powerful attack-resilient feature point detection mechanism. In this paper, we focus on presenting a homomorphic encryption-based secure SIFT method for privacy-preserving feature extraction and representation. This core technology will find many applications, including media retrieval, (near-)duplicate detection, and so on. In particular, both the query and the database are permitted to be encrypted to guarantee privacy preservation.

5.2 Towards efficient privacy-preserving image feature extraction in cloud computing:
As the image data produced by individuals and enterprises rapidly increases, the Scale-Invariant Feature Transform (SIFT), as a local feature detection algorithm, has been heavily employed in various areas, including object recognition, robotic mapping, etc. In this context, there is a growing need to outsource such image computation, with its high complexity, to the cloud for its economical computing resources and on-demand ubiquitous access. However, how to protect private image data while enabling image computation becomes a major concern. To address this fundamental challenge, we study the privacy requirements of outsourcing SIFT computation and propose SecSIFT, a high-performance privacy-preserving SIFT feature detection system. In previous private image computation works, one common approach is to encrypt the private image with a public-key homomorphic scheme that enables the original processing algorithms designed for the plaintext domain to be performed over the ciphertext domain.
In contrast to these works, our system is not restricted by the efficiency limitations of homomorphic encryption schemes. The proposed system distributes the computation procedures of SIFT to a set of independent, cooperative cloud servers, and keeps the outsourced computation procedures as simple as possible to avoid utilizing a homomorphic encryption scheme. Thus, it enables an implementation with practical computation and communication complexity. Extensive experimental results demonstrate that SecSIFT performs comparably to the original SIFT on image benchmarks while preserving privacy in an efficient way.

5.3 A review of intrusion detection techniques in cloud:
In this paper, we review different intrusions affecting the availability, confidentiality and integrity of Cloud resources and services. Proposals incorporating Intrusion Detection Systems (IDS) and Intrusion Prevention Systems (IPS) in the Cloud are examined. We recommend IDS/IPS positioning in the Cloud environment to achieve the desired security in next generation networks. Due to the availability of vast sets of historical data regarding various types of normal as well as attack requests from network users, machine learning algorithms have come into use in the field of intrusion detection. Several machine learning techniques, such as pattern recognition, anomaly detection, clustering techniques and support vector machines, can be applied to these data to determine whether a specific request is legitimate or not.

5.4 A compressive sensing based secure watermark detection and privacy preserving storage framework:
Privacy is a critical issue when data owners outsource data storage or processing to a third-party computing service, such as the cloud. In this paper, we identify a cloud computing application scenario that requires simultaneously performing secure watermark detection and privacy-preserving multimedia data storage.
We then propose a compressive sensing (CS)-based framework using secure multiparty computation (MPC) protocols to address such a requirement. In our framework, the multimedia data and the secret watermark pattern are presented to the cloud for secure watermark detection in a CS domain to protect privacy. During CS transformation, the privacy of the CS matrix and the watermark pattern is protected by the MPC protocols under the semi-honest security model. We derive the expected watermark detection performance in the CS domain, given the target image, the watermark pattern, and the size of the CS matrix (but without the CS matrix itself). The correctness of the derived performance has been validated by our experiments. Our theoretical analysis and experimental results show that secure watermark detection in the CS domain is feasible. Our framework can also be extended to other collaborative secure signal processing and data-mining applications in the cloud.

5.5 Security protection between users and the mobile media cloud:
Mobile devices such as smartphones are widely deployed in the world, and many people use them to download/upload media such as video and pictures to remote servers. On the other hand, a mobile device has limited resources, and some media processing tasks must be migrated to the media cloud for further processing. However, a significant question is: can mobile users trust the media services provided by the media cloud service providers? Many traditional security approaches have been proposed to secure the data exchange between mobile users and the media cloud. However, first, because multimedia such as video is large-sized data and mobile devices have limited capability to process media data, it is important to design a lightweight security method; second, uploading and downloading multi-resolution images/videos makes it difficult for traditional security methods to ensure security for users of the media cloud.
Third, the error-prone wireless environment can cause failures of security protection such as authentication. To address the above challenges, in this article we propose to use both secure sharing and watermarking schemes to protect users' data in the media cloud. The secure sharing scheme allows users to upload multiple data pieces to different clouds, making it impossible to derive the whole information from any one cloud. In addition, the proposed scalable watermarking algorithm can be used for authentication between personal mobile users and the media cloud. Furthermore, we introduce a new solution to resist multimedia transmission errors through a joint design of watermarking and Reed-Solomon codes. Our studies show that the proposed approach not only achieves good security performance, but also enhances media quality and reduces transmission overhead.

6. SYSTEM ANALYSIS

6.1 EXISTING SYSTEM:
This system consists of two main entities: the Cloud Computing Platform (CCP) and the user. The user is a data owner who holds massive image data and intends to outsource image processing tasks to the CCP. Under this setting, a user utilizes the CCP as a complementary resource for its limited computational power and outsources complicated image processing tasks to the CCP, while also needing to protect the privacy of the data. Meanwhile, the entity CCP is composed of a set of cloud servers. It is assumed to be honest-but-curious: it can only access the encrypted image data uploaded by users and perform the corresponding image processing algorithms over the ciphertext domain. After that, the CCP returns the requested results in ciphertext form back to the user. Finally, the user uses its private key to decrypt the returned results.

6.2 PROPOSED SYSTEM:
A new secure framework for privacy-preserving outsourced storage, search, and retrieval of large-scale, dynamically updated image repositories has been proposed.
In our proposed system, watermarking is first performed by embedding a watermark word into the image, which is then uploaded to the cloud. During upload, attribute-based encryption is used to encrypt the particular file, so the corresponding key is required for a user to access that file. Because the server is honest-but-curious, this alone provides limited security and can lead to leakage of information; in the event of a key attack, an attacker could retrieve the data. Therefore, watermarking is applied to ensure the originality of the data, and, to provide a higher level of security, a repository key is generated through the AES algorithm. An authorized user who needs the original image can retrieve it by obtaining proper authentication from the owner of the file and acquiring the repository key. Compared with other existing approaches, our proposed framework attains security and provides enhanced scalability, better performance, and lower bandwidth consumption, allowing client applications to be increasingly lightweight and efficient.
6.4 INSTANCE METHODS:
The InetAddress class also has several other methods, which can be used on the objects returned by the methods just discussed. Here are some of the most commonly used:
boolean equals(Object other) - Returns true if this object has the same Internet address as other.
byte[] getAddress() - Returns a byte array that represents the object's Internet address in network byte order.
String getHostAddress() - Returns a string that represents the host address associated with the InetAddress object.
String getHostName() - Returns a string that represents the host name associated with the InetAddress object.
boolean isMulticastAddress() - Returns true if this Internet address is a multicast address; otherwise, it returns false.
String toString() - Returns a string that lists the host name and the IP address for convenience.
Internet addresses are looked up in a series of hierarchically cached servers.
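Before turning to how those lookups resolve, the instance methods above can be exercised with a short, self-contained example; the loopback address is used here so that no actual DNS query is needed:

```java
import java.net.InetAddress;

public class InetAddressDemo {
    public static void main(String[] args) throws Exception {
        // The loopback literal avoids a real DNS lookup.
        InetAddress addr = InetAddress.getByName("127.0.0.1");

        System.out.println(addr.getHostAddress());      // the textual IP
        System.out.println(addr.isMulticastAddress());  // loopback is not multicast

        // getAddress() returns the raw bytes in network byte order;
        // an IPv4 address is four bytes long.
        byte[] raw = addr.getAddress();
        System.out.println(raw.length);

        // equals() compares the underlying Internet addresses.
        InetAddress same = InetAddress.getByName("127.0.0.1");
        System.out.println(addr.equals(same));
    }
}
```

getHostName() is omitted from the printed output because its result depends on the local resolver configuration.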
That means that your local computer might know a particular name-to-IP-address mapping automatically, such as for itself and nearby servers. For other names, it may ask a local DNS server for IP address information. If that server doesn't have a particular address, it can go to a remote site and ask for it. This can continue all the way up to the root server, called InterNIC.
6.5 INTRODUCTION TO BACK END:
6.5.1 MICROSOFT SQL SERVER 2005
Microsoft SQL Server 2005 provides a scalable database that combines ease of use with complex analysis and data warehousing tools. SQL Server includes a rich graphical user interface (UI) along with a complete development environment for creating data-driven applications. Commerce Server takes advantage of SQL Server data warehousing and analysis capabilities in several key areas. The Commerce Server data warehouse, for example, uses SQL Server Data Transformation Services (DTS) to transform data stored in a SQL Server database to the format used by Commerce Server resources.
6.5.2 WINDOWS 2005 AND SQL SERVER SECURITY:
Existing Microsoft Windows 2005 accounts (Active Directory users or groups) must be granted permissions to connect to Microsoft SQL Server before they can access a database. If all members of a Windows group require connections to SQL Server, you can grant login permissions for the group as a whole. Managing group permissions is easier than managing permissions for individual users. If you do not want a group to be granted permissions collectively, you can grant permissions to connect to SQL Server for each individual user.
6.5.3 ACTIVE DIRECTORY USERS AND GROUPS:
In Windows 2005, users are individuals who have an account that provides specific privileges to access information and resources. Granting permission to users to develop, manage, and use workflow applications depends upon the integration of Windows 2005 domain accounts and SQL Server roles.
If a number of users all have the same permissions, they can be treated as a single unit, called a group, which can be assigned permissions that apply to all members of the group. Individuals can be added to or removed from groups as desired. There are two types of Windows groups: global and local.
Global groups contain user accounts from the Windows 2005 Server domain in which they are created, and can be used on a computer running Windows 2005 Professional.
Local groups can contain user accounts and global groups from the domain in which they are created and any trusted domain. Local groups cannot contain other local groups.
In addition, Windows 2005 has predefined, built-in local groups, such as Administrators, Users, and Guests. By default, these built-in groups are always available on any Windows 2005 computer, unless they are removed explicitly.
To grant SQL Server access to a Windows local or global group, specify the domain or computer name on which the group is defined, followed by a backslash, and then the group name. For example, to grant access to the Windows 2005 group SQL_users in the Windows 2005 domain LONDON, specify LONDON\SQL_users as the group name. However, to grant access to a Windows built-in local group, specify BUILTIN instead of the domain or computer name. For example, to grant access to the built-in Windows local group Administrators, specify BUILTIN\Administrators as the group name to add to SQL Server. You must have appropriate permissions on the server to create Windows groups or users, or to create SQL Server users or roles. For additional information about Windows accounts, see your Windows documentation.
6.5.4 SQL SERVER LOGINS:
SQL Server logins are the account identifiers that control access to SQL Server. SQL Server will not complete a connection unless it has first verified that the login you specified is valid. This verification of the login is called authentication.
A member of the SQL Server sysadmin fixed-server role first must specify to SQL Server all the Windows accounts or groups that can connect to SQL Server. Your access to SQL Server is controlled by your Windows account or group, which is authenticated when you log on to the Windows operating system on the client. When connecting, the SQL Server client software requests a Windows trusted connection to SQL Server. Windows will not open a trusted connection unless the client has logged on successfully using a valid Windows account. The properties of a trusted connection include the Windows group and user accounts of the client that opened the connection. SQL Server gets the user account information from the trusted connection properties and matches it against the Windows accounts defined as valid SQL Server logins. If SQL Server finds a match, it accepts the connection, and you are identified in SQL Server by your Windows group or user account.
6.5.5 DATABASE ROLES:
Using database roles, you can collect users into a single unit to which you can apply permissions. Permissions granted to, denied to, or revoked from a role also apply to any members of the role. SQL Server roles exist within a database and cannot span more than one database. Because roles are unique to each database, you can reuse a role name, such as "reviewer", in each database that you create.
To assign users and groups to database roles, the users and groups must have valid Windows domain accounts and SQL Server logins. If you make any changes to the membership of database roles in your workflow application, you must synchronize the user directory for role permissions to work properly. Users can belong to more than one database role at a time. Roles can contain Windows accounts and other SQL Server users and roles. A scalable model is provided for setting up the right level of security within a database. It is easy to manage permissions in a database if you define a set of roles based on job functions and assign each role the permissions that apply to that job. Then, you can move users between roles rather than having to manage the permissions for each individual user. The owner of a role determines who can be added to or removed from the role. The owner is either the user explicitly specified as the owner when the role is created, or the user who created the role when no owner is specified. Database roles are created for a particular database. In SQL Server 7.0 and SQL Server 2005, users can belong to multiple roles; because of this, it is no longer necessary for users to temporarily assume the identity (and permissions) of other users through aliases. Note that if you plan to make a template based on a workflow application, you should use role-based permissions for everything, because the set of database users will be different for each instance of a project based on the template.
6.5.6 DATABASE USER ACCOUNT:
While a SQL Server login makes it possible for a user to access SQL Server, a database user account is required for the user to access a specific database. These user accounts can then be associated with the roles defined in your workflow application.
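When a user account belongs to several roles, SQL Server combines the roles' permissions, with an explicit DENY taking precedence over any GRANT. That resolution rule can be modeled with a tiny illustrative sketch (the class, method, and table names are hypothetical, not a SQL Server API):

```java
import java.util.HashSet;
import java.util.Set;

public class RolePermissionDemo {
    // Resolve whether a user may access an object, given the combined
    // permissions of all its roles: access requires at least one grant
    // and no deny, since deny is the most restrictive and always wins.
    static boolean hasAccess(Set<String> grants, Set<String> denies, String object) {
        return grants.contains(object) && !denies.contains(object);
    }

    public static void main(String[] args) {
        Set<String> grants = new HashSet<>();
        Set<String> denies = new HashSet<>();

        // The "admin" role grants access to the Orders table...
        grants.add("Orders");
        // ...while the "authors" role denies access to that same table.
        denies.add("Orders");
        // The "authors" role also grants access to Articles.
        grants.add("Articles");

        // Deny takes precedence for Orders; Articles remains accessible.
        System.out.println(hasAccess(grants, denies, "Orders"));
        System.out.println(hasAccess(grants, denies, "Articles"));
    }
}
```

This mirrors the semantics described below for a member of both the admin and authors roles.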
A user account can be a member of any number of roles within the same workflow application. For example, a user can be a member of the admin role and the authors role for the same database, with each role granting different permissions. The effective permissions on an object granted to a member of more than one role are the cumulative permissions of the roles, although a denied permission in one role takes precedence over the same permission granted in another role. For example, the admin role might grant access to a table while the authors role denies access to the same table. A member of both roles is denied access to the table, because denied access is the most restrictive.
7. MODULE DESCRIPTION
Implementation is the stage in the project where the theoretical design is turned into a working system, giving users confidence that the new system will work efficiently and effectively. It involves careful planning, investigation of the current system and its constraints on implementation, design of methods to achieve the changeover, and an evaluation of changeover methods. Apart from planning, the major tasks of preparing for implementation are the education and training of users. The more complex the system being implemented, the more involved will be the system analysis and design effort required just for implementation. An implementation coordination committee, based on the policies of the individual organization, has been appointed. The implementation process begins with preparing a plan for the implementation of the system. According to this plan, the activities are carried out, discussions are held regarding the equipment and resources, and any additional equipment needed to implement the new system is acquired. Implementation is the final and most critical phase in achieving a successful new system and in giving the users confidence that the new system will be effective. The system can be implemented only after thorough testing is done and it is found to be working according to the specification. This method also offers the greatest security, since the old system can take over if errors are found or if the new system is unable to handle certain types of transactions.
8. CONCLUSION
In this paper we have proposed a new secure framework for the privacy-preserving outsourced storage, search, and retrieval of large-scale, dynamically updated image repositories, where the reduction of client overheads is a central aspect. At the basis of our framework is a novel cryptographic scheme, specifically designed for images, named AES. Key to its design is the observation that in images, color information can be separated from texture information, enabling the use of different encryption techniques with different properties for each one, and allowing privacy-preserving Content-Based Image Retrieval (CBIR) to be performed by third-party, untrusted cloud servers. We formally analyzed the security of our proposals, and additional experimental evaluation of implemented prototypes revealed that our approach achieves an interesting trade-off between precision and recall in CBIR, while exhibiting high performance and scalability when compared with alternative solutions. An interesting future work direction is to investigate the applicability of our methodology - i.e., the separation of information contexts when processing data (color and texture in this contribution) - in other domains beyond image data.