Hackbusters Reports Largest Data Breach in History: HP Restricted Docs Hidden from the Human Eye Exposed via Google, Hash Code Exposed for Years Letting Hackers Forge Matching Data and Triggering a $60 Billion Audit of HP Assets



International Open Source News

     The hash code has been exposed for more than ten years, allowing hackers to forge data that appears to match and triggering a $60 billion audit of HP assets.

The following link shows over 37 million "HP Restricted" documents exposed worldwide, in what Hackbusters reports as the largest data breach in history; in some of the documents, the "HP Restricted" marking is hidden from the human eye (LARGEST DATABREACH IN HISTORY REPORTED BY HACKBUSTERS) (37 Million HP Restricted Documents Exposed) (2017). When you look through some of the documents you can read the material, and the "HP Restricted" marking is present in the document text, yet when you view the document you do not see it; only a text search reveals it, because it is rendered in white and is therefore whited out from the human eye, as the sketch following this paragraph illustrates. The files reveal HP confidential trade secrets that would allow competitors to steal business: they let a competitor analyze the problems and solutions HP uses to win million-dollar deals deploying converged infrastructure in various environments (Link to Confidential Document that reveals Cloud Secrets) (Highly Restricted HP Sales Playbook) (2014). The highly restricted HP sales playbook is forbidden to be viewed; HP invested $1 billion in the cloud business behind it and created the playbook back in 2011 (HP invest $1 Billion in Cloud Services) (2014). Put simply, employees are paid five- to six-figure salaries to execute million-dollar deals from these now-exposed restricted corporate documents. The $1 billion investment was to be used to beef up all aspects of HP's cloud operations, such as improvement of data centers, restricted training material for consulting and operations teams, and research into advancing core cloud technologies. The same sales playbook is still used to make strategic sales of multimillion-dollar infrastructures.

Avnet's pioneering role in bringing this solution to market was unveiled at HP Discover 2014, held in Barcelona, Spain. According to reports, Avnet will integrate and configure HP ConvergedSystem 700 to help solution provider partners bring this next-generation converged management infrastructure to market quickly (Avnet Partnering with HP to implement HP Converged System) (2014). According to reports, Avnet exposes highly restricted information used to sell the HP portfolio of cloud services; the playbook details the strategies and tactics for winning business, along with sensitive percentages and financial information (Avnet exposes HP Portfolio Playbook) (2014). Our investigators researched the documents and were surprised by the information exposed, which details the objections potential customers raise and the responses used to close an HP ConvergedSystem sale (Restricted document exposed revealing selling secrets) (2014). When the word "Restricted" is searched for in the document, it is found but invisible; it is unclear why it is not in black bold letters. Avnet's mission is to provide partners with a flexible and highly customizable HP solution that is fully tested, validated, and integrated, allowing for faster delivery and implementation in customers' data centers. The solutions Avnet offers to sell to customers are already exposed, which undermines the sale value of the HP ConvergedSystem 700. "The HP ConvergedSystem 700 is an innovative way HP is enabling the New Style of IT to help businesses drive agility, increase revenue and cut costs," said Chris Swahn, vice president and general manager, Avnet Technology Solutions, Americas, HP Solutions group.
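The hidden-marking claim is mechanically testable: a classification string drawn in white is absent from the rendered page but still present in the document's extracted text layer. Below is a minimal sketch of such a check, assuming the third-party PyPDF2 package; the filename is a hypothetical placeholder, not one of the actual exposed files:

    # Minimal sketch: search a PDF's extracted text layer for a marking
    # that is invisible on screen (e.g., rendered in white text) but
    # still present in the document. Assumes the third-party PyPDF2
    # package; "exposed_playbook.pdf" is a hypothetical filename.
    from PyPDF2 import PdfReader

    MARKING = "HP Restricted"

    reader = PdfReader("exposed_playbook.pdf")
    for page_number, page in enumerate(reader.pages, start=1):
        text = page.extract_text() or ""
        if MARKING.lower() in text.lower():
            print(f"page {page_number}: contains hidden marking {MARKING!r}")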
In the article, "Avnet's solution provider partners will have a unique opportunity to introduce this solution to their customers." According to the HP website, the HP ConvergedSystem 700 incorporates converged management with automation to simplify everyday tasks, mitigate project risk, reduce operational expenses, and improve efficiency and agility. The solution enables partners to create a high-velocity business model for their customers that delivers faster time to value with lower structured costs and fewer risks, while better positioning their customers to move toward cloud-based models. The investigators examined the exposure of the files and the large body of article content and asked whether this suggests a conspiracy to facilitate illegal activity in business dealings, or whether these partners are simply unaware that these confidential files are exposed. HP ConvergedSystem 700 powered by HP OneView is a simple, efficient, and effective way to manage next-generation application platforms. It provides a single tool to manage the entire infrastructure, including support for particular workloads such as cloud, big data, mobility, virtualization, analytics/business intelligence, or database transformation. The cutting-edge integration and logistics facility is 228,000 square feet and has the capability to build and ship more than 700,000 systems annually. The services provided by the Avnet Global Solutions Center for the HP system include full hardware and software factory installation and integration, and the documents reveal how to sell the system to customers and win multimillion-dollar deals. Avnet even has the entire HP and Microsoft sales playbook exposed, detailing the entire strategy to win buyers and how to respond to objections to score the big million-dollar deals (HP and Microsoft Restricted Sales Playbook) (2014). The following sensitive restricted documents cover the entire $1 billion investment, which went into telemarketing selling tactics, sales battle cards, and cookbooks used to sell more cloud solutions and earn more money (Restricted Cloud Asset Catalog Documents exposed by Avnet) (2014). The Asset Catalog would allow any competitor to download the complete restricted cloud portfolio documents and fully steal business. All of these documents are administered by a system administrator who does not know the files are exposed; presumably the system administrator forgot to apply restricted access to the files in the directory (the sketch after this paragraph shows how such an oversight could be detected). No one knew these files were accessible until now, which would allow competitors to steal business and cyber criminals to sell trade secrets.
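If the root cause is simply a forgotten access restriction, the oversight is detectable with a short audit script that walks the published directory tree and flags anything readable by everyone. This is a minimal sketch, assuming a POSIX filesystem; the directory path is a hypothetical placeholder:

    # Minimal sketch: walk a directory tree and flag world-readable
    # files, the kind of access a system administrator may have
    # forgotten to restrict. Assumes a POSIX filesystem; the path
    # below is a hypothetical placeholder.
    import os
    import stat

    ROOT = "/srv/ftp/pub"  # hypothetical published directory

    for dirpath, _dirnames, filenames in os.walk(ROOT):
        for name in filenames:
            path = os.path.join(dirpath, name)
            mode = os.stat(path).st_mode
            if mode & stat.S_IROTH:  # readable by "other", i.e., everyone
                print("world-readable:", path)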


In past years Hewlett-Packard has had many customers, doing everything from deployments to IT management to customer service, ranging from small businesses and large enterprises to military customers. The SHA2 hashes found would cause HP customers to file lawsuits to get their "CASH BACK"; we are talking billions of dollars, because the data can be forged by anyone who knows the SHA2 hash of the GPG keys. Below, researchers explain how SHA2 works and why it must be kept out of anyone's reach. The SHA2 hash code has been sitting exposed for multiple years, dating back to between 1998 and 2008, which would allow large enterprise customers to audit and obtain refunds for the projects invested with HP. Cyber criminals downloaded the entire forbidden FTP directory, which is now publicly accessible; documents forbidden from reproduction without permission are located under the /pub/c-products/servers/management/hpsim/ directory ("Return on Investments"), and the SHA2 hash code can be obtained to forge matching data from these two links (ftp://ftp.hp.com/pub/keys/) (ftp://ftp.hp.com/pub/keys/) (2017). The directories are currently accessible, and as of now there is no telling how many times the high-powered files have been downloaded; the $48 billion in HP assets stands to be seized, crippling HP sales. The exposure includes internal firmware and documents that are completely forbidden to view, including ROI financials, together with the secret hash that could be used to forge data.

In this case SHA2 is used, which I was able to obtain: the hash of the private key is used when signing messages, and also to check whether the correct password was entered on the user's side of the private key to ensure there is a match; this means I could create a key pair matching this hash. A digital signature "proves" conscious action of a designated signer over a piece of data; like asymmetric encryption, it involves key pairs, mathematics, and associated constraints on the signed data. Hashing is usually used in such systems as follows: we want to check that the data really came from the other person, so the signer applies his private key to the data (the "decrypt" operation in RSA terms); if we then apply his public key (the "encrypt" operation) to the result, we should get the original data back out. Think of encryption as the inverse of decryption and this suddenly makes sense. However, as noted earlier, asymmetric cryptography is slow, which is why the hash is signed rather than the full message. The following links show the GPG public keys for both HP and HPE, but the SHA2 hash codes are hidden, being used only internally to ensure data is not forged (Link to HP GPG Public Key) (Link to HPE GPG Public Key) (2017).
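The standard defense the passage gestures at, publishing a signed digest so that downloads can be verified, is straightforward to exercise with the Python standard library. A minimal sketch; the filename and expected digest are hypothetical placeholders:

    # Minimal sketch: verify a downloaded file against a published
    # SHA-256 checksum. Uses only the standard library; the filename
    # and expected digest below are hypothetical placeholders.
    import hashlib

    EXPECTED_SHA256 = "0" * 64  # placeholder for the published digest

    def sha256_of(path: str) -> str:
        h = hashlib.sha256()
        with open(path, "rb") as f:
            for chunk in iter(lambda: f.read(8192), b""):
                h.update(chunk)
        return h.hexdigest()

    digest = sha256_of("hpsim_firmware.bin")  # hypothetical filename
    print("match" if digest == EXPECTED_SHA256 else "MISMATCH: possible tampering")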

What type of hashing method is used with the private key to encrypt for signing messages?


The hash is derived using a cryptographic hash function. A cryptographic hash function usually has the following three properties:

1.   Preimage resistance: From a given hash you can't (easily) find out the corresponding input to the function

2.   2nd Preimage resistance: From a given hash and a given input you can't (easily) construct another input having the same hash.

3.   Collision resistance: You can't (easily) find any two inputs resulting in the same hash.

The usual constructions for this are SHA-1 (no longer recommended), SHA-256, and SHA-3; the sketch below illustrates why these properties depend on the full digest length.
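As an illustration (not anything drawn from the exposed files), a birthday search finds a collision on a deliberately truncated SHA-256 in seconds, while the full 256-bit digest would require on the order of 2^128 attempts. A minimal sketch using only the standard library:

    # Minimal sketch: birthday-search a collision on a truncated
    # SHA-256 digest. A 24-bit truncation collides after roughly
    # 2**12 random inputs; the full 256-bit digest would take about
    # 2**128 attempts, which is why collision resistance holds.
    import hashlib
    import os

    TRUNCATE_BYTES = 3  # 24 bits; tiny on purpose

    seen = {}
    attempts = 0
    while True:
        attempts += 1
        message = os.urandom(8)
        tag = hashlib.sha256(message).digest()[:TRUNCATE_BYTES]
        if tag in seen and seen[tag] != message:
            print(f"collision after {attempts} attempts:")
            print(f"  {seen[tag].hex()} and {message.hex()} -> {tag.hex()}")
            break
        seen[tag] = message

Each byte of truncation cuts the search by roughly a factor of 16, which is exactly why collision resistance is stated in terms of the full output size.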


So, why not hash the data and sign the hash? This way we can verify the integrity of the data without someone being able to make data that seems to match. In this case, someone who has the hashing code can make data that seems to match, and the ultimate failure would be having to conduct an audit on Hewlett-Packard's $52 billion in assets (HP $52 Billion in assets Wiki) (2017). However, even with well-known hashes (say, a "known good" hash set used for file integrity), a potential problem remains: what if an attacker not only modifies the file but discovers a hash collision that allows him to manipulate the protected data while retaining the same hash? The same hash, downloadable above in the keys directory, would allow an attacker to retain the same hash; ACL permissions should have been applied to the keys directory to prevent anyone from obtaining the secret hash (How hash ensure integrity of data?) (2017). The system described above has some problems. It is slow, and it produces an enormous volume of data, at least double the size of the original information. An improvement on the above scheme is the addition of a one-way hash function in the process. A one-way hash function takes variable-length input, in this case a message of any length, even thousands or millions of bits, and produces a fixed-length output, say 160 bits. The hash function ensures that if the information is changed in any way, even by just one bit, an entirely different output value is produced (Hash functions).
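That one-bit sensitivity, often called the avalanche effect, is easy to demonstrate with the standard library:

    # Minimal sketch: flip one bit of the input and compare SHA-256
    # digests. Even a one-bit change produces an unrelated output.
    import hashlib

    original = bytearray(b"HP ConvergedSystem 700 order: $1,000,000")
    tampered = bytearray(original)
    tampered[0] ^= 0x01  # flip the lowest bit of the first byte

    d1 = hashlib.sha256(bytes(original)).hexdigest()
    d2 = hashlib.sha256(bytes(tampered)).hexdigest()
    print(d1)
    print(d2)
    print("digests identical?", d1 == d2)  # always False in practice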


OpenPGP uses a cryptographically strong hash function on the plaintext the user is signing. This generates a fixed-length data item known as a message digest. (Again, any change to the information results in a totally different digest.) Below is a diagram of how the hash function works on data:

[Diagram not reproduced: variable-length data passes through the hash function to produce a fixed-length message digest.]
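OpenPGP's actual packet format is not reproduced here, but the sign-the-digest scheme it describes can be sketched with the third-party Python cryptography package, with RSA-PSS standing in as an illustrative substitute for OpenPGP's internals:

    # Minimal sketch of the sign-the-digest scheme OpenPGP describes,
    # using the third-party "cryptography" package. RSA-PSS here is an
    # illustrative stand-in, not OpenPGP's actual packet format.
    from cryptography.hazmat.primitives import hashes
    from cryptography.hazmat.primitives.asymmetric import padding, rsa

    message = b"the plaintext the user is signing"

    private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
    pss = padding.PSS(mgf=padding.MGF1(hashes.SHA256()),
                      salt_length=padding.PSS.MAX_LENGTH)

    # Sign: the library hashes the message with SHA-256 (the message
    # digest) and signs that digest with the private key.
    signature = private_key.sign(message, pss, hashes.SHA256())

    # Verify: raises InvalidSignature if the message or signature changed.
    private_key.public_key().verify(signature, message, pss, hashes.SHA256())
    print("signature verified")

Signing the short fixed-length digest rather than the full message is what makes the scheme fast, addressing the slowness of asymmetric cryptography noted above.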
On the other hand, for a "good" hash function, there is no known way to compute half of the output knowing just the other half; otherwise, fast attacks against preimage resistance would work, which is the concern with the exposed SHA2 hash code and the resulting $60 billion audit and seizure of assets. This yields the following scheme: from your key K, compute SHA-256(K). Split the 256-bit output into two 128-bit halves. Use the first half for indexing (the half you show), and the second half for the actual encryption (Is it safe to expose SHA Code?) (2017).
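A minimal sketch of that split scheme, using only the standard library; the key material below is a made-up example:

    # Minimal sketch of the split scheme: hash the key K with SHA-256,
    # use the first 128-bit half as a public index, and keep the second
    # half secret as the encryption key. K is a hypothetical example.
    import hashlib

    K = b"hypothetical key material"

    digest = hashlib.sha256(K).digest()   # 32 bytes = 256 bits
    index_half = digest[:16]              # shown publicly (indexing)
    encryption_half = digest[16:]         # kept secret (encryption)

    print("index half (safe to show):", index_half.hex())
    print("encryption half (secret): ", encryption_half.hex())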

https://gpgtools.tenderapp.com/kb/how-to/introduction-to-cryptography


Exposed data link with no ACL permissions applied: ftp://ftp.hp.com/pub

Source: (MIMIC SHA2 Code to make match on forged data) (2017)

==============================================

Anyone can surf the web and purchase a 1-terabyte hard drive for as little as $54.99 from 520,000 sources, and anyone can extract all of a corporation's sensitive data and resell it on the Deep Web, because incorrect permissions replicated onto internal servers expose massive amounts of sensitive forbidden documents on over 75 million servers, rooted in the incorrect configuration of the DNS root zone's 13 named authorities.


13 DNS root zone named servers rigged

Anyone in the world can simply walk, take a bus, or drive any make or model of car to Wal-Mart and purchase a 1-terabyte hard drive for as little as $61 to $100; to make it even more convenient, anyone can surf the web and purchase a 1-terabyte hard drive for as little as $54.99 from 520,000 sources, and anyone can extract all of a corporation's sensitive data and resell it on the Deep Web. Researchers who inspected this data found it to be major financial data, trade secrets, and other highly sensitive information, ranging from small business owners to massive corporations (Google vendors selling 1 Terabyte harddrives) (2017). The problem is that this glitch involving the 13 DNS root servers has existed throughout past fiscal years, yet it was ignored as a rule of thumb by info tech professionals and developers so that large, private, and mid-sized corporations could profit. The cost and burden would be close to the U.S. national debt; however it is calculated, it would be an outrageous amount. As of the writing of this article, this is information technology's most complex disaster in U.S. history and worldwide, which would be like powering down the U.S. power grid, except these are the 13 DNS root zone servers that have been running since the Internet first matured.

The Internet matured in the 1970s as a result of the TCP/IP architecture that came out of the joint work of Bob Kahn at ARPA and Vint Cerf at Stanford, among others. Vint Cerf explains it like this: "Actually, Bob produced a list of desiderata for open networking, but the ideas of TCP and later TCP/IP arose out of our joint work while I was at Stanford and he was at ARPA. I would not say that the architecture was originally developed by Bob at BBN. He came to Stanford with the problem and we jointly developed these concepts from about March to September 1973, at which point we briefed the International Network Working Group meeting in London in September 1973 and then published a paper in the IEEE Transactions on Communications, May 1974 issue" (HISTORY OF THE INTERNET) (2017). The Unix to Unix Copy Protocol (UUCP) was invented in 1978 at Bell Labs, and Usenet was started in 1979 based on UUCP. Newsgroups, which are discussion groups focusing on a topic, followed, providing a means of exchanging information throughout the world. While Usenet is not considered part of the Internet, since it does not use TCP/IP, it linked Unix systems around the world, and many Internet sites took advantage of the availability of newsgroups; it was a significant part of the community building that took place on the networks.

TCP/IP was adopted by the Defense Department in 1980, replacing the earlier Network Control Protocol (NCP), and was universally adopted by 1983, which is when the sin of countless server migrations began: incorrect permissions, once set, mutated and accreted into a massive rubber-band ball (Permission problems after domain migration) (2017) (1.3 Million Howto Guides to do Domain migrations involving permissions) (2017). When I Google domain migration permission problems, I get 8 million search results, compared to 1.3 million results for how to do a domain migration, which shows what an impossibly complex situation this is (8 Million search results to migration permission problems) (2017).
The catch-22 is that many minds came up with ideas to build corporations and businesses in many different technological scenarios, which made this situation very complex, more like a computer grinder slaughtering corporate capital. The issue starts from the 13 DNS root zone servers: servers on many public and private internal networks ultimately resolve through these 13 named root authorities. The authoritative name servers that serve the DNS root zone, commonly known as the "root servers", are a network of hundreds of servers in many countries around the world, and they are configured in the DNS root zone as 13 named authorities (INTERNET ROOT SERVERS) (2017), which can be enumerated with the sketch after this paragraph; the claim here is that permissions were replicated onto internal servers, exposing massive amounts of sensitive forbidden documents. A root name server is a name server for the root zone of the Domain Name System (DNS) of the Internet: it directly answers requests for records in the root zone and answers other requests by returning a list of the authoritative name servers for the appropriate top-level domain (TLD). The DNS root zone is the top-level DNS zone in the hierarchical namespace of the DNS.

The following link shows all corporations registered with the Securities and Exchange Commission, and beyond that there are 45,508 companies listed on stock exchanges around the world (List of Companies at SEC) (2017). These are listed companies; the number of formal unlisted companies would be a wild guess, as there is no central quantified effort to build an international registry. Taking a different approach: in countries like the USA there is one company for every 11 people, while in some others there are none. A rough extrapolation based on economic freedom and worldwide business numbers from UN reports suggests one company for every 60 people on the planet, which brings us to roughly 115 million companies. According to reports, the latest article researchers found, by "Neira Jones", reported the largest U.S. data breach on the Paper.li site that covers large to mid-sized data breaches (Hackbusters Reports Largest Databreach in History) (2017).
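The 13 named authorities are public and resolvable as a.root-servers.net through m.root-servers.net, so enumerating them takes only the standard library resolver:

    # Minimal sketch: resolve the 13 DNS root zone named authorities,
    # a.root-servers.net through m.root-servers.net, using the
    # standard library resolver.
    import socket
    import string

    for letter in string.ascii_lowercase[:13]:  # a through m
        host = f"{letter}.root-servers.net"
        try:
            print(host, "->", socket.gethostbyname(host))
        except socket.gaierror as exc:
            print(host, "lookup failed:", exc)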

Google currently processes over 20 petabytes of data a day


According to this source, as of 2014 there were an estimated 75 million servers powering the internet, with Microsoft having the most servers at 1 million and Google having 900,000. An authoritative nameserver is a DNS server that holds the actual DNS records (A, CNAME, PTR, etc.) for a particular domain or address. The problem occurs with altered permissions on the particular domain or address; many servers run Apache, nginx, IIS, and other web server products, causing the 200 terabytes of data to be downloadable. A recursive resolver is a DNS server that queries an authoritative nameserver to resolve a domain or address. The Internet Corporation for Assigned Names and Numbers (ICANN) looks after most top-level domains; it operates the Internet Assigned Numbers Authority (IANA) and is responsible for maintaining the DNS root zone. A DNS server that keeps an entire top-level zone is called a TLD name server. You can also configure your server to forward queries for specific domain names using conditional forwarders; a DNS server on a network is designated as a forwarder when the other DNS servers in the network are configured to forward the queries they cannot resolve locally to that server.

DNS: why it's important and how it works. The Domain Name System (DNS) is used to resolve human-readable hostnames like www.Dyn.com into machine-readable IP addresses like 204.13.248.115. DNS also provides other information about domain names, such as mail services. Google Public DNS operates recursive name servers for public use at 8.8.8.8 and 8.8.4.4 for IPv4 service, as well as 2001:4860:4860::8888 and 2001:4860:4860::8844 for IPv6 access.

Currently, there are over 1 billion websites on the World Wide Web. This milestone was first reached in September 2014, as confirmed by NetCraft in its October 2014 Web Server Survey and first estimated and announced by Internet Live Stats (see the tweet from the inventor of the World Wide Web, Tim Berners-Lee). The indexed Web contains at least 4.62 billion pages (Wednesday, 30 March 2016); the Dutch indexed Web contains at least 231.99 million pages (Wednesday, 30 March 2016). As of 2014, Google had indexed 200 terabytes (TB) of data; to put that into perspective, 1 TB is equivalent to 1,024 gigabytes (GB), and Google's 200 TB is just an estimated 0.004 percent of the total Internet. Perhaps even more impressive is the fact that 16 years of video is uploaded to YouTube every day. Verisign, a global leader in domain names and Internet security, announced that six million domain names were added to the Internet in the first quarter of 2015, bringing the total number of registered domain names to 294 million worldwide across all top-level domains (TLDs) as of March 31, 2015, according to the latest Domain Name Industry Brief.

The reality is that DNS queries can also use TCP port 53 if UDP port 53 is not accepted. With the impending deployment of DNSSEC and the eventual addition of IPv6, firewalls will need to forward both TCP and UDP port 53 packets. DNS servers listen on UDP port 53 to accept queries from client resolvers, while TCP port 53 is used for zone transfers, and the claim is that permissions are transferred along with the migrations as well. According to a number of news sources such as Paper.li, Hackbusters, Hacker News, PHPDrill, and others, the data breach came from an HP-Google partnership (U.S History Largest Data Breach) (2017).
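The recursive-resolver behavior described above can be exercised directly, for example by pointing a query at Google Public DNS. A minimal sketch, assuming the third-party dnspython package:

    # Minimal sketch: send an A-record query to the Google Public DNS
    # recursive resolver (8.8.8.8). Assumes the third-party dnspython
    # package; queries travel over port 53 (UDP, falling back to TCP).
    import dns.resolver

    resolver = dns.resolver.Resolver(configure=False)
    resolver.nameservers = ["8.8.8.8"]

    answer = resolver.resolve("www.dyn.com", "A")
    for record in answer:
        print("www.dyn.com ->", record.address)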
The partnerships created among the tech companies, indulging in sanctioned projects of consumption and consolidation involving migrations, would be assessed by courts to liquidate the assets of the companies that joined HP-Google in other offspring ventures. The companies' combined assets total $829 billion to be audited because of this Armageddon of a data breach, the largest known to mankind; the audited assets would come from HP, Google, and AT&T, among other companies that partnered with HP-Google, because POGO blames HP-Google for the "SMBITINABOX" data breach (POGO Prosecutors blame HP-Google on TWITTER) (2017). The writing of this article leaves the entire Internet crippled by this finding and by the ease with which anyone can remotely extract terabytes of high-powered data. When will the audit begin to replenish the unknown trade secrets, the not-for-public documents, and the confidential, secret, and top secret documents from many nations, losses that run from billions into trillions of dollars in R&D? Corporations will file lawsuits once they recognize revenue loss due to this massive data breach, once they discover that their sensitive confidential, do-not-distribute, not-for-public-inspection documents and trade secrets are now readily available for download. But can corporations and business owners recoup the loss and theft of their investments?


