
Snail

Members
  • Posts

    26
  • Joined

  • Last visited

Reputation

17 Good

About Snail

  • Rank
    Registered user
    Newbie
  • Birthday 02/02/1998


  1. TIOBE Software: The Coding Standards Company
  2. A very interesting article. How to be a Programmer: A Short, Comprehensive, and Personal Summary
  3. .:: Phrack Magazine ::. Every young person should know this article!
  4. |=-----------------------------------------------------------------------=|
|=--------------------=[ The Fall of Hacker Groups ]=--------------------=|
|=-----------------------------------------------------------------------=|
|=--------------=[ Strauss <strauss@REMOVEME.phrack.org> ]=--------------=|
|=-----------------------------------------------------------------------=|

--[ Contents

1 - Introduction
2 - Background
3 - Nowadays
4 - Conclusion
5 - Shouts
6 - Bibliography
7 - Notes

--[ 1 - Introduction

The earlier, bigger part of hacking history often had congregations as protagonists. From CCC in the early 80s to TESO in the 2000s, through LoD, MoD, cDc, L0pht, and the many other sung and unsung teams of hacker heroes, our culture was created, shaped, and immortalized by their articles, tools, and actions.

This article discusses why we do not see many hacker groups anymore, and why the ones we do see, such as Anonymous and its satellite efforts, do not succeed in having the same cultural impact as their forefathers.

--[ 2 - Background

Hacking is, in its very essence, an underground movement. Those who take part in it have always been the ones who (ab)used technology in ways beyond the knowledge of the larger userbase. It is tightly linked to intense efforts in unveiling previously unknown information as well as in sharing these discoveries. These premises have held true for as long as we have known hackers: from the days when computers had barely any users up until the mass informatization of today.

The nature of hacker interests intrinsically poses difficulties: growing knowledge of anything is hard. It requires heavy research and experimentation, and can turn into an endless journey if objectives are not carefully set. Just like any field of scientific study, it calls for a good amount of collaboration, an attitude which, luckily for hackers, was greatly enabled by the advent of computer networks and, most notably, the Internet.

Computer networks increasingly made it possible to transmit unlimited and uncensored information across their geographical extent with little effort, little cost, and in virtually no time. From the communication-development standpoint, one would expect the events from the 80s to our days to lead to a geometric progression in the number of hacker communities. In effect, hacking has arguably grown. Hacker communities definitely have not. So what went wrong?

--[ 3 - Nowadays

We live in days of limited creativity. Moreover, as contradictory as it may seem, it looks particularly rare for creativity to arise from groups or teams. Communities, rather than individuals, should be more intellectually empowered to create, but lately we have been watching the force of the solo, the age of the ego. That, of course, is when we see anything that catches our attention for originality at all, which is an ever scarcer pleasure.

In "Time Wars" [1], Mark Fisher explains that post-Fordism has taken us to this catatonic inability to innovate. Our nearly obsessive compulsion for work consumes not only our time, in the literal form of labor hours, but our minds, by distracting us from everything else we could otherwise be doing. These distractions include our unceasing connection to ubiquitous media (e.g. the frequent checks for new e-mail, or accesses to social networks on mobile devices) as well as an increased concern with financial stability and provisioning, a concern that grows as welfare is invariably trimmed by both governments and the private sector.

It is important to note that our capitalist worries are more deeply rooted in us than might seem at first, even in the most politically diverse people. Supporting oneself is not easy; it does not come for free. Getting an education, finding a job, staying up-to-date... regardless of what your aspirations are, whatever you feel obliged to do is probably a lot already. And it likely involves a prevalence of "minding your own business".

The unsettlement created in our thoughts affects intellectual solidarity in even more severe ways than it does individual creation. Simply put, if it is already so difficult for one person to focus away from these "distractions" and into inspired productivity, it is harder still for a group to join in a true collective mind. The ties that bind collective-minded parties together take dedication to build, and our egotistical concerns do not help (see note "A"). Commitment is required not only for the actual work to be accomplished, but also to identify the shared values and goals that enable true human connectivity.

Notice this does not concern _collaboration_ as much as it does _collectiveness_. Collaboration typically breaks down the creative process so that it can be achieved incrementally with very self-sufficient, individualistic contributions. Such is the case in most open-source software projects. Roles are well segregated so that a minimum of human integration is required, as far as most modern software development processes go, anyway. A true "hive mind" [2] cannot exist without the support of a stronger, more unisonant cognitive bond. Funnily enough, the popular variants of LOIC, the DDoS tool used by "Anonymous", contain a "hive mind" feature (i.e. getting a target automatically from a given IRC server and channel and firing your packets against it). You wish it were that easy.

The concept of the "conscience collective" was first established by Emile Durkheim who, in his 1893 book "The Division of Labor in Society", expressed 'that the more primitive societies are, the more resemblances (particularly as reflected in primitive religion) there are among the individuals who compose them; inversely, the more civilized a people, the more easily distinguishable its individual members', as put by R. Alun Jones [3].

Well, following (or despite) the prosperous adoption of atheism and agnosticism as professed on the Internet and in other popular media, it is understood that religious belief is at a low, taking with it a bit of what societies traditionally saw as a point of unity. In fact, there seems to be an ever-growing search for uniqueness in the modern man, especially in the apparently overpopulated metropolises (see note "B"). In this never-ending crowd of interesting, outstanding personas, we want to shine somehow, to prove ourselves different and original. In the end, it turns into a pointless battle, against God-knows-who, for apparent singularity. Instead of reaching for the fellow man, we want to set ourselves apart, and thus remarkable.

--[ 4 - Conclusion

Modern life nearly conspires against the collective. We are tormented by a relentless flow of information as well as the daily worries of an eternally insecure, unwarranted life. Furthermore, we dread the thought of being alike, of sharing multiple views and opinions. As such, we turn progressively more judgemental of whom we should partner with, on the basis that "they do not understand". In hacking, this also implicates the delicate subject of trust, which would require an essay of its own, given the undeniable importance the matter has acquired over the years.

If our thoughts on creating hacker groups were to be summarized, this is how they would look: no one ever feels like we do. They are not to be trusted, and we do not have the time for them. The only attitude consonant with our search for a comfortable, safe life is to constrain ourselves to our own limitations, ignore the intelligent life out there, and surrender to the mediocrity that our society has condemned our leisure time to.

--[ 5 - Shouts

My only acknowledgements go to whoever reads this through and puts his/her thoughts to it. I eagerly await your comments.

--[ 6 - Bibliography

1 - "Time Wars", Mark Fisher - Gonzo (circus): incubate-special-exclusive-essay-time-wars-by-mark-fisher/
2 - "Collective Consciousness", Wikipedia
3 - Excerpt of "Emile Durkheim: An Introduction to Four Major Works", Robert Alun Jones - The Division of Labor in Society (1893)

--[ 7 - Notes

A) In respect to social networks, while they are a valid community-building mechanism in nature, selfishness prevails in common usage, by means of the indulgent pleasure that fuels chronic "pluggedness", at times voyeuristic, at times exhibitionist and needy.

B) It is arguably the case, though, that the globalizing aspect of the Internet has brought the feeling of upsetting commonality to the citizens of even the less populated places.

An interesting article.
  5. Snail

    Hall of Fame

    I propose creating a section like this, listing the important people this forum has: members who have contributed to the development of the forum, those who ran successful projects that promoted it, or those who found vulnerabilities or other such things in the forum (something like https://www.hackthissite.org/hof, etc.). That way, new members would understand which people to respect... P.S.: I apologize if this is a silly idea, but it crossed my mind and I thought I would suggest it.
  6. IBM has announced it has surmounted one of the biggest hurdles on the road toward creating the world's first truly usable quantum computer. A number of analysts have predicted that the jump from traditional computing to quantum chips could be on par with the revolution we saw when the world moved from vacuum tubes to integrated circuits back in the early sixties.

The reason for this increased power is that quantum computers are capable of processing many more calculations at once than traditional CPUs, because instead of a transistor existing in one of two states (on or off) independently of the others, a quantum bit can be in both at the same time. How is that possible? While the specifics of the mechanism involve a bit more math than I could sit through in college, at its essence the computer is taking advantage of a quantum phenomenon known as "superposition," wherein an atom can act as both a wave and a particle at once. In short, this means that, at least in theory, quantum bits (or "qubits") can process twice as much information twice as fast.

This has made the race to create the world's first true quantum computer something of a Holy Grail for big chip makers, who have found themselves inching closer to maxing out Moore's Law as 22-nanometer transistors shrink to 14nm, and 14nm tries to make the jump to 10.

Related: Leaked table of Intel's sixth-generation processors packs few surprises

So far we've seen just one company pull out in front of the herd with its own entry, D-Wave, which first debuted all the way back in 2013. Unfortunately for futurists, the D-Wave is more a proof of concept that quantum computing is at least possible, and still not necessarily all that much quicker than what we have to work with today.

Now, though, according to a statement released by IBM Research, it seems Big Blue may have found a way around one of the biggest problems in quantum computing by sorting out something known as "quantum decoherence." Decoherence is a stumbling block that quantum computers run into when there is too much "noise" surrounding a chip, whether from heat, radiation, or internal defects. The systems that support quantum chips are incredibly sensitive pieces of machinery, and even the slightest bit of interference can make it impossible to know whether the computer was able to successfully figure out that two plus two equals four.

IBM was able to address this by upping the number of qubits laid out on a lattice grid from two to four, so the computer can detect these errors by running queries against itself and automatically correcting for any difference in the results. In layman's terms, this means that researchers can accurately track the quantum state of a qubit without altering the result through the act of observation alone.

"Quantum computing could be potentially transformative, enabling us to solve problems that are impossible or impractical to solve today," said Arvind Krishna, senior vice president and director of IBM Research, in a statement.

Related: Intel may turn to Quantum Wells to enforce Moore's Law

While that may not sound huge, it is still a big step in the right direction for IBM. The company believes the quantum revolution could be a potential savior for the supercomputing industry, a segment projected to be hardest hit by the imminent slowdown of Moore's trajectory. Other possible applications include solving complex physics problems beyond our current understanding, testing drug combinations by the billions at a time, and creating unbreakable encryption through the use of quantum cryptography.

It seems these kinds of computers will lead to "ultimate security."
Source: Quantum computing may not be as far off as we think, says IBM | Digital Trends
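IBM's actual scheme detects quantum bit-flip and phase-flip errors with four qubits on a lattice; as a loose classical analogy only (not IBM's circuit, which must avoid directly measuring the data), a repetition code shows how redundancy plus majority voting lets a machine catch and correct a flipped bit:

```python
# Classical analogy only: a 3-bit repetition code. Quantum error
# correction is subtler, but the core idea of spending extra bits
# on redundancy so that a single corrupted bit can be detected and
# corrected is the same.

def encode(bit):
    """Encode one logical bit as three physical copies."""
    return [bit, bit, bit]

def correct(bits):
    """Majority-vote decoding: recovers the logical bit as long as
    at most one of the three copies was flipped by noise."""
    return 1 if sum(bits) >= 2 else 0

codeword = encode(1)
codeword[1] ^= 1               # noise flips the middle copy
assert correct(codeword) == 1  # majority vote still recovers 1
```

The price of the protection is the extra copies, which mirrors why IBM needed four qubits instead of two.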
  7. Now that machine-learning algorithms are moving into mainstream computing, the Massachusetts Institute of Technology is preparing a way to make the technique easier to use in everyday programming.

In June, MIT researchers will present a new programming language, called Picture, that could radically reduce the amount of coding needed to help computers recognize objects in images and video. It is a prototype of how a relatively novel form of programming, called probabilistic programming, could reduce the amount of code needed for such complex tasks. In one test of the new language, the researchers were able to cut thousands of lines of code in one image-recognition program down to fewer than 50. They plan to present the results at the Computer Vision and Pattern Recognition conference in June.

With probabilistic programming, "we're building models of what faces look like in general, and use them to make pretty good guesses about what face we're seeing for the first time," said Josh Tenenbaum, an MIT professor of computational cognitive science who assisted in the work.

A new twist

Picture uses statistical inference to cut away much of the basic computational work needed for computer vision. It works much like the inverse of computer animation. Computer graphics programs, such as those used by Pixar and other animation companies, make two-dimensional representations of three-dimensional objects, given a relatively small amount of instruction from programmers. The Picture language works in the opposite direction: it can recognize an object within a two-dimensional image by comparing it to a set of models of what the object could be.

The work stems from a program that the U.S. Defense Advanced Research Projects Agency launched in 2013 to develop probabilistic programming languages to further facilitate the use of machine learning.

Although an academic pursuit for decades, machine learning is quickly becoming a feasible technique for commercial use, thanks to more powerful computers and new cloud machine-learning services offered by Amazon Web Services and Microsoft Azure. Although probabilistic programming does not require machine learning to work, it can provide a way to streamline the use of machine learning, Tenenbaum said.

"In pure machine learning, you drive performance increases by just collecting more and more data and just letting machine learning do the work," Tenenbaum said. In probabilistic programming, "the underlying system is more knowledge-based, using the causal process of how images are formed," Tenenbaum said.

Picture is one of a number of probabilistic programming languages that MIT is currently working on. Another, more general-purpose probabilistic programming language from the team, called Venture, can be used to solve other kinds of problems, Tenenbaum said.

Source: MIT's Picture language could be worth a thousand lines of code | PCWorld
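The "inverse graphics" idea described above, recognizing an object by comparing the observed image against renderings of candidate models, can be sketched as a toy scoring loop. The model set, the 3x3 "renderings", and the pixel-match score below are all invented for illustration; Picture's real inference over probabilistic programs is far more sophisticated:

```python
# Toy analysis-by-synthesis: pick the candidate model whose
# rendering best explains the observed image. This only illustrates
# the compare-against-renderings idea, not Picture's actual engine.

# Invented 3x3 binary "renderings" of two candidate shapes.
MODELS = {
    "cross":  [0, 1, 0,
               1, 1, 1,
               0, 1, 0],
    "square": [1, 1, 1,
               1, 0, 1,
               1, 1, 1],
}

def score(rendering, observed):
    """Count matching pixels: higher means a better explanation."""
    return sum(r == o for r, o in zip(rendering, observed))

def recognize(observed):
    """Return the name of the model whose rendering best matches."""
    return max(MODELS, key=lambda name: score(MODELS[name], observed))

noisy_cross = [0, 1, 0,
               1, 1, 1,
               0, 0, 0]          # a cross with one corrupted pixel
print(recognize(noisy_cross))    # prints: cross
```

The match survives the corrupted pixel because scoring is graded rather than exact, which is the same reason a probabilistic model can make "pretty good guesses" about a face it has never seen.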
  8. What do you think about this book: http://www.amazon.com/The-Programming-Language-2nd-Edition/dp/0131103628 If anyone has a download link, please help me out. P.S.: I would also be grateful for some books on Python.
  9. I found the following puzzle very interesting: During the recent BrainBashers cipher convention, a binary code contest took place. The contest consisted of a binary code transmission where the spaces between the letters were missing and there was no punctuation. Each letter of the alphabet was translated into its binary equivalent based on its position in the alphabet: a=1, b=10, c=11, d=100, e=101, f=110, g=111, h=1000, i=1001, j=1010, k=1011, l=1100, m=1101, n=1110, o=1111, p=10000, q=10001, r=10010, s=10011, t=10100, u=10101, v=10110, w=10111, x=11000, y=11001, z=11010. What is the answer to the question being asked? 110011101001000100110011100110011110110 101100101100110010011101101001111010111 001010010000101011101011010110010110011 010010001111101011111000101001001101001 011111111010111001001000101110010000100 111010011100111011101100110011100111011 000011001011000110101101100111010010011 111111010111100011010010011001111111110 101100001100101011001111111110101 I wonder whether anyone can solve it. PS: I tried, but I gave up after the first row.
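Because the letter codes are 1 to 5 bits long and the separators are gone, one bit-string can be segmented into letters in many ways, so decoding is a search problem. A small backtracking decoder (a sketch for attacking the puzzle, not an official solution) enumerates every valid segmentation; a real message would then be picked out by eye or with a dictionary:

```python
# Backtracking decoder for the puzzle's cipher: a=1, b=10, ..., z=11010,
# concatenated with no separators. Enumerates all readings of a bit-string.
import string

# Map each letter's binary code (no leading zeros) to the letter.
CODES = {bin(i)[2:]: ch for i, ch in enumerate(string.ascii_lowercase, 1)}

def decodings(bits, prefix=""):
    """Yield every way to read `bits` as a sequence of letters."""
    if not bits:
        yield prefix
        return
    # Try every code length from 1 to 5 bits at the current position.
    for length in range(1, min(5, len(bits)) + 1):
        code = bits[:length]
        if code in CODES:
            yield from decodings(bits[length:], prefix + CODES[code])

# Sanity check on a known word: "hi" encodes as 1000 + 1001, and the
# decoder finds "hi" among the candidate readings of that string.
assert "hi" in set(decodings("10001001"))
```

Note that no valid code starts with 0, which prunes the search heavily; still, the candidate count grows quickly with input length, so running this on the full nine rows yields many readings to sift through.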
  10. Snail

    Udemy

    www.udemy.com/learn-the-basics-of-ethical-hacking-and-penetration-testing/?couponCode=freebuf I apologize if this has already been posted.