
M2G (Moderators) · 1838 posts · 31 days won
Everything posted by M2G

  1. It works! On topic: Let him go to training camp without the computer; it's not a gaming camp, after all. Good for the dad! These kids fry their brains on computers far too much. The boy has already developed a psychological dependence, and that's not good. If it keeps up like this he'll probably end up suffering from depression, social rejection, and so on... We press the buttons; they don't press us.
  2. It shows up like that because you have IPv6; that's the IPv6 format. IPv4 addresses are expressed on 32 bits, IPv6 addresses on 128 bits, hence the greater length. A quick way to see this from Java is sketched below.
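A minimal sketch of that size difference using java.net.InetAddress; the class name and the two literal addresses (documentation-range placeholders) are mine, not from the post:

```java
import java.net.InetAddress;

public class AddressLength {
    public static void main(String[] args) throws Exception {
        // Literal addresses are parsed directly; no DNS lookup happens here.
        InetAddress v4 = InetAddress.getByName("192.0.2.1");   // example IPv4 address
        InetAddress v6 = InetAddress.getByName("2001:db8::1"); // example IPv6 address
        System.out.println("IPv4: " + v4.getAddress().length * 8 + " bits"); // 32 bits
        System.out.println("IPv6: " + v6.getAddress().length * 8 + " bits"); // 128 bits
    }
}
```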
  3. Update to the latest version of coailii.exe and it'll work; at least that was the problem in my case. The latest coailii patch supports "-shit -nakeporn" to get past the protection in the new algorithm ("esiafarahackere") used by Google. Good luck! // For anyone who can't manage it, I can sell the "fu_thel.py" script for €5,000 (expensive, but worth it); it performs all these operations automatically and hides the traces left on the servers' bathroom tiles.
  4. M2G

    Fun stuff

    Russian grandma kills wolf using her bare hands and an axe - Odd News - Digital Spy. Spot the similarities:
  5. With ObjectOutputStream the idea is not to close the stream until you have written all the objects, because ObjectOutputStream writes a header into the file. The moment you call new ObjectOutputStream, another instance is created which writes its own header, so the file gets overwritten. You cannot use several ObjectOutputStream instances to write one file, because the ObjectInputStream will no longer be able to read it. One idea for improving performance would be to stop writing and reading objects one at a time in a loop: build an ArrayList of Scor objects and add each Scor to it, e.g. Scor test = new Scor(/* parameters */); ArrayList<Scor> scores = new ArrayList<Scor>(); scores.add(test); Then write the whole ArrayList to the ObjectOutputStream instead of writing each object individually. When you read the object back from the file you get a single object of type ArrayList; all you have to do is add the new scores to that list and write it back to the file, so it ends up containing the old values plus the newly added ones. A minimal sketch follows this post.
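A minimal sketch of this whole-list approach; the Scor field, the scores.bin file name, and the class names are placeholders I introduced, not from the original post:

```java
import java.io.*;
import java.util.ArrayList;

// Placeholder score class; serialization requires implementing Serializable.
class Scor implements Serializable {
    int puncte;
    Scor(int puncte) { this.puncte = puncte; }
}

public class ScoreFile {
    @SuppressWarnings("unchecked")
    static ArrayList<Scor> load(File f) throws IOException, ClassNotFoundException {
        if (!f.exists()) return new ArrayList<Scor>();
        try (ObjectInputStream in = new ObjectInputStream(new FileInputStream(f))) {
            // The whole list was written as one object, so one readObject() is enough.
            return (ArrayList<Scor>) in.readObject();
        }
    }

    static void save(File f, ArrayList<Scor> scores) throws IOException {
        // A single ObjectOutputStream writes a single header followed by the list.
        try (ObjectOutputStream out = new ObjectOutputStream(new FileOutputStream(f))) {
            out.writeObject(scores);
        }
    }

    public static void main(String[] args) throws Exception {
        File f = new File("scores.bin");
        ArrayList<Scor> scores = load(f); // old values, or an empty list on first run
        scores.add(new Scor(42));         // append the new score in memory...
        save(f, scores);                  // ...then rewrite the file in one go
    }
}
```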
  6. Use PrintWriter or FileWriter, which let you append to a file: PrintWriter has a println method, and FileWriter has a write method along with a constructor whose second argument turns on append mode. In the Scor class you can write a toString() method that produces the score as a string, formatted however you want it displayed, then use toString() to write the scores to the file with FileWriter or PrintWriter; a minimal sketch follows this post. Question: why do you want to save the scores to a text file? So that the next time you open the game you can continue from the previous score?
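A minimal sketch of appending one score line per run; the scores.txt file name and the example line (standing in for a Scor.toString() result) are assumptions:

```java
import java.io.FileWriter;
import java.io.IOException;
import java.io.PrintWriter;

public class ScoreLog {
    public static void main(String[] args) throws IOException {
        String line = "player1: 1200"; // e.g. what Scor.toString() might produce
        // Passing true as FileWriter's second argument opens the file in append
        // mode, so each run adds a line instead of overwriting the file.
        try (PrintWriter out = new PrintWriter(new FileWriter("scores.txt", true))) {
            out.println(line);
        }
    }
}
```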
  7. To me it seems like the best on the market. A few days ago I installed a licensed copy of Bitdefender, and when I tried to install the Nvidia Tegra Development Kit it told me the installer was suspicious. I told it to allow the file, but it blocked it anyway and I couldn't unblock it no matter what I tried. I looked through the options and couldn't find any way to grant the file permission to run, so I uninstalled the antivirus. Maybe I just don't know how to use it properly because I'm used to KIS, but it seems absurd that I can't run something even when I explicitly want to, just because the antivirus considers it dangerous (and it was software from Nvidia). KIS seems better to me.
  8. http://www.youtube.com/watch?v=P3oBZ4_TNys
  9. This is a security forum. Yes, you post tutorials, security news, and anything else that raises the quality of the forum. You can't post in the market until you have 50 posts, so that the other users can form an opinion about the person they're dealing with. Besides, being able to post sale/purchase ads here is a privilege, since the forum gets several hundred visits a day, so your post gets exposure. You want to receive something that's offered here? Then offer something! The forum grows on the backs of people who are interested in security. Those who are interested and want to be part of this community post quality content. You give the forum quality, you get access to the market, VIP status, and so on. You don't care about any of that and just want to buy/sell? Then this isn't the place for you. I'm done with this subject; if you still don't get it, you're thick.
  10. I'm not mistaken; I think you need it drawn out for you before you understand. 1. You haven't done anything for the forum - correct. 2. You're demanding the right to post in the market even though you're not active here - correct. I didn't attack you, I explained the situation you're in. The rule stands; I told you why, and others said the same before me.
  11. Do you have something to offer? Then offer it! It's not fair to come here and make demands. People come here to be informed and to learn. It's their privilege that there's someone to learn from, and that there are users here willing to help and to build programs and tutorials without asking for anything in return. That's what community means! I don't even know how you have the nerve to kick up a fuss when you've done nothing here but ask. You can try making those 50 posts in 20 minutes and see what happens.
  12. This looks to me like a way of pointing the finger at Bitdefender, who claim to be #1 in the world.

"All animals are equal, but some are more equal than others." Thus spake Napoleon, the head-hog in Orwell's dystopian classic. The genius of this phrase lies in its universality – a small addition turns the truth inside out. Alas, this witty paradox [sic.] is met not only in farmer-revolutionary sagas, but also in such (seemingly very distant) themes as – and you won't believe this – antivirus tests! Thus, "All published AV-test results are equal, but some are more equal than others."

Indeed, after crafty marketing folk have applied their magic and "processed" the results of third-party comparative AV tests, the final product – test results as published by certain AV companies – can hardly be described as equal in value: they get distorted so much that nothing of true value can be learned from them.

Let's take an imaginary antivirus company – one that hardly distinguishes itself from its competitors with outstanding technological prowess or quality of protection, but which has ambitions of global proportions and a super-duper sales plan to fulfill them. So, what's it gonna first do to get nearer its plan for global domination? Improve its antivirus engine, expand its antivirus database, and/or turbo charge its quality and speed of detection? No, no, no. That takes faaaar too much time. And costs faaaar too much money. Well, that is – when you're in the Premiership of antivirus (getting up to the First Division ain't that hard). But the nearer the top you get in the Champions League in terms of protection, the more dough is needed to secure every extra hundredth of a real percent of detection, and the more brains it requires.

It's much cheaper and quicker to take another route – not the technological one, but a marketing one. Thus, insufficient technological mastery and quality of antivirus detection often gets compensated by a cunning informational strategy. But how? Indirectly; that's how…

Now, what's the best way to evaluate the quality of the protection technologies of an antivirus product? Of course it's through independent, objective opinion by third parties. Analysts, clients and partners give good input, but their impartiality naturally can't be guaranteed. Comparative tests conducted by independent, specialized testing labs are where the real deal's at. However, testers are peculiar beasts: they concentrate purely on their narrow trade – that'll be testing – which is good, as testing done well – i.e., properly and accurately – is no easy task. But their results can often come across as… slightly dull, and could do with a bit of jazzing up. Which is where testing marketing done by those who order the testing kicks in: cunning manipulation of objective test results – to make the dirty-faced appear as angels, and/or the top-notchers appear as also-rans.

It all becomes reminiscent of the ancient Eastern parable about the blind men and the elephant. Only in this case the marketing folk – with perfect eyesight – "perceive" the results deliberately biasedly. The blind men couldn't help their misperceptions. There's nothing criminal of course in manipulating test results. It's just very difficult for users to be able to separate the wheat from the chaff. Not good.

Example: It's easy to pluck from several nerdy and not overly professional tests selective data on, say, system resource usage of one's darling product, while keeping shtum about the results of (vastly more important) malware detection. Then that super system resource usage – and it only – is compared with the resource usage of competitors. Next, all marketing materials bang on ad nauseam about that same super system resource usage and this is deemed exemplary product differentiation!

Another example: An AV company's new product is compared with older versions of competitors' products. No joke! This stuff really does happen. Plenty. Shocking? Yes!

Thing is, no lies are actually being peddled in all of this – all it is is selective extraction of favorable data. The meddling isn't meddling of formal, fixed criteria – it's just meddling with ethical criteria. And since there are thus no legal routes to get "justice", what's left is for us to call upon you to be on your guard and carefully filter the marketing BS that gets bandied about all over the place. Otherwise, there's a serious risk of winding up with over-hyped yet under-tech'ed AV software, and of your losing faith in tests and in the security industry as a whole. Who needs that? Not you. Not anyone.

The topic of unscrupulous testing marketing has been discussed generally here before. Today, let's examine more closely the tricks marketing people use in plying their dubious trade, and how it's possible to recognize and combat them. So, let me go through them one by one…

1) Use of "opaque" testing labs to test the quality of malware detection. This is perhaps the simplest and least risky ruse that can be applied. It's particularly favored by small antivirus companies or technologically deficient vendors. They normally find a one-man-show, lesser known "testing center" with no professional track record. They're cheap, the methodology usually isn't disclosed, checking the results is impossible, and all reputational risks are for the testing center itself (and it has no problem with this). Conclusion: If a test's results don't give a description of the methodology used, or the methodology contains serious "bugs" – the results of such a test cannot be trusted.

2) Using old tests. Why go to all that trouble year after year when it's possible every two or three years to win a test by chance, and then for several years keep wittering on about that one-off victory as if it was ostensive proof of constant superiority? Conclusion: Check the date of a test. If it's from way back, or there's no link to the source (public results of the test with their publication date) – don't trust such a test either.

3) Comparisons with old versions. There are two possibilities here. First: Comparisons of old products across the board – among which the given AV company's product magically pulled a rare victory. Of course, in the meantime, the steamships of the industry have been sailing full steam ahead, with the quality of virus detection advancing so much as to be unrecognizable. Second: Comparisons of the given AV company's product with old versions of products of competitors. Talk about below the belt! Conclusion: Carefully cast an eye on the freshness of product versions. If you find any discrepancies, wool is most certainly being pulled over the proverbials. Forget these "test results" too.

4) Comparing the incomparable. How best to demonstrate your technological prowess? Easy! To compare the incomparable in one or two carefully chosen areas, naturally. For example, products from different categories (corporate and home), or fundamentally differing protection technologies (for example, agentless and agent-based approaches to protection of virtual environments). Conclusion: Pay attention not only to the versions and release dates of compared products, but also to the product titles!

5) Over-emphasizing certain characteristics while not mentioning others. Messed up a test? Messed up all tests? No problem. Testing marketing will sort things out. The recipe's simple: Take a specific feature (usually high scan speed or low system resource usage (the usual features of hole-ridden protection)), pull only this out of all the results of testing, and proceed to emblazon it on all your banners all around the globe and proudly wave them about as if they represent proof of unique differentiation. Here's a blatant example. Conclusion: if they bang on about "quickly, effectively, cheaply" – it's most likely a con-job. Simple as that.

6) Use of irrelevant methodologies. Here there's normally plenty of room for sly maneuver. Average users usually don't (want to) get into the details of testing methodologies (as they've better things to do) but, alas, the devil's always in the details. What often happens is that tests don't correspond to real world conditions and so don't reflect the true quality of antivirus protection of products in that same real world. Another example: Subjective weightings of importance given to test parameters. Conclusion: if apples and pears are being compared – you might as well compare the results of such testing with… the results of toilet-flushing loudness testing: the usefulness of the comparisons will be about the same.

7) Selective testing. This takes some serious analytical effort to pull off and requires plenty of experience and skill in bamboozlement/spin/flimflam. Here some complex statistical-methodological know-how is applied, where the strong aspects of a product are selectively pulled out of different tests, and then one-sided "comparative" analysis is "conducted" using combinations of the methods described above. A barefaced example of such deception is here. Conclusion: "If you're going to lie, keep it short"©! If they go on too long and not very on-theme – well, it's obvious…

8) Plain cheating. There are masses of possibilities here. The most widespread is stealing detection and fine-tuning products for specific tests (and onwards as per the above scenarios). In AV industry chat rooms a lot is often talked about other outrageous forms of cheating. For example, it was claimed that one unnamed developer gave testers a special version of its product tailored to work in their particular testing environment. The trick was easy: to compensate for its inability to detect all infected files in the test bed the product detected just about all files it came across. Of course as a side effect this produced an abnormal number of false positives – but guess what? The resultant marketing materials never contained a single word about them. Nice! Not. Conclusion: Only professionals can catch cheaters out. Average users can't, unfortunately. So what's to be done? Look at several competing tests. If in one test a certain product shows an outstanding result, while in others it bombs – it can't be ruled out that the testers were simply deceived somehow.

9) And the last trick – also the simplest: to refuse to participate in testing. Or to forbid testers to name products with their actual names, and instead hide behind "Vendor A", "Vendor B", etc. So why take part in the first place if the results of testing will prove that the emperor indeed wears no clothes? If a title disappears from lists of tested products, is it possible to trust such a product? Of course not. Conclusion: Carefully check whether a given product takes part in all public tests worthy of trust. If it only takes part in those where its merits are shown and no/few shortcomings – crank up the suspiciousness. Testing marketing could again be at work doing its shady stunts.

By the way, if any readers of this blog might want to investigate different antivirus test results more closely – and you find any KL testing marketing doing any of the above-mentioned monkey business – don't be shy: please fire your flak in this direction in the comments. I promise to make amends and to… have a quiet (honest!) word with those responsible.

And now – to recap, briefly: what a user should do, who to believe, and how to deal with hundreds of tables, tests and charts. (Incidentally, this has already been dealt with – here). So, several important rules for "reading" test results in order to expose crooked testing marketing:

1. Check the date of a test, and the versions and features of the tested products.
2. Check the history of "appearances" of a product in tests.
3. Don't focus on a specific feature. Look at the whole spectrum of capabilities of a product – most importantly, the quality of protection.
4. Have a look at the methodology and check the reputation of the testing lab used.

The last point (4) won't really concern all readers, maybe just specialists. For non-specialists, I also recommend getting to know the below list of testing centers. These are respected teams, with many years of experience in the industry, who work with tried and tested and relevant methodologies, and who fully comply with AMTSO (the Anti-Malware Testing Standards Organization) standards. But first a disclaimer – to head off the trolling potential from the outset: we don't always take top place in these testers' tests. It's only their professionalism that I base my recommendations on.

  • AV-test.org
  • AV-Comparatives.org
  • Virus Bulletin
  • MatouSec
  • Dennis Technology Labs

And here – a final power chord: the ranking of testing centers for 2011-2012. Some interesting background for those who yearn for more info:

And really finally finally – already totally in closing, to summarize the above large volume of letters: Probably someone will want to have a pop at me and say the above is my having a cheap pop at competitors. In actual fact the main task here – for the umpteenth time – is to (re)submit for public discussion an issue about which a lot has often been said for ages already, but for which a solution hasn't appeared. Namely: that still to this day there exist no established and intelligible methodologies for conducting comparative testing of antivirus products agreed upon by all across the board. The methodologies that do exist are alas not fully understood – not only by users, but also by developers!

In the meantime, let's just try and get along, but not close our eyes to the problem that faces this field, all the while calming ourselves with the knowledge that how far "creative manipulation" of statistics goes normally at least lies within certain reasonable limits – for every developer. As always, users simply need to make their choices better by digging deeper to find real data – separating wheat from chaff. Alas, not everyone has the time or patience for such an undertaking. Understandable. Shame.

Last words: Be alert!

Source
  13. Posting here a few more times doesn't make you a "trusted" member for the market. The post count doesn't guarantee that a buyer/seller is legitimate. But when you see a market post, you can look at the person's other posts to get a sense of what kind of person you're dealing with.
  14. Online Domain Tools - Useful tools to make your life easier
  15. Go ahead, make another 20 accounts; my hand is feeling in good form today. I want to beat my previous record for how quickly you get banned.
  16. In jd-gui choose Save All Sources and you'll get an archive containing all the sources from that .jar. If you extract it, you'll find a folder named "com" inside. Simply drag the com folder onto the src folder in Eclipse to load the sources into the Eclipse project. And don't double-post again or you'll get a warn; there's an edit button.
  17. Most of us have jobs and/or are students and don't have time to sit here 24/7 reading every post on the forum. When you find something that isn't right, report the post and someone will certainly come in and sort out the problem. Warns and bans are handed out even if we don't announce it in a post at the moment we issue them.
  18. OK, enough. Yellow card and trash.
  19. M2G

    Fun stuff

    http://www.youtube.com/watch?v=G3DHm5GSKhI
  20. Facebook friend adder pro registration bypass. Author: M2G @ Romanian Security Team. Date: 12.01.2013.

In this tutorial I'll show you a way to get past the username and password check in the Facebook Friend Adder Pro program. Before starting, I want to point out that this method is quite simple, and there's a good chance it won't work on software whose authors paid more attention to security. A description of the program, as it appears on their site:

The lively folks offer a trial version, limited to 3 runs of the application. For this demonstration I used the Windows trial: Facebook FriendAdder Pro Download - Lively 24*7 Service

The first thing we do is extract the archive and explore the file and directory structure. We can easily deduce that the application is written in Java and that a launcher is used to create an executable that starts the application. In other words, all of the application logic is encapsulated in the .jar files. We'll use jd-gui to try to decompile the .class files inside the .jar archives.

About jar files: JAR (file format) - Wikipedia, the free encyclopedia
About .class files: Java class file - Wikipedia, the free encyclopedia

We open one of the .jar files with jd-gui and, surprise: we have access to the program's entire source code, unobfuscated no less. From here we start looking for the classes and methods responsible for determining whether a valid registration exists on the system. After some searching we arrive at the file com.lively.browser_6.0.0.201212262126.jar. Navigating through the packages we reach the Preference class in com.lively.browser.preference, where we find some methods that appear to contain the authentication logic. Going through this class we find two methods (isValidMember and verify) which apparently handle user registration and validation:

```java
public static boolean isValidMember(int member, String id, String sn) {
    switch (member) {
    case 0:
        return true;
    case 1:
        return verify(id, sn);
    case 2:
        return (verify(id, sn)) && (id.length() <= 4);
    case 3:
        return (verify(id, sn)) && (id.length() <= 3);
    }
    return false;
}

public static boolean verify(String id, String sn) {
    try {
        if ((id == null) || (id.equals("")) || (sn == null) || (sn.equals(""))) {
            return false;
        }
        id = id.trim();
        sn = sn.trim();
        Signature sig = Signature.getInstance("SHA1withRSA");
        sig.initVerify(getPublicKey());
        sig.update(id.getBytes("UTF8"));
        boolean result = sig.verify(Base64.decodeBase64(sn.getBytes("UTF8")));
        if (result) {
            getInstance().idDecryptCipherMap.put(id, getDecryptCipher(sn));
        }
        if ((result) && (
            ((Integer.parseInt(id) >= 650000) && (Integer.parseInt(id) <= 650073)) || (
            (Integer.parseInt(id) >= 659990) && (Integer.parseInt(id) <= 659999)))) {
            String mac = MacAddr.getMacAddr();
            result = mac.equals(getLicenseMac(id));
        }
        return result;
    } catch (Exception localException) {
    }
    return false;
}
```

The first thing we notice is that these methods return a boolean value. From this we can deduce that if we could somehow force one of them to always return true, our registration problem would disappear and we could bypass that registration check. The source code we see in jd-gui is in fact code decompiled from the .class files inside that .jar archive; we can't modify the method directly in jd-gui and hope that will do the trick. To do that we'll use Eclipse.
Eclipse Classic is perfect for what we need. We download and run it. In Eclipse we create a new Java project and give it a name; the name doesn't matter. For this demo I chose "ffap_RST_demo". We press Finish and can see that the project has been created.

From jd-gui we have the option to save all open files as source code, so with com.lively.browser_6.0.0.201212262126.jar loaded we choose File -> Save All Sources. We extract the .zip file with the sources created by jd-gui. Now we drag & drop the com folder onto the src folder in Eclipse to load the decompiled sources into our Eclipse project. If a dialog appears, choose "copy files and folders" and press OK.

We open the class we're interested in and notice a great many errors. These occur because there are dependencies between the classes we decompiled and loaded and the classes in the other .jar files. We won't be able to modify and compile the file as long as it has errors, so these dependencies have to be resolved. To do that we right-click the Java project in Eclipse and choose Properties, select "Java Build Path" in the left panel, and select the Libraries tab on the right. Here we press "Add External JARs" and select all the other .jar files from the program's directory. We can see that most of the errors are resolved and we're left with a single error on an import.

A Google search leads us to the missing library: Codec - Download Commons Codec. We download the latest version, extract the archive, and add the .jar files to the project the same way we did with the other dependencies; that takes care of this problem too.

Now we can modify the method and compile it. We navigate to the verify method, delete its entire body, and write "return true". So the verify method shown earlier becomes simply:

```java
public static boolean verify(String id, String sn) {
    return true;
}
```

We could do the same thing with the isValidMember method, but it isn't necessary here. One important thing to know is that Eclipse compiles files the moment they are saved, so all we have to do is save the changes and go to the directory where we created the Java project. There we can see that the class we need is already compiled. Knowing that a .jar file is actually an archive, we can take the class file we compiled and swap it in for the existing one in the program's directory.
Now we run the program hoping our idea worked and we can get past the registration. But surprise: it seems we have a problem, although the application did create a log file. From it we can see that the application was compiled with Java 1.5 while we compiled with the current version, 1.7, and this appears to be the cause of the crash.

We go back to Eclipse, right-click the project, and choose Properties. From there we choose "Java Compiler" in the left panel and, in the right panel, set "Compiler compliance level" to 1.5. Now the project needs a rebuild; for that it's enough to type a space in the editor and save (Ctrl+S). If that doesn't work, use Project -> Build Project. After the build we take the file and put it into the .jar archive, as above. Running the application, we're greeted by the same registration dialog. We try registering with the credentials above (see note 1) and:
  21. You're talking nonsense! .NET is a mature framework; I don't understand where you got the idea that VB.NET is "mostly just GUI stuff". On topic: you can try both. There are plenty of tutorials online for both technologies, so you can decide for yourself which you like more. If you're a beginner, I recommend, like nytro, starting with C# instead of VB. The reason is that you'll get used to a syntax similar to C++'s, the syntax most programming languages are based on. In other words, once you're used to C# syntax it will be easier to pick up another language quickly. It all depends on you; I'd say look over both languages and see which suits you better. You'll also need to learn OOP (object-oriented programming). For Java I recommend this book in Romanian: JAVA De la 0 la Expert[RO][stefan Tanasa][Cristian Olaru][stefan Andrei][Ed. Polirom - 2003].pdf or this one, in English: http://www.mediafire.com/view/?cyb22o2wded7a6k. For C# I don't know what to recommend, because I learned it "on the go", reading what I needed and relying on my background in other languages.
  22. M2G

    Burnout

    Trashed and warned. We're RST, not Facebook.
  23. Easy.
  24. Proof?
  25. You people really do love commenting on idiotic topics!