Dae314 last won the day on November 23 2012

Dae314 had the most liked content!

Community Reputation

54 Accepted

About Dae314

  • Rank
    Tale Master of Tales
  • Birthday 08/10/1991


  1. Windows defrag, Windows firewall (unless you have strict security requirements, in which case a lot of people I know used ZoneAlarm last I asked, which was like 4 years ago), and CCleaner. For games? NVIDIA and AMD both have tuning software out; I don't remember the names. There's also stuff like Game Booster by Razer, but I don't think that kind of software does much. You need to manually tune your system. If you mean reorganizing file systems, a free disk analyzer can help with that as well. Malwarebytes to clean out an existing infection, Avast to make sure you don't get infected again. idk, I usually xdcc.
  2. I'm pretty sure you can find an ISO of Windows XP SP3 floating around somewhere. Here's my advice though: dual boot or use a VM. Run regular Windows 7/8 as your primary OS and use a VM or a second partition for Windows XP (depending on how much performance you need). A VM is more convenient since you won't have to reboot to jump into it, but a partition runs faster. I successfully got RollerCoaster Tycoon I and II running on my Windows 8.1; however, during that experience I found out that Windows 8.1 is not as legacy-friendly as Windows 7. One of the main problems I ran into was the OS display effects. I don't know the official name, but in Windows 7, under the application compatibility tab, you can disable the Aero effects component of Windows completely. Windows 8 does not let you do that (even if you switch to a flat theme it doesn't get disabled), since the Metro start screen and the sidebar stuff use it. If the game was popular, there are usually instructions scattered around the internet for getting it running in Windows 7/8.
  3. I guarantee you that there are some users here who won't read this thread but will read the front page.
  4. If the old address will redirect to the new site, how will we know when the cutover happens? Can we modify the banner or something for a couple weeks to say "Change yo bookmarks now!!!" or something?
  5. That sounds like an external firewall, although I'm sure there are other ways to set that up. The benefit of NFS is that you should be able to mount the shared folders from the Pi on your Windows boxes as drive letters. After that you just treat it like any normal drive. Not sure what you mean by SSH... SFTP? So you'd just set up the Pi as an FTP server and store files on there?
  6. My first instinct for the NAS would be to use NFS. I don't know how well that works going to a Windows client, nor do I know whether the Raspbian OS you want to use supports NFS. If you can get NFS working Linux -> Windows, you just have to have everything on the same LAN for things to connect (and the firewall between the two machines open on the NFS ports). I think NFSv4 uses just TCP port 2049, but I don't remember all the ports NFSv3 uses.
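One quick way to check the firewall point above is to test whether NFSv4's TCP port 2049 is reachable from the Windows box. A minimal Python sketch, assuming the Pi answers at the placeholder hostname "raspberrypi.local":

```python
import socket

def nfs_port_open(host: str, port: int = 2049, timeout: float = 3.0) -> bool:
    """Return True if a TCP connection to the given port succeeds."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:  # refused, timed out, or hostname didn't resolve
        return False

# "raspberrypi.local" is a placeholder for the Pi's address on your LAN.
print(nfs_port_open("raspberrypi.local"))
```

If this prints False, suspect the firewall (or that the NFS server simply isn't running) before blaming the client side.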
  7. You're making the mistake of comparing CPUs by frequency and cache size. Put simply (and therefore a little incorrectly), the amount of "work" a single core can do in a single CPU cycle is not the same for two cores with different architectures. Easy example: AMD uses something called CPU modules. AMD markets each module as 2 CPUs because most of the work it does is done as 2 CPUs. However, during certain workloads (specifically floating-point work) that 2-CPU module will only work as 1 CPU, because the two cores share some components. Intel doesn't do this at all, which makes it hard to compare the two CPUs by frequency alone. As for cache size, there are pros and cons to larger caches. The larger the cache, the more likely it is that the data you want is in there. However, it will also take longer to search the cache: if your data isn't in the cache, you'll have to wait for the whole cache to be searched before going to main memory to fetch what you need. As for maximum memory speed... meh. Most people use 1333MHz DDR3 (as in non-enthusiast, buy-a-Dell people). High-frequency memory gives you a very small performance boost and shouldn't be your first priority. All of this is covered in the guide. Go with the recommendations you received earlier in the thread; they're good. As for the PSU, 650W sounds good. If you're planning to upgrade the graphics card and/or overclock, maybe 750W. If you're going for Intel, note that Devil's Canyon CPUs have been getting pretty good reviews. I haven't put these in the guide yet, so you'll have to find them yourself. I don't know how much extra they cost over their Haswell counterparts, though.
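A back-of-the-envelope way to see why frequency alone misleads — a sketch with made-up IPC (instructions per cycle) numbers, not measurements of any real chip:

```python
def toy_throughput(cores: int, ghz: float, ipc: float) -> float:
    """Very rough instructions/second estimate: cores * cycles/sec * IPC."""
    return cores * ghz * 1e9 * ipc

# Made-up illustrative chips: A has more cores and a higher clock,
# B does more work per cycle (better IPC).
cpu_a = toy_throughput(cores=8, ghz=4.0, ipc=1.0)
cpu_b = toy_throughput(cores=4, ghz=3.5, ipc=2.5)
print(cpu_a < cpu_b)  # True: the "slower" chip wins on this workload
```

Real comparisons are messier still (IPC varies per workload, as the module example shows), which is why benchmarks beat spec sheets.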
  8. Yup, your summary of the issue I highlighted is spot on. All of the paragraphs before the end talking about packet corruption were there to give background info in case you didn't have any networking background. There are probably a lot more issues, which get more technical, affecting what happens as you get farther from the source. Since you're a CE major, and especially if you pursue a career in networking, you'll probably come across these as you study. Also note, the resend issue I highlighted is protocol dependent. I'm sure you've heard "TCP/IP" in reference to the internet. Those are 2 protocols that pass a lot of internet data, but there are others. UDP is another popular alternative to TCP; it carries a checksum that can detect corruption, but it will never resend a bad or lost packet for you. Depending on the type of data you're sending, you may not need perfect data transfer. For example, if you're streaming a movie, it's not critical that pixel (1248,103) is exactly a certain color. If that packet comes in a little corrupted, who cares? You'd be wasting your time resending that packet for an unnoticeable increase in quality. However, with something like a torrent, you do want the data to be exactly the same. Networking is a very messy topic with lots of variables to consider. That's part of the reason why networking people tend to get paid more for what they do. Computer networks are complex, making them difficult to troubleshoot, but if the network goes down you affect a ton of people. Desktop issues are usually fairly straightforward (reboot it!) and will usually only affect a few people.
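The streaming-vs-torrent trade-off can be sketched with a toy simulation: a TCP-like sender retransmits every corrupted packet until it lands, while a UDP-like sender fires once and moves on. The loss rate here is made up and far higher than any real link, just so the effect shows up in a small run:

```python
import random

random.seed(42)
LOSS = 0.2  # made-up per-packet corruption probability

def send_tcp_like(n_packets: int) -> int:
    """Retransmit every corrupted packet until it arrives; returns total sends."""
    sends = 0
    for _ in range(n_packets):
        while True:
            sends += 1
            if random.random() > LOSS:  # packet arrived intact
                break
    return sends

def send_udp_like(n_packets: int) -> int:
    """Send each packet exactly once; returns how many arrived intact."""
    return sum(1 for _ in range(n_packets) if random.random() > LOSS)

total_sends = send_tcp_like(1000)  # > 1000: retransmissions cost time/bandwidth
delivered = send_udp_like(1000)    # < 1000: some frames are simply gone
print(total_sends, delivered)
```

Perfect delivery costs extra sends; fire-and-forget is cheaper but lossy — which is the right choice depends on whether the data is a movie frame or a torrent chunk.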
  9. Hey check out my post in the wifi topic you replied to

  10. I'm not a networking expert, so I can't explain all of the factors that affect wireless communication, but I can highlight probably the biggest issue for your question: noise (EM interference). If you're an audio engineer, or you've ever tried to design a system that needs to pass data over a very long distance, you know exactly what this is. When you send your data over a physical medium (whether that's WiFi frequencies or copper wires), you aren't sending your signal into a perfect vacuum. The world is a messy place, and there are lots and lots of different things that can affect your signal no matter how well shielded it is. Even when your signal is passing through a wire, you still have some noise buildup. It's part of the reason why different classes/types of wires have different maximum effective lengths. For example, I think Cat5 Ethernet cable has an advised maximum length of 100m. (Other factors play a larger role in cable length limits than noise, but we're trying to stay simple here.) Now just think: if there's some noise in a relatively isolated environment like a wire, how much noise would there be if you were sending your signal over the air, completely unshielded from environmental interference, like WiFi is? It's amazing you can even connect at all. Imagine you were receiving an encrypted file via email on your phone. Let's say there is no error correction at all on the network you're using. Thanks to noise, some of the bits in the original message get flipped (1->0 or 0->1). The way some encryption methods work, even a small change like that could completely corrupt the file you get when you decrypt it. Fortunately, there are lots of layers of error correction protecting your data as it crosses a noisy environment, at every stage in the OSI network model (see Wikipedia). Feel free to look some of these terms up: checksum, cyclic redundancy check, parity bit, error correcting code.
You need some math skill to fully understand how some of the more complicated ones work, but you should be able to understand the basics of all of them. Unfortunately, even with the best math can offer, there are still times when your data arrives uncorrectable or corrupted. A good error correction algorithm will at least be able to see that the data is bad, though. So the clever network engineers of the past put in a mechanism to request that data be resent if you find out it's corrupted (at least if you're using the right protocol, e.g. TCP). That's great! Now if some data happens to be corrupted, as long as you know it's corrupted you can just re-request it. Joy. But wait. That's fine if maybe 1 packet (the stuff that holds your data) is corrupted out of every, say, 1,000,000,000,000 packets you receive. But remember that thing I told you about noise buildup and distance earlier? As you get farther from the source of the signal, there is a higher chance that your data will arrive corrupted beyond repair and you'll need to ask for it again. And again. And again. So now you're on WiFi and far from the source. Instead of 1/1,000,000,000,000 packets arriving corrupted, you're hitting 1/1,000,000. You're 1,000,000x more likely to need to request a resend. You're obviously not going to be as fast as you were before. There's a ton more information I didn't even mention in this post, and about 95% of it I don't even know about. If you want to learn more, I suggest you pick up a networking book (or heck, maybe a physics book about electrical engineering). If this scared you, you probably want to just accept that the network is the way it is and thank the networking gurus that things actually work most of the time. I fall into the latter category, for now, until I have time to pursue a Cisco certification.
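The detect-then-resend idea above can be sketched with the simplest of those terms, a parity bit, plus the expected-sends arithmetic. The 8-bit frame below is made up for illustration:

```python
# Toy even-parity check over an 8-bit frame (frame contents are made up).
def parity(bits):
    """Even-parity bit: 1 if the number of set bits is odd, else 0."""
    return sum(bits) % 2

frame = [1, 0, 1, 1, 0, 0, 1, 0]
p = parity(frame)        # sender appends this bit to the frame

corrupted = frame.copy()
corrupted[3] ^= 1        # noise flips a single bit in transit
print(parity(corrupted) != p)  # True: receiver sees a mismatch, re-requests

# The resend math from the post: expected transmissions per packet is
# 1 / (1 - p_corrupt), so a noisier link means more sends on average.
near = 1 / (1 - 1e-12)   # 1-in-a-trillion corruption
far = 1 / (1 - 1e-6)     # 1-in-a-million corruption
print(far > near)        # True
```

Note a single parity bit misses an even number of flips, which is why real links layer CRCs and error-correcting codes on top of it.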
  11. Are you overclocking? Check the guide for a link to a good (but old) article explaining voltage regulation; it's not the same as wattage. You shouldn't need to bump your PSU up just for the CPU unless you're planning a very high overclock.
  12. I guess I should've specified minimum fps. I think the CPU topic usually comes up when people want a minimum of 120fps in the game. CPUs haven't been jumping in power from generation to generation like they used to. There's no reason to upgrade unless you want better performance in a specific application. AMD has pretty much given all of its focus to its APUs recently, and Intel tends to focus more on power consumption and heat in their presentations now. Starting out with a good base CPU wouldn't hurt, though. On the motherboard, I'd check the chipset just to make sure it's got what you want, and the voltage regulators to make sure you have the phases/brand you want. Sticking to a good brand and not going dirt cheap usually gets you passable voltage regulation as long as you're not overclocking.
  13. The heat won't matter much for you. I've noticed that high-end games have started relying more on the CPU. BF4, for instance, actually gets a noticeable boost from a better CPU. It's not true for all games, but the CPU bottleneck isn't as insignificant as it used to be.
  14. Ya you probably want a motherboard with an updated chipset unless you're going to just get an i3.
  15. I don't expect it to be a silver bullet that will solve all of his problems. To be honest, I don't quite understand what he's trying to do, but I did identify that the main question in his post was exactly how data is passed on the physical/link layers of the Ethernet standard. That question should be covered in the IEEE standards documents or documents related to them.