Free Cloud Mining Profits. MinerGate provides software that lets you mine different coins with your CPU and GPU. However, this is not especially productive, and you should think it through before committing your hardware to it.
E: Going to bed, will contribute more tomorrow. Thanks for the discussion!

Myth: Mining is more stressful than gaming.
Fact: It depends. In the old days this was plausible, because older (pre-Polaris) GPUs were bottlenecked by core clock when mining the most profitable coins. Miners therefore overclocked and overvolted these cards quite frequently, especially where electricity was cheap. That meant those cards were often run hot, pushing the limits and stressing the VRMs and fans quite a lot. Nowadays, ethash (Ethereum) is the most profitable algorithm for AMD cards 99% of the time, and newer (Polaris) GPUs are limited by memory bandwidth and latency instead. Miners can underclock the core to the low 1100 MHz range before seeing performance drop, and miners who know what they are doing also undervolt to save power, since a high core clock no longer needs to be sustained. It is therefore quite feasible to run Polaris cards below 70C at a reasonable fan speed. However, dual mining (mining more than one coin at once) does increase power consumption by up to 20%, and there are also idiots who run their Polaris cards overclocked while mining. With the exception of those few, miners treat their Polaris GPUs pretty much the same way: underclocked and undervolted 24/7, with a memory strap mod and a memory OC. Former gaming cards, on the other hand, are highly variable in how they were used: some gamers leave their cards at stock settings, some undervolt, and some overclock and/or overvolt.
Most of the time, these cards are thermal cycled far more often than mining cards, which is known to weaken solder. Another thing to consider is that manufacturers have (somewhat) learned from their mistakes of putting shit-tier fans in GPUs, and many fans on modern GPUs are ball bearing and/or swappable. Even some budget cards, such as the MSI Armor series, use decent ball bearing fans.

Bottom line: the risk of buying a mined Polaris card is not as high as the risk of buying an older mined card. I would not be against buying mined Polaris cards, but they are not necessarily a better buy than a gamer's card. At the end of the day, it depends more on how the owner treated the card than on what they used it for.

Myth: GPUs are obsolete because of FPGAs and ASICs.
Fact: Mostly false. Older algorithms such as scrypt and SHA-256 (Litecoin/Dogecoin/Feathercoin/Bitcoin, etc.) are no longer feasible to mine with GPUs, but multiple algorithms since then have been built to deter ASICs. Most of the time this is done by making the algorithm memory-hard, because an ASIC with high memory throughput is considerably more expensive to design and manufacture. Many devs prefer their blockchain to be ASIC-resistant to avoid the concentration-of-power problem that Bitcoin is having nowadays, where a giant, near-monopolistic ASIC manufacturer (Bitmain) is causing a lot of (subjective) controversy. Blockchains based on ethash (Ethereum and its forks), equihash (Zcash and its forks), and CryptoNight (Monero and its forks) are some examples, and there are scores of other shitcoins and a few other algorithms that are GPU-dominant. It is very unlikely that there will be another ASIC takeover like the one responsible for the collapse in GPU demand back in the Bitcoin and Litecoin days.

Bottom line: ASICs no longer threaten GPU miners, or the demand for GPUs.

Myth: Ethereum switching to Proof of Stake will kill mining soon.
Fact: Doomsayers have been preaching about proof of stake since late 2015.
It has always been 'coming soon.' The fact is, the Ethereum roadmap goes from proof of work (mining) -> Casper (mining + PoS) -> Metropolis (PoS). Currently, Casper has no announced release date, nor is it being tested on a (public) testnet. Proof of stake might one day take over, but mining is here to stay for a while yet. Another thing to consider is that there are tons of other GPU-mineable blockchains, and although Ethereum is the biggest, it is certainly feasible that mining stays profitable even after Ethereum goes PoS (if it ever does). However, it is possible that profits will be low enough to discourage new miners.

Bottom line: It's very unlikely. E: I screwed up the roadmap; here is a better source than me with some interesting information:

Myth: The current Ethereum demand spike is a bubble.
Opinion: Honestly, I don't know. I would not be surprised if stricter regulations on ICOs come sooner or later, which would hurt Ether prices, and there is also the inherent volatility of cryptocurrencies. However, it is also possible that blockchain technology continues to gain traction; that is, the price could just as easily go up as down. Although it's fun to read other people's opinions, only time-travelling wizards can tell you when it will become economical again to upgrade your poor HD 5770.

Bottom line: No one knows.

Myth: Miners will 'steal' all the RX Vegas.
Fact: Only a reckless miner would buy Vegas on release, since mining performance is unknown. In fact, it is possible that Vega can't mine at all (or only at some stupidly low speed) until devs add support to existing miners. Buying blind would be even more reckless than gamers buying without seeing benchmarks, since gamers can at least expect the games to actually run. It's also not guaranteed that Vega will be good at mining once miners do add support. Maybe there will be enough reckless miners to affect supply, maybe not.
Of course, it is possible that miners will deplete the supply after Vega is demonstrated to be good for mining.

Bottom line: Most miners won't preorder, but it's possible that a significant number will. E: It is important to remember that even if mining demand isn't high, that doesn't mean supply will be plentiful.

Myth: Nvidia cards SUCK at mining.
Fact: Mostly false. They USED to suck in the old pre-Maxwell days, but they are now actually more efficient than AMD cards at mining Ethereum and Zcash, even after both are undervolted. The flip side is that they (used to) cost more for the equivalent hashrate. For reference, my old 5x RX 470 rig drew just under 800 W when mining ETH only and hashed at 150 MH/s; my current 6x GTX 1060 rig draws just over half of that.

Are you SERIOUSLY arguing that selfish use is something you're entitled to over sharing the compute power? Is mining not a selfish pursuit? Don't be a moron; selfishness has nothing to do with it. It's not like these miners are angels who are all running distributed scientific applications to solve humanity's problems. They're in it because they think they can make some money fairly quickly. Honestly, mining and gaming are equally legitimate. If you have a card, do whatever the fuck you want with it.

2017-03-21

This is a difference of many times more instructions per clock (more still in the case of AVX). The fastest CPUs have up to 6, 8, or 12 cores and somewhat higher clock frequencies. A CPU is primarily designed to be an executive and make decisions, as directed by the software. For example, if you type a document and save it, it is the CPU's job to turn your document into the appropriate file type and direct the hard disk to write it as a file. CPUs are also highly capable of following instructions of the form 'if this, do that, otherwise do something else'.
A large bulk of the structures inside a CPU are concerned with making sure the CPU is ready to switch to a different task on a moment's notice when needed. CPUs also have to deal with quite a few other things that add complexity. Yes, a GPU can do math, and can also do 'this' or 'that' based on specific conditions. However, GPUs have been designed to be very good at video processing, and less good at executive work. Video processing is largely repetitive work, since the hardware is constantly being told to do the same thing to large groups of pixels on the screen. In order to make this run efficiently, video processors are far heavier on the ability to do repetitive work than on the ability to rapidly switch tasks. There are other specs in the comparison that could be important; a friend of mine who has been mining for a while thinks the number of shaders is the most important one. It is very visible in all ALU-bound GPGPU workloads such as Bitcoin, password brute-forcers, etc. GPUs have large numbers of ALUs, more so than CPUs, so they can churn through large amounts of bulky mathematical labor in far greater quantity than CPUs.

Analogy: one way to visualize it is that a CPU works like a small group of very smart people who can quickly do any task given to them, while a GPU is a large group of relatively dumb people who individually are not very fast or smart, but who can be trained to do repetitive tasks and collectively can be more productive just due to the sheer number of people. It's not that a CPU is fat, spoiled, or lazy.
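The analogy above can be sketched in code. This is a toy illustration only (real GPUs are far more complex): the "CPU style" worker branches per element, while the "GPU style" group applies the same operations to every element and uses a mask to select results, avoiding per-element branching. The function names and the even/odd rule are made up for illustration.

```python
def cpu_style(data):
    # One smart worker: decides per element what to do (branching).
    out = []
    for x in data:
        if x % 2 == 0:
            out.append(x // 2)
        else:
            out.append(3 * x + 1)
    return out

def gpu_style(data):
    # Many simple workers in lockstep: EVERY element gets BOTH operations
    # applied; a mask then selects which result to keep. No per-element
    # branching, which mirrors how GPU ALU groups handle conditionals.
    halved = [x // 2 for x in data]       # all workers do this...
    tripled = [3 * x + 1 for x in data]   # ...and this
    mask = [x % 2 == 0 for x in data]
    return [h if m else t for h, t, m in zip(halved, tripled, mask)]
```

Both produce identical results; the GPU style does redundant work per element but parallelizes trivially across thousands of ALUs.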
Both CPUs and GPUs are creations made from billions of microscopic transistors crammed onto a small piece of silicon. On silicon chips, size is expensive. The structures that make CPUs good at what they do take up lots of space; when those structures are omitted, that leaves plenty of room for many 'dumb' ALUs, which individually are very small. The ALUs of a GPU are partitioned into groups, and each group of ALUs shares management, so members of a group cannot be made to work on separate tasks. They can either all work on nearly identical variations of one single task, in perfect sync with one another, or do nothing at all. Trying different hashes repeatedly - the process behind Bitcoin mining - is a very repetitive task well suited to a GPU, with each attempt varying only by the changing of one number (called a 'nonce') in the data being hashed. The ATI Radeon is a popular video card for Bitcoin mining and, to date, offers the best known performance of any video card for this purpose. This particular card has 3, 'Stream Processors', which can be thought of as 3, very dumb execution units that can be trained to all do the same repetitive task, just so long as they don't have to make any decisions that interrupt their flow. Those execution units are contained in blocks. The card uses a VLIW-5 architecture, which means the 3, Stream Processors are actually 'Cores,' each able to process 5 instructions per clock cycle. Nvidia would call these cores 'CUDA Cores', but as mentioned in this article, they are not VLIW, meaning they cannot do as much work per cycle. This is why comparing graphics cards by core count alone is not an accurate way to determine performance, and it is also why Nvidia lags so far behind ATI in SHA hashing. Since ALUs are what do all the work of Bitcoin mining, the number of available ALUs has a direct effect on the hash output.
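The nonce-grinding loop described above can be sketched in Python. This is a toy sketch with an artificially easy target, not real mining: an actual Bitcoin header is an 80-byte structure and the target comes from the network difficulty, but the double-SHA-256-over-varying-nonce shape is the same.

```python
import hashlib

def mine(header: bytes, target: int, max_nonce: int = 1_000_000):
    """Try nonces until the double-SHA256 digest falls below `target`."""
    for nonce in range(max_nonce):
        # Only the appended nonce changes between attempts - this is why
        # the workload maps so well onto thousands of lockstep ALUs.
        data = header + nonce.to_bytes(4, "little")
        digest = hashlib.sha256(hashlib.sha256(data).digest()).digest()
        if int.from_bytes(digest, "big") < target:
            return nonce, digest.hex()
    return None

# Easy toy target: the top 16 bits of the 256-bit digest must be zero,
# so a hit is expected roughly once per 65,536 attempts.
result = mine(b"example block header", 1 << 240)
```

Every attempt runs the identical instruction sequence, which is exactly the "nearly identical variations of one single task" pattern a GPU ALU group requires.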
Compare that to a 4-core CPU that can switch tasks on a dime, but has ALUs in some small multiple of four, if not just four ALUs alone. Trying a single SHA hash in the context of Bitcoin mining requires around 1, simple mathematical steps that must be performed entirely by ALUs. That, in a nutshell, is why GPUs can mine Bitcoins so much faster than CPUs. Bitcoin mining requires no decision making - it is repetitive mathematical work for a computer. The only decision that must be made in Bitcoin mining is 'do I have a valid block' or 'do I not'. That's an excellent workload to run on a GPU.

Why are AMD GPUs faster than Nvidia GPUs? Firstly, because of the VLIW architecture described above, which translates into a sheer ALU performance advantage for AMD. It is very visible in all ALU-bound GPGPU workloads such as Bitcoin, password brute-forcers, etc. Secondly, another difference favoring Bitcoin mining on AMD GPUs is that the mining algorithm is based on SHA-256, which makes heavy use of the 32-bit integer right-rotate operation, and AMD hardware executes it more efficiently. This alone gives AMD another performance advantage. Combined, these two factors make AMD GPUs overall 3x-5x faster when mining Bitcoins.

NVIDIA has since released new generations of GPU cards. NVIDIA's new flagship GeForce GTX is beefier than its younger sibling, and EVGA has decided to use the same chipset on its flagship card, the 'EVGA GeForce GTX Signature'. But what are the comparative figures for the AMD and new NVIDIA GPUs? [Performance spec table: GeForce GTX vs. AMD Radeon - figures not recoverable.]
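The right-rotate operation mentioned above can be shown concretely. This is an illustrative sketch: `rotr32` below is how hardware without a native rotate instruction emulates it (two shifts plus an OR), whereas AMD's older GPU ISAs reportedly performed it in a single instruction. `big_sigma0` is one of SHA-256's actual compression functions, built entirely from such rotations.

```python
def rotr32(x: int, n: int) -> int:
    """32-bit right rotation: bits shifted off the low end wrap to the top.
    Without a native rotate instruction this costs two shifts and an OR."""
    x &= 0xFFFFFFFF
    return ((x >> n) | (x << (32 - n))) & 0xFFFFFFFF

def big_sigma0(x: int) -> int:
    """SHA-256's Sigma0 function: three rotations XORed together.
    SHA-256's compression step evaluates functions like this 64 times
    per block, which is why rotate throughput matters so much."""
    return rotr32(x, 2) ^ rotr32(x, 13) ^ rotr32(x, 22)
```

For example, `rotr32(1, 1)` wraps the lone low bit around to the top, giving `0x80000000`.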