The future of ASIC mining is now. What is your opinion?
You should share your own opinion and explain why you hold it.
GPU mining is almost dead. Major coins like Zcash have not forked away from ASICs, so I think the best time to mine with ASICs is right now.
GPU mining dead? I don't think so.
Maybe, maybe not; only time will tell. The ASICs haven't been out for that long, and Zcash isn't all that great a coin anyhow. I would like to see what some of the newer coins do before making this claim.
The other issue I have with your comment is the clear conflict of interest: you are trying to sell ASICs to us by claiming GPU mining is dead, with no real evidence that it is.
Ethereum's DAG is approaching new heights and most GPUs are unable to mine Ethereum. Zcash has ASICs out now, so it's pointless to mine Zcash with GPUs. That is exactly why I asked for your opinion.
Ethereum's DAG is pretty far behind ETC's with regard to size. Secondly, it will take the next year or so to knock the 4GB cards off the list. This means you still have the 6GB, 8GB, and 11GB cards of the current generation, which will hold on in terms of memory size into 2020 at minimum.
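For context, the timeline for cards falling off isn't a guess; the Ethash spec pins it down. The DAG starts at 1 GiB and grows by roughly 8 MiB every 30,000-block epoch, rounded down so the dataset length in 128-byte rows is prime. A quick sketch of that calculation (constants are from the Ethash spec; the helper function names are my own, and real cards drop off a few epochs earlier than this because the driver and OS reserve some VRAM):

```python
EPOCH_LENGTH = 30_000          # blocks per Ethash epoch
DATASET_BYTES_INIT = 2**30     # 1 GiB starting DAG size
DATASET_BYTES_GROWTH = 2**23   # ~8 MiB growth per epoch
MIX_BYTES = 128                # DAG row width in bytes

def is_prime(n: int) -> bool:
    # simple trial division; fine for the ~25-35M row counts involved here
    if n < 2:
        return False
    d = 2
    while d * d <= n:
        if n % d == 0:
            return False
        d += 1
    return True

def dag_size(epoch: int) -> int:
    """Exact DAG size in bytes for a given epoch, per the Ethash spec:
    grow linearly, then shrink until the row count is prime."""
    sz = DATASET_BYTES_INIT + DATASET_BYTES_GROWTH * epoch - MIX_BYTES
    while not is_prime(sz // MIX_BYTES):
        sz -= 2 * MIX_BYTES
    return sz

def first_epoch_exceeding(gib: int) -> int:
    """First epoch whose DAG no longer fits in `gib` GiB of VRAM
    (ignoring driver/OS overhead)."""
    epoch = 0
    while dag_size(epoch) <= gib * 2**30:
        epoch += 1
    return epoch

# 3 GiB cards fall off at epoch 257, 4 GiB cards at epoch 385,
# i.e. around block 385 * 30_000 = 11,550,000 for the 4 GiB cutoff.
print(first_epoch_exceeding(3), first_epoch_exceeding(4))
```

This is why the 2GB/3GB cards go first and the 4GB cards follow roughly a year later: at ~8 MiB per epoch and roughly a hundred epochs per year, each extra gigabyte of VRAM buys on the order of two more years of mining.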
It may be pointless to mine Zcash, but I know very few people in the community here or over at BBT who were mining Zcash anyway.
Since you are a mining hardware supplier, I would expect a much better-supported opinion on something like this than "Zcash is not worth mining with GPUs and ETH's DAG has reached new highs," when the DAG has only knocked the 2GB and 3GB cards off the table, which are the low end of cards to begin with. Those cards are still extremely profitable mining other coins, so they are not dead because of the ETH DAG.
The future isn't ASIC or GPU. It's FPGA.
It will probably take two years or so for those to gain a foothold again, but I think their superior hash power, much larger memory banks, and ability to switch algos are really where it's at. All we really need is someone willing to share a user-friendly software stack with the masses so that adoption can take off.
You should head over to BBT’s discord. They have a very active community on this. I haven’t been as present there as I would like but I know the FPGA channel is pretty active.
I helped build a GPU farm that mined ZEC. I tried to convince the owner to switch off ZEC on several occasions, but he was a big ZEC fanboy for whatever reason. He was using GTX 1060 6GB GPUs. At its peak there were 112 GPUs, plus a single rig with 8 x GTX 1080s.
The evolution goes CPU, GPU, FPGA, then ASIC. That's the normal evolution of mining hardware, and it doesn't matter what your software stack is. There is no way to break that evolution; you can only prolong the inevitable.
In your time frame of "2 years", ASICs will still continue to dominate. For evidence, you don't have to look further than the history of BTC. FPGAs are not making a comeback to mine BTC against ASICs; FPGAs are simply being used to mine algorithms for which ASICs are not publicly available.
You have that already with OhGodAGirl, OhGodACompany, and Mineority.io. However, that's not going to compete directly with ASIC farms. They are offering a completely different type of mining service from what is typical today, precisely so they can distance themselves from the typical cloud mining facility and product offering. It is interesting, but it does not compete with ASICs.
This is what is causing a bloodbath in shitcoins. Miners not being loyal to any particular network and jumping from one to the other only helps to centralize the networks being mined and reduce their security. This is decentralized consensus networking 101, but so many miners have no clue that they are hurting the network rather than helping make it stronger.
So let's assume for a moment that a miner is loyal to Monero. An ASIC comes along, and Monero changes the algo. Wouldn't it be best to have the best-performing hardware that can switch algos, in order to stay loyal to ASIC-resistant projects?
GPU mining will be around for at least another 3-5 years. There are enough GPU-mining-focused companies with investment runways that can take them 3-5 years out. You are spreading FUD about GPU mining, and the fact that some coins haven't forked yet is meaningless when new shitcoins can be produced at will. Ask the BTCP fans, or [insert shitcoin here] fans. RVN is another example: we see an attempt to be ASIC resistant, and then some pleb comes along and copies and pastes the whitepaper to create PGN. This is the perfect example of how there is no limit to the greed that fuels GPU mining specifically.
If a miner is loyal to a network, it doesn’t matter if you have the best performing hardware. That’s why they make USB ASICs. This isn’t rocket science. Loyalty has nothing to do with being the best or having the best of anything. Loyalty is just about being loyal no matter the costs, strengths, or weaknesses. You’re confusing loyalty with greed.
Monero specifically was mined by ASICs long before it forked into five or more shitcoins. It will continue to be mined by ASICs because of that forking, so you could stay "loyal" in the sense you are talking about. This is really not that complicated.
The performance of your single piece of higher-performing hardware doesn't matter when we are talking about decentralized consensus-based networks. There will be large data center operations that support FPGAs for doing exactly what you are saying. That's a given, but it's not competing with ASICs, and it's never going to beat out ASICs in mass quantities. Many more variables and factors go into deploying FPGA hardware, and they all impact performance and profitability. This is why a company like Mineority.io, with major amounts of experience on the topic, is struggling to make it happen when compared to ASIC cloud mining farms.
I have a client who has insisted on mining ETH with GTX 1060 3GB cards. It is a waste of time in my opinion, but the 1060 3GB does still mine ETH.
Ah, OK, I thought they were done for. ETC's DAG is larger, isn't it? If so, maybe that is what I got mixed up on.
If you use Linux, you can still fit the DAG in 3GB of VRAM; Windows reserves more VRAM than Linux does.