What’s New in TensorFlow 2.10? (tensorflow.org)
38 points by RafelMri on Sept 6, 2022 | hide | past | favorite | 12 comments


Is there any point on which TensorFlow actually wins out against PyTorch? My sense is that PyTorch is easier for almost every, if not every, use case. Not to mention the 1 => 2 transition gave me serious Python 2 => 3 vibes, with the switch to default eager execution etc. Anyone still using TensorFlow care to weigh in?


I use PyTorch the vast majority of the time, but I think there are two reasons TensorFlow remains competitive.

First, TF's deployment story has always been ahead of PyTorch's (although the gap is closing, especially as ONNX becomes more popular).

However, the more important reason is how amazingly good value TPUs are in terms of RAM and FLOPS. Although PyTorch has XLA support, it just doesn't work as well as TF on TPU pods.

In Kaggle competitions, when the input data can fit on a consumer GPU, everyone uses PyTorch. When it doesn't, everyone uses the free TPUs on the Kaggle platform and reverts to TensorFlow/Keras.


I seem to be in a small group that doesn't have a huge preference between the two, maybe I'm not doing that much advanced stuff?

If I'm just prototyping something myself, I usually reach for TF first, mostly because Keras feels like the "right" level of abstraction for most stuff. I used to prototype things with fastai/PyTorch more, but newer developers didn't like how much was hidden behind multiple levels of **kwargs, and some of the dataloader stuff could get tricky if you tried to do anything non-standard. I haven't tried PyTorch Lightning.

Besides the deployment story, which is pretty big and others have touched on, there are some minor things that feel nice in the TF/Keras ecosystem:

- Part of deployment, I guess, but I like that more of the preprocessing can happen in the model, vs as a separate step. The less transforming of data that has to be done in the serving code, the fewer possibilities for things to get out of sync and introduce bad data at runtime.

- Keras being able to infer input sizes in layers is nice for avoiding a bunch of bookkeeping code to calculate layer sizes.

That said, I feel like maybe the TF ecosystem has more sharp edges? I've encountered more than a few of them lately as I've been doing work on some recommender models using tfrs. I've also run into things like tensorboard logging not working with entire classes of layers and causing training to crash.

I'm curious - what are some things about PyTorch that make it better for almost every single use case?


And Google has started using JAX instead of TensorFlow.


If you want to actually deploy AI models, TFLite is still your best option. And for exporting there, you first need things to run well inside regular TF.
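For what it's worth, the export step itself is short once the model runs in regular TF; a sketch with a stand-in toy model:

```python
import tensorflow as tf

# Toy Keras model standing in for a real one.
model = tf.keras.Sequential([
    tf.keras.layers.Dense(8, activation="relu", input_shape=(4,)),
    tf.keras.layers.Dense(1),
])

# Convert to a TFLite flatbuffer for on-device inference.
converter = tf.lite.TFLiteConverter.from_keras_model(model)
tflite_model = converter.convert()
# tflite_model is a bytes object you can write out as a .tflite file.
```

The hard part is usually upstream of this call: getting the model to trace cleanly and avoiding ops TFLite doesn't support.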

Also, I often find myself using the TF data pre-processing pipeline even from inside PyTorch, because it's just so much easier to get excellent processing performance with TF. Not sure why, but PyTorch is in many cases "unnecessarily" single-threaded.


There are a ton of mobile deployment options that support PyTorch+TF models. It's hard to argue TFLite is the best.

https://github.com/alibaba/MNN

https://github.com/Tencent/ncnn

https://github.com/Tencent/tnn

https://github.com/XiaoMi/mace

Plus, each chip vendor has their own SDK (e.g., Qualcomm has SNPE, MediaTek has NeuroPilot)


> If you want to actually deploy AI models

On mobile*.

> I often find myself using the TF data pre-processing pipeline even from inside PyTorch

The tf.data pipeline is quite nice; it has some neat auto-tuning features.
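A sketch of what that looks like, assuming TF 2.x: tf.data.AUTOTUNE lets the runtime pick the parallelism for map and the prefetch depth itself, which is a big part of why it tends to saturate cores without manual tuning:

```python
import tensorflow as tf

# Let the tf.data runtime tune map parallelism and prefetch depth.
ds = (tf.data.Dataset.range(1000)
      .map(lambda x: x * 2, num_parallel_calls=tf.data.AUTOTUNE)
      .batch(32)
      .prefetch(tf.data.AUTOTUNE))

first_batch = next(iter(ds))  # a batch of 32 elements: 0, 2, 4, ...
```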


TensorFlow still has the edge for sensor-level inference ("TinyML") via TFLite for Microcontrollers.


I think TF is still popular (with beginners, at least) because of the keras API (far easier to get started!)


Slowly becoming irrelevant: https://paperswithcode.com/trends


For that statement you would need to show industry graphs, not only research trends.


Industry does not exist in a vacuum, it follows research.

Additionally, see Google trends: https://trends.google.com/trends/explore?date=today%205-y&ge...



